DETAILED ACTION
Claims 1-14 are presented for examination.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 13 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 13 recites a non-transitory computer-readable recording medium storing a program for causing a processor of a computer to perform the information analysis method according to claim 12. Under a broadest reasonable interpretation, a “program” may simply be data, such as a plan of action, and not necessarily processor-executable instructions. If the program is simply a plan of action, claim 13 reads as an apparatus claim; if the program is processor-executable, claim 13 reads as an article of manufacture claim. It is therefore ambiguous whether the recited “program” is limited to processor-executable instructions. Appropriate correction is required.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-14 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.
Claims 1-14 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Specifically, the claimed invention is directed to information analysis regarding a plurality of facilities and moving objects (Spec: ¶ 3).
Step 1: Statutory Category?
Yes – The claims fall within at least one of the four categories of patent-eligible subject matter: process (claim 12); apparatus (claims 1-11 and 14).
As explained in the rejection under 35 U.S.C. § 112(b) above, it is not clear whether claim 13 is meant to be an apparatus claim or an article of manufacture claim. Nevertheless, because claim 13 recites the structure of a non-transitory computer-readable recording medium, it at least recites structure and presently reads as an apparatus claim. If the claim is amended to clarify that the program is processor-executable, claim 13 would instead more clearly be an article of manufacture claim. In either case, claim 13 is interpreted as falling into at least one of the four categories of patent-eligible subject matter.
Independent claims:
Step 2A – Prong 1: Judicial Exception Recited?
Yes – Aside from the additional elements identified in Step 2A – Prong 2 below, the claims recite:
[Claims 1, 12, 13] displaying information on a moving object in plural areas corresponding to plural facilities;
display information to be presented on a display;
receive a user operation;
control the display, based on the user operation received; and
store status information and detection information, the status information indicating a state where each facility in the plural facilities operates, and the detection information indicating a detection result of the moving object in each area in the plural areas,
receive a status information operation, the status information operation designating information on a specific facility in the status information on the plural facilities; and
limit information to be displayed on the display among the status information on the plural facilities and the detection information on the plural areas, based on information on the specific facility designated by the status information operation.
Aside from the additional elements, the aforementioned claim details exemplify the abstract idea of a mental process, since the details include concepts performed in the human mind, including an observation, evaluation, judgment, and/or opinion. As explained in MPEP § 2106.04(a)(2)(III), “The courts consider a mental process (thinking) that ‘can be performed in the human mind, or by a human using a pen and paper’ to be an abstract idea. CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1372, 99 USPQ2d 1690, 1695 (Fed. Cir. 2011). As the Federal Circuit explained, ‘methods which can be performed mentally, or which are the equivalent of human mental work, are unpatentable abstract ideas, the ‘basic tools of scientific and technological work’ that are open to all.’ 654 F.3d at 1371, 99 USPQ2d at 1694 (citing Gottschalk v. Benson, 409 U.S. 63, 175 USPQ 673 (1972)).” The limitations reproduced above, as drafted, recite a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind but for the recitation of generic computer components. That is, other than reciting the additional elements identified in Step 2A – Prong 2 below, nothing in the claim elements precludes the steps from practically being performed in the mind and/or by a human using pen and paper. For example, a human user can present information for display, receive the recited information, store information, customize which limited information is presented for display, etc.
If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind (and/or with pen and paper) but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claims recite an abstract idea.
Aside from the additional elements, the aforementioned claim details also exemplify a certain method of organizing human activity, since the details include examples of commercial or legal interactions (including advertising, marketing or sales activities or behaviors, and business relations) and of managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions). More specifically, the evaluated process relates to information analysis regarding a plurality of facilities and moving objects (Spec: ¶ 3), which, under its broadest reasonable interpretation, is an example of managing commercial interactions (i.e., organizing human activity). Therefore, aside from the recitations of generic computer and other processing components (identified in Step 2A – Prong 2 below), the limitations identified in the more detailed claim listing above encompass the abstract idea of organizing human activity.
Step 2A – Prong 2: Integrated into a Practical Application?
No – The judicial exception(s) is/are not integrated into a practical application.
Claim 1 recites an information analysis device for displaying information on a moving object in plural areas corresponding to plural facilities, the information analysis device comprising:
a display configured to display information;
an input interface configured to receive a user operation;
a processor configured to control the display, based on the user operation received by the input interface; and
a memory configured to store status information and detection information, the status information indicating a state where each facility in the plural facilities operates, and the detection information indicating a detection result of the moving object in each area in the plural areas,
wherein the processor is configured to perform the recited operations.
Claim 1 further recites receiving a status information operation via the input interface.
Claim 12 recites an information analysis method for causing a computer to display information on a moving object in plural areas corresponding to plural facilities, the information analysis method comprising:
causing a memory of the computer to store status information and detection information; and
causing a processor of the computer to perform the recited operations.
Claim 12 further recites receiving a status information operation via an input interface.
Claim 13 recites a non-transitory computer-readable recording medium storing a program for causing a processor of a computer to perform the information analysis method according to claim 12.
The claims as a whole merely describe how to generally “apply” the abstract idea(s) in a computer environment. The claimed processing elements are recited at a high level of generality and are merely invoked as a tool to perform the abstract idea(s). Simply implementing the abstract idea(s) on a general-purpose processor is not a practical application of the abstract idea(s); Applicant’s specification discloses that the invention may be implemented using general-purpose processing elements and other generic components (Spec: ¶¶ 18-31).
The use of a processor or other processing elements (e.g., as recited in all of the claims), and of a memory or machine-readable media with executable instructions, merely facilitates generic processor operations.
The additional elements are recited at a high level of generality (i.e., as generic processing elements performing generic computer functions) such that the incorporation of the additional processing elements amounts to no more than mere instructions to apply the judicial exception(s) using generic computer components. There is no indication in the Specification that the steps/functions of the claims require any inventive programming or necessitate any specialized or other inventive computer components (i.e., the steps/functions of the claims may be implemented using capabilities of general-purpose computer components). Accordingly, the additional elements do not integrate the abstract ideas into a practical application because they do not impose any meaningful limits on practicing the abstract ideas. The claims are directed to an abstract idea.
The processing components presented in the claims simply utilize the capabilities of a general-purpose computer and are, thus, merely tools to implement the abstract idea(s). As explained in MPEP § 2106.05(a)(I) and § 2106.05(f)(2), the courts have found that accelerating a process, where the increased speed comes solely from the capabilities of a general-purpose computer, is not sufficient to show an improvement in computer functionality; rather, it amounts to a mere invocation of computers or machinery as a tool to perform an existing process (see FairWarning IP, LLC v. Iatric Sys., 839 F.3d 1089, 1095, 120 USPQ2d 1293, 1296 (Fed. Cir. 2016)).
There is no transformation or reduction of a particular article to a different state or thing recited in the claims.
Additionally, even when considering the operations of the additional elements as an ordered combination, the ordered combination does not amount to significantly more than what is present in the claims when each operation is considered separately.
Step 2B: Claim(s) Provide(s) an Inventive Concept?
No – The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception(s). As discussed above with respect to integration of the abstract idea(s) into a practical application, the use of the additional elements to perform the steps identified in Step 2A – Prong 1 above amounts to no more than mere instructions to apply the exceptions using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept. The claims are not patent eligible.
Dependent claims:
Step 2A – Prong 1: Judicial Exception Recited?
Yes – Aside from the additional elements identified in Step 2A – Prong 2 below, the claims recite:
[Claim 2] wherein the status information on the plural facilities includes time spans each in which each facility is a predetermined state,
wherein the information on the specific facility, which is designated by the status information operation, indicates a time span in which the specific facility is the predetermined state, and
cause the limited detection information, which is to be displayed on the display, to include the detection result of the moving object in the time span designated by the status information operation.
[Claim 3] limit the information to be displayed on the display among the detection information on the plural areas, based on the detection result of the moving object within the designated time span in each of the plural areas.
[Claim 4] limit the information to be displayed on the display among the detection information on the plural areas, based on a frequency at which the moving object moves between an area corresponding to the specific facility and the other areas.
[Claim 5] limit the information to be displayed on the display, based on a frequency at which the moving object moves between a predetermined area and any area of the plural areas, the predetermined area being different from the plural areas corresponding to the plural facilities.
[Claim 6] store management information managing a responsible moving object for each area among the plural areas, and limit the information to be displayed on the display among the detection information on the plural areas, referring to the management information in response to the status information operation.
[Claim 7] store map information indicating positional relation between the plural facilities and the plural areas, and
limit the information to be displayed on the display among the detection information on the plural areas, based on positional relation between an area corresponding to the specific facility and the plural areas in the map information.
[Claim 8] limit the detection information to be displayed on the display by at least one of: narrowing down the detection information on the plural areas to the detection information to be displayed; or controlling a display order of the detection information to be displayed.
[Claim 9] limit the status information to be displayed on the display by excluding information on facilities other than the specific facility among the plural facilities, from the status information to be displayed on the display.
[Claim 10] receive a detection information operation, the detection information operation designating detection information on a specific area in the detection information displayed on the display; and
cause, on a display, to display predetermined information related to a detection result of the specific area, in response to the detection information operation.
[Claim 11] wherein the status information is a first timeline indicating a state where a corresponding facility operates in chronological order, and
wherein the detection information is a second timeline indicating a detection result of the moving object in a corresponding area in chronological order.
[Claim 14] exclude non-designated status information from the status information to be displayed, the non-designated status information indicating a state of another facility than the specific facility, which is designated by the status information operation, among the plural facilities.
The dependent claims further present details of the abstract ideas identified in regard to the independent claims.
Aside from the additional elements, the aforementioned claim details exemplify the abstract idea of a mental process, since the details include concepts performed in the human mind, including an observation, evaluation, judgment, and/or opinion. As explained in MPEP § 2106.04(a)(2)(III), “The courts consider a mental process (thinking) that ‘can be performed in the human mind, or by a human using a pen and paper’ to be an abstract idea. CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1372, 99 USPQ2d 1690, 1695 (Fed. Cir. 2011). As the Federal Circuit explained, ‘methods which can be performed mentally, or which are the equivalent of human mental work, are unpatentable abstract ideas, the ‘basic tools of scientific and technological work’ that are open to all.’ 654 F.3d at 1371, 99 USPQ2d at 1694 (citing Gottschalk v. Benson, 409 U.S. 63, 175 USPQ 673 (1972)).” The limitations reproduced above, as drafted, recite a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind but for the recitation of generic computer components. That is, other than reciting the additional elements identified in Step 2A – Prong 2 below, nothing in the claim elements precludes the steps from practically being performed in the mind and/or by a human using pen and paper. For example, a human user can present information for display, receive the recited information, store information, customize which limited information is presented for display, etc.
If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind (and/or with pen and paper) but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claims recite an abstract idea.
Aside from the additional elements, the aforementioned claim details also exemplify a certain method of organizing human activity, since the details include examples of commercial or legal interactions (including advertising, marketing or sales activities or behaviors, and business relations) and of managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions). More specifically, the evaluated process relates to information analysis regarding a plurality of facilities and moving objects (Spec: ¶ 3), which, under its broadest reasonable interpretation, is an example of managing commercial interactions (i.e., organizing human activity). Therefore, aside from the recitations of generic computer and other processing components (identified in Step 2A – Prong 2 below), the limitations identified in the more detailed claim listing above encompass the abstract idea of organizing human activity.
The dependent claims largely limit information to be displayed and exclude information from being displayed, which are examples of filtering content. MPEP § 2106.04(a)(2)(II)(C) cites the following as an example of managing personal behavior, i.e., organizing human activity: “filtering content, BASCOM Global Internet v. AT&T Mobility, LLC, 827 F.3d 1341, 1345-46, 119 USPQ2d 1236, 1239 (Fed. Cir. 2016) (finding that filtering content was an abstract idea under step 2A, but reversing an invalidity judgment of ineligibility due to an inadequate step 2B analysis).” MPEP § 2106.04(a)(2)(III)(D) cites the following as an example of a mental process: “An application program interface for extracting and processing information from a diversity of types of hard copy documents – Content Extraction, 776 F.3d at 1345, 113 USPQ2d at 1356.”
Step 2A – Prong 2: Integrated into a Practical Application?
No – The judicial exception(s) is/are not integrated into a practical application.
The dependent claims include the additional elements of their independent claims.
Claim 1 recites an information analysis device for displaying information on a moving object in plural areas corresponding to plural facilities, the information analysis device comprising:
a display configured to display information;
an input interface configured to receive a user operation;
a processor configured to control the display, based on the user operation received by the input interface; and
a memory configured to store status information and detection information, the status information indicating a state where each facility in the plural facilities operates, and the detection information indicating a detection result of the moving object in each area in the plural areas,
wherein the processor is configured to perform the recited operations.
Claim 1 further recites receiving a status information operation via the input interface.
The processor and memory are generally applied to implement the various operations of the dependent claims as well.
Claim 10 further recites receiving a detection information operation via the input interface.
Claim 12 recites an information analysis method for causing a computer to display information on a moving object in plural areas corresponding to plural facilities, the information analysis method comprising:
causing a memory of the computer to store status information and detection information; and
causing a processor of the computer to perform the recited operations.
Claim 12 further recites receiving a status information operation via an input interface.
Claim 13 recites a non-transitory computer-readable recording medium storing a program for causing a processor of a computer to perform the information analysis method according to claim 12.
The claims as a whole merely describe how to generally “apply” the abstract idea(s) in a computer environment. The claimed processing elements are recited at a high level of generality and are merely invoked as a tool to perform the abstract idea(s). Simply implementing the abstract idea(s) on a general-purpose processor is not a practical application of the abstract idea(s); Applicant’s specification discloses that the invention may be implemented using general-purpose processing elements and other generic components (Spec: ¶¶ 18-31).
The use of a processor or other processing elements (e.g., as recited in all of the claims), and of a memory or machine-readable media with executable instructions, merely facilitates generic processor operations.
The additional elements are recited at a high level of generality (i.e., as generic processing elements performing generic computer functions) such that the incorporation of the additional processing elements amounts to no more than mere instructions to apply the judicial exception(s) using generic computer components. There is no indication in the Specification that the steps/functions of the claims require any inventive programming or necessitate any specialized or other inventive computer components (i.e., the steps/functions of the claims may be implemented using capabilities of general-purpose computer components). Accordingly, the additional elements do not integrate the abstract ideas into a practical application because they do not impose any meaningful limits on practicing the abstract ideas. The claims are directed to an abstract idea.
The processing components presented in the claims simply utilize the capabilities of a general-purpose computer and are, thus, merely tools to implement the abstract idea(s). As explained in MPEP § 2106.05(a)(I) and § 2106.05(f)(2), the courts have found that accelerating a process, where the increased speed comes solely from the capabilities of a general-purpose computer, is not sufficient to show an improvement in computer functionality; rather, it amounts to a mere invocation of computers or machinery as a tool to perform an existing process (see FairWarning IP, LLC v. Iatric Sys., 839 F.3d 1089, 1095, 120 USPQ2d 1293, 1296 (Fed. Cir. 2016)).
There is no transformation or reduction of a particular article to a different state or thing recited in the claims.
Additionally, even when considering the operations of the additional elements as an ordered combination, the ordered combination does not amount to significantly more than what is present in the claims when each operation is considered separately.
Step 2B: Claim(s) Provide(s) an Inventive Concept?
No – The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception(s). As discussed above with respect to integration of the abstract idea(s) into a practical application, the use of the additional elements to perform the steps identified in Step 2A – Prong 1 above amounts to no more than mere instructions to apply the exceptions using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept. The claims are not patent eligible.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-3 and 6-14 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Kitazumi et al. (US 2022/0215327).
[Claim 1] Kitazumi discloses an information analysis device for displaying information on a moving object in plural areas corresponding to plural facilities (¶ 34 – “The work analysis device 1 can generate a time chart representing the flow of work step processes by the worker by measuring the work time [of the worker] at each of the workstations. The work analysis device 1 analyzes whether the workstations are arranged appropriately, or whether the work step processes and the like by the worker are appropriate by comparing the time chart generated with a benchmark time chart prepared in advance. The result of the analysis by the work analysis device 1 is presented to the user. The user can use the analysis result from the work analysis device 1 to change the layout of the workstations, exchange the parts placed at a workstation, revise the benchmark time chart, or the like.”; ¶ 64 – “The determination unit 13 acquires the position information for the travel areas a to c from the process management table 12 and determines in which travel area the worker is present on the basis of the position information of the worker detected in step S21. The determination unit 13 also acquires the position information of the workstations A to G from the process management table 12 and can determine at which workstation work is being performed on the basis of information on the position and orientation of the worker detected in step S21. That is, the determination unit 13 can determine the process for which a worker is performing work. The determination unit 13 can also determine the time for a worker to transition from a process the worker is currently performing to the next process.”; NOTE: Applicant’s Specification describes facilities as areas in a workplace, as seen in Spec: ¶ 12), the information analysis device comprising:
a display configured to display information (¶ 85 – “In step S26 of FIG. 3, the output unit 17 presents the time chart generated in step S24 and the result of the analysis in step S25 on a display or the like provided to the work analysis device 1. The output unit 17 may be configured to switch between presenting the time chart and presenting the analysis result in accordance with an instruction from the user. The output unit 17 may also be configured to switch the display format of the time chart (e.g., display formats such as a table, a graph, etc.) in accordance with an instruction from the user.”);
an input interface configured to receive a user operation (¶ 85 – “The output unit 17 may also be configured to switch the display format of the time chart (e.g., display formats such as a table, a graph, etc.) in accordance with an instruction from the user.”);
a processor configured to control the display, based on the user operation received by the input interface (¶ 85 – “The output unit 17 may also be configured to switch the display format of the time chart (e.g., display formats such as a table, a graph, etc.) in accordance with an instruction from the user.”); and
a memory configured to store status information and detection information, the status information indicating a state where each facility in the plural facilities operates, and the detection information indicating a detection result of the moving object in each area in the plural areas (¶ 42 – “The reception unit 10 includes a function of receiving a captured image from the camera 2. The reception unit 10 transfers the captured image received to the detector unit 11. The reception unit 10 may store the captured image received in the auxiliary storage device 103.”; ¶ 44 – “The process management table 12 stores information pertaining to each process. The position information for a workstation may be stored in the process management table 12 in association with, for example, a process corresponding to aforesaid workstation. The position information for a workstation may be computed in advance in accordance with the installation position of the camera 2, and can be stored in the process management table 12. The process management table 12 also stores information pertaining to a work step that is a benchmark. Information on the benchmark processes included in a work step that is a benchmark and a standard work time (standard time) for performing the work for each benchmark process may be stored in the process management table 12.”; ¶ 63 – “The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; ¶ 65 – “The determination unit 13 can count the number of frames of the captured image until the worker moves to the next step to thereby compute the work time for each process. The determination unit 13 may store the work time calculated for each process in the auxiliary storage device 103.”; ¶ 68 – “The standard time is defined in advance in accordance with the work content for each process and is stored in a process management table 12.”),
wherein the processor (fig. 1; ¶ 38 – “An example of the hardware configuration for the work analysis device 1 according to an embodiment is described with reference to FIG. 1. The work analysis device 1 is provided with a processor 101, a main storage device 102, an auxiliary storage device 103, a communication interface 104, and an output device 105. The processor 101 reads a program stored in the auxiliary storage device 103 into the main storage device 102 and executes the program to thereby implement the functional configurations described with FIG. 2 as functions.”) is configured to:
receive a status information operation via the input interface, the status information operation designating information on a specific facility in the status information on the plural facilities (¶ 51 – “The overall flow of process that analyzes a work step is described according to FIG. 3. FIG. 3 is a flowchart that is an example of work analysis processing. The work analysis processing in FIG. 3 presents an example where the captured images received from the camera 2 are parsed in order while the worker is performing a series of work steps, and a time chart generated after the worker concludes the work step. The time chart is not limited to being generated after the worker concludes the work step; the time chart may be generated in parallel with the receiving and parsing of captured images.” Work status information may be received as input from the camera; fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.); and
limit information to be displayed on the display among the status information on the plural facilities and the detection information on the plural areas, based on information on the specific facility designated by the status information operation (fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.).
[Claim 2] Kitazumi discloses wherein the status information on the plural facilities includes time spans each in which each facility is a predetermined state (¶ 66 – “The detector unit 11 (person detector unit 11A) determines whether or not the worker has completed the work step in step S23. The person detector unit 11A can determine that the worker has completed work step, for instance, when the person detector unit 11A does not detect a person in the captured image fed thereto from the reception unit 10. The person detector unit 11A may also determine that the worker has completed the work step when the worker changes orientation from the workstation G where the last process is performed to the workstation A where the first process is performed. The processing continues to step S24 when the series of work steps by the worker is completed (YES, at step S23). The processing returns to step S20 when the worker has not completed the work step (NO, at step S23). The processing from step S20 through step S22 is repeated for each frame of captured image fed in from the reception unit 10 between returning to step S20 and until the work steps are complete.”; ¶ 67 – “The time chart generation unit 14 generates a time chart in step S24 representing the flow of processes performed by the worker. The time chart generated may be presented on a display or the like, which is the output device 105. Here, an example of the time chart generation unit 14 generating a time chart is described using FIG. 7 and FIG. 8. FIG. 7 and FIG. 8 illustrate an example of a time chart where a worker X and a worker Y perform a work step that includes processes A to G.”; ¶ 68 – “The standard time is defined in advance in accordance with the work content for each process and is stored in a process management table 12. In the example in FIG. 7, the unit for the standard time is minutes. The Worker X field indicates the time the worker X needed to perform the work for each process. 
The Worker Y field indicates the time the worker Y needed to perform the work for each process. The time in the Worker X field and Worker Y field is indicated in minutes.” The amount of time spent performing a process at each of multiple workstations is an example of status information corresponding to time spans at each of the plurality of facilities.),
wherein the information on the specific facility, which is designated by the status information operation, indicates a time span in which the specific facility is the predetermined state (¶ 66 – “The detector unit 11 (person detector unit 11A) determines whether or not the worker has completed the work step in step S23. The person detector unit 11A can determine that the worker has completed work step, for instance, when the person detector unit 11A does not detect a person in the captured image fed thereto from the reception unit 10. The person detector unit 11A may also determine that the worker has completed the work step when the worker changes orientation from the workstation G where the last process is performed to the workstation A where the first process is performed. The processing continues to step S24 when the series of work steps by the worker is completed (YES, at step S23). The processing returns to step S20 when the worker has not completed the work step (NO, at step S23). The processing from step S20 through step S22 is repeated for each frame of captured image fed in from the reception unit 10 between returning to step S20 and until the work steps are complete.”; ¶ 67 – “The time chart generation unit 14 generates a time chart in step S24 representing the flow of processes performed by the worker. The time chart generated may be presented on a display or the like, which is the output device 105. Here, an example of the time chart generation unit 14 generating a time chart is described using FIG. 7 and FIG. 8. FIG. 7 and FIG. 8 illustrate an example of a time chart where a worker X and a worker Y perform a work step that includes processes A to G.”; ¶ 68 – “The standard time is defined in advance in accordance with the work content for each process and is stored in a process management table 12. In the example in FIG. 7, the unit for the standard time is minutes. 
The Worker X field indicates the time the worker X needed to perform the work for each process. The Worker Y field indicates the time the worker Y needed to perform the work for each process. The time in the Worker X field and Worker Y field is indicated in minutes.” The amount of time spent performing a process at each of multiple workstations is an example of status information corresponding to time spans at each of the plurality of facilities.), and
wherein the processor is configured to cause the limited detection information, which is to be displayed on the display, to include the detection result of the moving object in the time span designated by the status information operation (fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.).
[Claim 3] Kitazumi discloses wherein the processor is configured to limit the information to be displayed on the display among the detection information on the plural areas, based on the detection result of the moving object within the designated time span in each of the plural areas (fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.).
[Claim 6] Kitazumi discloses wherein the memory is configured to store management information managing a responsible moving object for each area among the plural areas (¶ 42 – “The reception unit 10 includes a function of receiving a captured image from the camera 2. The reception unit 10 transfers the captured image received to the detector unit 11. The reception unit 10 may store the captured image received in the auxiliary storage device 103.”; ¶ 44 – “The process management table 12 stores information pertaining to each process. The position information for a workstation may be stored in the process management table 12 in association with, for example, a process corresponding to aforesaid workstation. The position information for a workstation may be computed in advance in accordance with the installation position of the camera 2, and can be stored in the process management table 12. The process management table 12 also stores information pertaining to a work step that is a benchmark. Information on the benchmark processes included in a work step that is a benchmark and a standard work time (standard time) for performing the work for each benchmark process may be stored in the process management table 12.”; ¶ 63 – “The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; ¶ 65 – “The determination unit 13 can count the number of frames of the captured image until the worker moves to the next step to thereby compute the work time for each process. The determination unit 13 may store the work time calculated for each process in the auxiliary storage device 103.”; ¶ 68 – “The standard time is defined in advance in accordance with the work content for each process and is stored in a process management table 12.”), and
wherein the processor is configured to limit the information to be displayed on the display among the detection information on the plural areas, referring to the management information in response to the status information operation (¶ 51 – “The overall flow of process that analyzes a work step is described according to FIG. 3. FIG. 3 is a flowchart that is an example of work analysis processing. The work analysis processing in FIG. 3 presents an example where the captured images received from the camera 2 are parsed in order while the worker is performing a series of work steps, and a time chart generated after the worker concludes the work step. The time chart is not limited to being generated after the worker concludes the work step; the time chart may be generated in parallel with the receiving and parsing of captured images.” Work status information may be received as input from the camera.; fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). 
The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.).
[Claim 7] Kitazumi discloses wherein the memory is configured to store map information indicating positional relation between the plural facilities and the plural areas (¶ 44 – “The process management table 12 stores information pertaining to each process. The position information for a workstation may be stored in the process management table 12 in association with, for example, a process corresponding to aforesaid workstation. The position information for a workstation may be computed in advance in accordance with the installation position of the camera 2, and can be stored in the process management table 12. The process management table 12 also stores information pertaining to a work step that is a benchmark. Information on the benchmark processes included in a work step that is a benchmark and a standard work time (standard time) for performing the work for each benchmark process may be stored in the process management table 12.”; ¶ 63 – “The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”), and
wherein the processor is configured to limit the information to be displayed on the display among the detection information on the plural areas, based on positional relation between an area corresponding to the specific facility and the plural areas in the map information (¶ 44 – “The process management table 12 stores information pertaining to each process. The position information for a workstation may be stored in the process management table 12 in association with, for example, a process corresponding to aforesaid workstation. The position information for a workstation may be computed in advance in accordance with the installation position of the camera 2, and can be stored in the process management table 12. The process management table 12 also stores information pertaining to a work step that is a benchmark. Information on the benchmark processes included in a work step that is a benchmark and a standard work time (standard time) for performing the work for each benchmark process may be stored in the process management table 12.”; ¶ 63 – “The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. 
The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.).
[Claim 8] Kitazumi discloses wherein the processor is configured to limit the detection information to be displayed on the display by at least one of: narrowing down the detection information on the plural areas to the detection information to be displayed; or controlling a display order of the detection information to be displayed (¶ 44 – “The process management table 12 stores information pertaining to each process. The position information for a workstation may be stored in the process management table 12 in association with, for example, a process corresponding to aforesaid workstation. The position information for a workstation may be computed in advance in accordance with the installation position of the camera 2, and can be stored in the process management table 12. The process management table 12 also stores information pertaining to a work step that is a benchmark. Information on the benchmark processes included in a work step that is a benchmark and a standard work time (standard time) for performing the work for each benchmark process may be stored in the process management table 12.”; ¶ 63 – “The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. 
The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.).
[Claim 9] Kitazumi discloses wherein the processor is configured to limit the status information to be displayed on the display by excluding information on facilities other than the specific facility among the plural facilities, from the status information to be displayed on the display (¶ 44 – “The process management table 12 stores information pertaining to each process. The position information for a workstation may be stored in the process management table 12 in association with, for example, a process corresponding to aforesaid workstation. The position information for a workstation may be computed in advance in accordance with the installation position of the camera 2, and can be stored in the process management table 12. The process management table 12 also stores information pertaining to a work step that is a benchmark. Information on the benchmark processes included in a work step that is a benchmark and a standard work time (standard time) for performing the work for each benchmark process may be stored in the process management table 12.”; ¶ 63 – “The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. 
The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed. In other words, information that is not relevant (e.g., information related to processes and corresponding workstations not detected) is not displayed (i.e., it is excluded from the display).).
[Claim 10] Kitazumi discloses wherein the processor is configured to:
receive a detection information operation via the input interface, the detection information operation designating detection information on a specific area in the detection information displayed on the display (¶ 51 – “The overall flow of process that analyzes a work step is described according to FIG. 3. FIG. 3 is a flowchart that is an example of work analysis processing. The work analysis processing in FIG. 3 presents an example where the captured images received from the camera 2 are parsed in order while the worker is performing a series of work steps, and a time chart generated after the worker concludes the work step. The time chart is not limited to being generated after the worker concludes the work step; the time chart may be generated in parallel with the receiving and parsing of captured images.” Work status information may be received as input from the camera.; fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). 
The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.); and
cause the display to display predetermined information related to a detection result of the specific area, in response to the detection information operation (¶ 51 – “The overall flow of process that analyzes a work step is described according to FIG. 3. FIG. 3 is a flowchart that is an example of work analysis processing. The work analysis processing in FIG. 3 presents an example where the captured images received from the camera 2 are parsed in order while the worker is performing a series of work steps, and a time chart generated after the worker concludes the work step. The time chart is not limited to being generated after the worker concludes the work step; the time chart may be generated in parallel with the receiving and parsing of captured images.” Work status information may be received as input from the camera.; fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.); and
limit information to be displayed on the display among the status information on the plural facilities and the detection information on the plural areas, based on information on the specific facility designated by the status information operation (fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.).
[Claim 11] Kitazumi discloses wherein the status information is a first timeline indicating a state where a corresponding facility operates in chronological order (¶ 85 – “In step S26 of FIG. 3, the output unit 17 presents the time chart generated in step S24 and the result of the analysis in step S25 on a display or the like provided to the work analysis device 1. The output unit 17 may be configured to switch between presenting the time chart and presenting the analysis result in accordance with an instruction from the user. The output unit 17 may also be configured to switch the display format of the time chart (e.g., display formats such as a table, a graph, etc.) in accordance with an instruction from the user.”; figs. 7, 8, 9, 11, ¶¶ 26-28, 30 – Various display formats are available, including formats that present data in chronological order, and a user can select which display format to view.), and
wherein the detection information is a second timeline indicating a detection result of the moving object in a corresponding area in chronological order (¶ 85 – “In step S26 of FIG. 3, the output unit 17 presents the time chart generated in step S24 and the result of the analysis in step S25 on a display or the like provided to the work analysis device 1. The output unit 17 may be configured to switch between presenting the time chart and presenting the analysis result in accordance with an instruction from the user. The output unit 17 may also be configured to switch the display format of the time chart (e.g., display formats such as a table, a graph, etc.) in accordance with an instruction from the user.”; figs. 7, 8, 9, 11, ¶¶ 26-28, 30 – Various display formats are available, including formats that present data in chronological order, and a user can select which display format to view.).
[Claim 14] Kitazumi discloses wherein the processor is configured to exclude non-designated status information from the status information to be displayed, the non-designated status information indicating a state of another facility than the specific facility, which is designated by the status information operation, among the plural facilities (¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.” In other words, the system knows which workstations are of interest in light of the processes being tracked and the corresponding information is what is presented on a display.; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed. In other words, information that is not relevant (e.g., information related to processes and corresponding workstations not detected) is not displayed (i.e., it is excluded from the display).).
[Claim 12] Kitazumi discloses an information analysis method for causing a computer to display information on a moving object in plural areas corresponding to plural facilities (¶ 34 – “The work analysis device 1 can generate a time chart representing the flow of work step processes by the worker by measuring the work time [of the worker] at each of the workstations. The work analysis device 1 analyzes whether the workstations are arranged appropriately, or whether the work step processes and the like by the worker are appropriate by comparing the time chart generated with a benchmark time chart prepared in advance. The result of the analysis by the work analysis device 1 is presented to the user. The user can use the analysis result from the work analysis device 1 to change the layout of the workstations, exchange the parts placed at a workstation, revise the benchmark time chart, or the like.”; ¶ 64 – “The determination unit 13 acquires the position information for the travel areas a to c from the process management table 12 and determines in which travel area the worker is present on the basis of the position information of the worker detected in step S21. The determination unit 13 also acquires the position information of the workstations A to G from the process management table 12 and can determine at which workstation work is being performed on the basis of information on the position and orientation of the worker detected in step S21. That is, the determination unit 13 can determine the process for which a worker is performing work. The determination unit 13 can also determine the time for a worker to transition from a process the worker is currently performing to the next process.”; NOTE: The Application’s Specification describes facilities as areas in a workplace, as seen in Spec. ¶ 12), the information analysis method comprising:
causing a memory of the computer to store status information and detection information, the status information indicating a state where each facility in the plural facilities operates, and the detection information indicating a detection result of the moving object in each area in the plural areas (¶ 42 – “The reception unit 10 includes a function of receiving a captured image from the camera 2. The reception unit 10 transfers the captured image received to the detector unit 11. The reception unit 10 may store the captured image received in the auxiliary storage device 103.”; ¶ 44 – “The process management table 12 stores information pertaining to each process. The position information for a workstation may be stored in the process management table 12 in association with, for example, a process corresponding to aforesaid workstation. The position information for a workstation may be computed in advance in accordance with the installation position of the camera 2, and can be stored in the process management table 12. The process management table 12 also stores information pertaining to a work step that is a benchmark. Information on the benchmark processes included in a work step that is a benchmark and a standard work time (standard time) for performing the work for each benchmark process may be stored in the process management table 12.”; ¶ 63 – “The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; ¶ 65 – “The determination unit 13 can count the number of frames of the captured image until the worker moves to the next step to thereby compute the work time for each process. The determination unit 13 may store the work time calculated for each process in the auxiliary storage device 103.”; ¶ 68 – “The standard time is defined in advance in accordance with the work content for each process and is stored in a process management table 12.”); and
causing a processor of the computer (¶ 85 – “The output unit 17 may also be configured to switch the display format of the time chart (e.g., display formats such as a table, a graph, etc.) in accordance with an instruction from the user.”) to:
receive a status information operation via an input interface, the status information operation designating information on a specific facility in the status information on the plural facilities (¶ 51 – “The overall flow of process that analyzes a work step is described according to FIG. 3. FIG. 3 is a flowchart that is an example of work analysis processing. The work analysis processing in FIG. 3 presents an example where the captured images received from the camera 2 are parsed in order while the worker is performing a series of work steps, and a time chart generated after the worker concludes the work step. The time chart is not limited to being generated after the worker concludes the work step; the time chart may be generated in parallel with the receiving and parsing of captured images.” Work status information may be received as input from the camera.; Fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.); and
limit information to be displayed on a display among the status information on the plural facilities and the detection information on the plural areas, based on information on the specific facility designated by the operation in the status information (Fig. 6, ¶ 63 – “FIG. 6 is a diagram for describing a method for determining a process being performed; FIG. 6 illustrates a work area for performing a work step that includes processes A through G. Workstations corresponding to the each of the processes A through G (described below as workstations A to G, respectively) are installed in the work area. The area enclosing the workstations A to G is the travel area in which a worker moves while working. The travel area is divided into three travel areas a to c. The travel area a encloses workstation C, workstation D, and workstation E. The travel area b encloses workstation B and workstation F. The travel area c encloses workstation A and workstation G. The position information for workstations A to G and travel areas a to c is stored in advance in the process management table 12.”; Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.).
[Claim 13] Claim 13 recites limitations already addressed by the rejection of claim 12 above; therefore, the same rejection applies.
Furthermore, Kitazumi discloses a non-transitory computer-readable recording medium storing a program for causing a processor of a computer to perform the information analysis method according to claim 12 (¶ 38, claim 8).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 4-5 are rejected under 35 U.S.C. 103 as being unpatentable over Kitazumi et al. (US 2022/0215327), as applied to claim 1 above, in view of Yerian et al. (Yerian, Lisa M. et al. "A Collaborative Approach to Lean Laboratory Workstation Design Reduces Wasted Technologist Travel." Am J Clin Pathol 2012; 138: 273-280; DOI: 10.1309/AJCPE0PI2ENWYWMU.).
[Claim 4] Kitazumi does not explicitly disclose wherein the processor is configured to limit the information to be displayed on the display among the detection information on the plural areas, based on a frequency at which the moving object moves between an area corresponding to the specific facility and the other areas. However, Yerian evaluates work efficiency in an environment in which workstation layout has a great effect on workers’ ability to perform a variety of tasks efficiently (Yerian: p. 273 – Abstract: “Lean methodologies have been applied in many industries to reduce waste. We applied Lean techniques to redesign laboratory workstations with the aim of reducing the number of times employees must leave their workstations to complete their tasks. At baseline in 68 workflows (aggregates or sequence of process steps) studied, 251 (38%) of 664 tasks required workers to walk away from their workstations. After analysis and redesign, only 59 (9%) of the 664 tasks required technologists to leave their workstations to complete these tasks. On average, 3.4 travel events were removed for each workstation. Time studies in a single laboratory section demonstrated that workers spend 8 to 70 seconds in travel each time they step away from the workstation. The redesigned workstations will allow employees to spend less time travelling around the laboratory. Additional benefits include employee training in waste identification, improved overall laboratory layout, and identification of other process improvement opportunities in our laboratory.”). Yerian establishes a 5-foot-radius circle as an ideal work window/area (Yerian: p. 274: col. 2 – “Waste”). Yerian also identifies various ranges of distance traveled to perform different tasks (Yerian: p. 278: col. 1 – Fig. 2).
A number of travel events (as referenced by Yerian in the abstract) conveys a frequency of travel of a worker (i.e., a moving object) between one area of focus (i.e., an optimal work window/area) and other task areas located at varying distances from the optimal work window/area. Both Kitazumi and Yerian seek to optimize work processes and workstation layout (e.g., see ¶¶ 86-88 of Kitazumi and the abstract of Yerian). Like Yerian, Kitazumi displays relevant information of interest to accomplish its goals (Kitazumi: Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.). The Examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of Applicant’s invention to modify Kitazumi wherein the processor is configured to limit the information to be displayed on the display among the detection information on the plural areas, based on a frequency at which the moving object moves between an area corresponding to the specific facility and the other areas, in order to enhance Kitazumi’s work efficiency analysis with travel frequency information that provides more granular insights into additional sources of waste that can reduce worker efficiency in an environment where tasks are performed at multiple workstations (as suggested in the abstract of Yerian).
[Claim 5] Kitazumi does not explicitly disclose wherein the processor is configured to limit the information to be displayed on the display, based on a frequency at which the moving object moves between a predetermined area and any area of the plural areas, the predetermined area being different from the plural areas corresponding to the plural facilities. However, Yerian evaluates work efficiency in an environment in which workstation layout has a great effect on workers’ ability to perform a variety of tasks efficiently (Yerian: p. 273 – Abstract: “Lean methodologies have been applied in many industries to reduce waste. We applied Lean techniques to redesign laboratory workstations with the aim of reducing the number of times employees must leave their workstations to complete their tasks. At baseline in 68 workflows (aggregates or sequence of process steps) studied, 251 (38%) of 664 tasks required workers to walk away from their workstations. After analysis and redesign, only 59 (9%) of the 664 tasks required technologists to leave their workstations to complete these tasks. On average, 3.4 travel events were removed for each workstation. Time studies in a single laboratory section demonstrated that workers spend 8 to 70 seconds in travel each time they step away from the workstation. The redesigned workstations will allow employees to spend less time travelling around the laboratory. Additional benefits include employee training in waste identification, improved overall laboratory layout, and identification of other process improvement opportunities in our laboratory.”). Yerian establishes a 5-foot-radius circle as an ideal work window/area (Yerian: p. 274: col. 2 – “Waste”). Yerian also identifies various ranges of distance traveled to perform different tasks (Yerian: p. 278: col. 1 – Fig. 2).
A number of travel events (as referenced by Yerian in the abstract) conveys a frequency of travel of a worker (i.e., a moving object) between one area of focus (i.e., an optimal work window/area) and other task areas located at varying distances from the optimal work window/area. Yerian provides a template for evaluating a predetermined area (such as the optimal work window/area) and any other potential workstation areas (both within a closer range of distance and outside of a closer range of distance), as seen in Image 1 on page 276 of Yerian and in Figure 2 on page 278 of Yerian. Both Kitazumi and Yerian seek to optimize work processes and workstation layout (e.g., see ¶¶ 86-88 of Kitazumi and the abstract of Yerian). Like Yerian, Kitazumi displays relevant information of interest to accomplish its goals (Kitazumi: Fig. 8 – Processes A through G are tracked from start to finish, with time frames measured for each respective process (which is performed at a given workstation). The nature of the processes and corresponding workstations at which the processes are performed affects which information is displayed.). The Examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of Applicant’s invention to modify Kitazumi wherein the processor is configured to limit the information to be displayed on the display, based on a frequency at which the moving object moves between a predetermined area and any area of the plural areas, the predetermined area being different from the plural areas corresponding to the plural facilities, in order to enhance Kitazumi’s work efficiency analysis with travel frequency information that provides more granular insights into additional sources of waste that can reduce worker efficiency in an environment where tasks are performed at multiple workstations (as suggested in the abstract of Yerian).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Itou et al. (US 2020/0042921) – Tracks the work efficiency of workers moving between work sections.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SUSANNA M DIAZ whose telephone number is (571)272-6733. The examiner can normally be reached M-F, 8 am-4:30 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Brian Epstein can be reached at (571) 270-5389. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SUSANNA M. DIAZ/
Primary Examiner
Art Unit 3625