DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This is in response to applicant's amendment/response filed on 10/22/2025, which has been entered and made of record. Claims 1 and 3-5 have been amended. Claim 2 has been cancelled. Claims 6-9 have been newly added. Claims 1 and 3-9 are pending in the application.
Response to Arguments
Applicant's arguments filed on 10/22/2025 have been fully considered but they are not persuasive. Applicant has submitted newly amended claims. Accordingly, new grounds of rejection are set forth below. The new grounds of rejection have been necessitated by Applicant's amendments to the claims.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1 and 3-9 are rejected under 35 U.S.C. 103 as being unpatentable over Japanese Publication JP2021-125183 to Kitazumi et al. in view of U.S. PGPub 2024/0071085 to Mori et al.
Regarding claim 1, Kitazumi et al. teach a workload analysis device comprising (abstract, a work analysis device): memory storing instructions; and at least one processor, wherein the instructions, when executed by the at least one processor, cause the workload analysis device to (par 0029, “The workload analyzer 10 includes one or more processors, a main storage device, an auxiliary storage device, a communication device, an input device, and an output device as hardware components thereof, and the processor executes a computer program to: Execute various processes of. Note that some or all of the processing may be executed by a dedicated hardware circuit”):
acquire, from an imaging device, a plurality of images in a time series that capture a work flow (Fig 1, par 0034-0035, “The image acquisition unit 11 may acquire an image in real time at the same time as the image is captured by the camera 20, or the image stored in the storage device by the camera 20 may be acquired. The following steps S12 to S14 may be performed for each frame of the camera image, or may be performed at predetermined frame intervals. Further, when a plurality of workers who perform a plurality of work processes are transferred to the camera image, processing is performed for each of the plurality of work processes (workers)”);
obtain, by detecting a vehicle body in an image, a process segment comprising a sequence of images, from among the plurality of images (Fig 1, par 0025-0027, “In the example of FIG. 1, a camera 20 is installed on the ceiling of the work place, and an image is taken of a production line including a plurality of workers P1 and P2. The image captured by the camera 20 is captured by the workload analyzer 10. The workload analyzer 10 analyzes the image and determines whether or not the worker is working. By continuously determining whether or not work is in progress, the workload for each process can be obtained”, par 0040, “the work determination unit 12 determines that the work is in progress, and if it is smaller than the threshold value, the work determination unit 12 determines that the work is not in progress” …. Identify work process (an assemble job) from …P1 to P2 … and assemble part from start to finish (left to right)), wherein:
the sequence begins with an image capturing a head of a work area for the vehicle body reaching a predetermined position (Fig 1, par 0025-0027, an assembly job proceeds from left to right),
the sequence ends with an image capturing a rear end of the work area reaching the predetermined position (Fig 1, par 0025-0027, an assembly job proceeds from left to right), and
the predetermined position is set perpendicular to a flow direction of the work flow (Fig 1, the positions of workers P1 and P2 are set perpendicular to the flow direction); and
detect a worker in the process segment (par 0030, “The work determination unit 12 determines whether or not the worker is working based on the camera image, and stores the determination result in the storage unit 15. …. The work determination unit 12 determines whether or not the worker is working based on at least one of the human body detection result of the human body detection unit 13 and the posture (skeleton point) detection result of the posture detection unit 14. In the embodiment, the start time and the end time of the work are stored in the storage unit 15. The graph generation / display unit 16 generates a graph showing the workload of each work process based on the work start time and work end time stored in the storage unit 15”).
Kitazumi et al., however, are silent regarding recognizing, by detecting a vehicle body in an image, a process segment comprising a sequence of images from among the plurality of images.
In a related endeavor, Mori et al. teach recognizing, by detecting a vehicle body in an image, a process segment comprising a sequence of images, from among the plurality of images (par 0075, “Flow line information 70 indicates information acquired by the analysis system by analyzing trajectory information 80 based on time-series elements (ti,(xi,yi),Di) of corresponding trajectory information 80. Flow line information 70 indicates time during which the worker stays in later-described monitoring area Ar that is a predetermined area of each process Pr.”, par 0099-0101, par 0115, “Processor 21 acquires flow line information 70 based on trajectory information 80 as detector 204 (step S4b). Specifically, processor 21 compares the coordinate (xi,yi) of the elements (ti,(xi,yi),Di) constituting trajectory information 80 with the coordinates indicated by monitoring area information 243 of each process Pr, and processor 21 detects that worker Pe exists in monitoring area Ar at corresponding time ti when determining that the coordinate (xi,yi) indicates the position in monitoring area Ar based on the comparison result. When such time ti is detected for monitoring area Ar of each process Pr, the stay time indicated by the IN time to the OUT time of each process Pr constituting flow line information 70 is detected”), wherein: the sequence begins with an image capturing a head of a work area for the vehicle body reaching a predetermined position, the sequence ends with an image capturing a rear end of the work area reaching the predetermined position, and the predetermined position is set perpendicular to a flow direction of the work flow (Fig 5, par 0073-0097, “FIG. 5 is a view illustrating an example of an image of the embodiment. The image in FIG. 5 illustrates a frame of the moving image obtained by camera 50 capturing the image of workplace 2 including five processes Pr. As illustrated in FIG. 
5, each frame of the moving image includes the image of workplace 2 and the image of a worker Pe when the worker exists in workplace 2. Each frame of the moving image is associated with the imaging time specified using a time synchronization server (not illustrated) …. For each process Pr, processor 21 specifies a plurality of consecutive frames in time series in which it is determined that worker Pe exists in monitoring area Ar corresponding to process Pr from the moving image. Processor 21 produces a record including the process ID identifying the process Pr for the plurality of specified frames. The record further includes a set of an IN time and an OUT time indicating the stay time. Specifically, processor 21 determines the imaging time of the first frame among the plurality of specified frames as the IN time, and determines the imaging time of the last frame among the specified frames as the OUT time. The set of the determined IN time and OUT time indicates the stay time that is time of a length from the IN time to the OUT time. Processor 21 generates flow line information 70 including the record of each process Pr produced in this way”, Figs 6-7, par 0099, “trajectory information 80 acquired when worker Pe wearing green hat 8 moves through processes Pr(1) to Pr(6) in this order from the entry to the exit of workplace 2 is illustrated in FIG. 6. As illustrated in FIG. 7, flow line information 70 acquired from trajectory information 80 in FIG. 6 includes the color ID (green: G) indicating the color type of hat 8 worn by worker Pe, the worker ID (“124”) corresponding to the color type, and the record corresponding to each of processes Pr(1) to Pr(6). Each record includes the set of IN time and OUT time indicating the stay time of worker Pe in monitoring area Ar of corresponding process Pr.”)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Kitazumi et al. to include recognizing, by detecting a vehicle body in an image, a process segment comprising a sequence of images from among the plurality of images, as taught by Mori et al., in order to accurately calculate each identified work type based on the relationship between the first time period and the second time period, thereby improving the work process.
Regarding claim 3, Kitazumi et al. as modified by Mori et al. teach all the limitations of claim 1, and Kitazumi et al. further teach wherein the instructions, when executed by the at least one processor, cause the workload analysis device to store, in a storage, a worktime of the worker based on an amount of time over which the worker is detected in the process segment (par 0030, “The work determination unit 12 determines whether or not the worker is working based on the camera image, and stores the determination result in the storage unit 15. …. the start time and the end time of the work are stored in the storage unit 15”, par 0053, “the work determination unit 12 stores the work start time and the work end time in the storage unit 15. The work determination unit 12 stores the work state (whether working or non-working) by the worker in the immediately preceding frame. When the work state changes from non-working to working in the current frame, the work determination unit 12 stores the time of the current frame as the work start time in the storage unit 15. Further, when the work determination unit 12 changes the work state from working to non-work, the time of the current frame is stored in the storage unit 15 as the work end time. The work start time and work end time are stored in association with the work process.”).
Regarding claim 4, Kitazumi et al. as modified by Mori et al. teach all the limitations of claim 3, and further teach wherein the instructions, when executed by the at least one processor, cause the workload analysis device to: detect a plurality of workers in the process segment, each of the plurality of workers having an identifier; store, in the storage, a number of the plurality of workers and worktimes of the plurality of workers based on recognizing the identifiers of the plurality of workers (Kitazumi et al.: par 0030, “The work determination unit 12 determines whether or not the worker is working based on the camera image, and stores the determination result in the storage unit 15. …. the start time and the end time of the work are stored in the storage unit 15”, par 0053, “the work determination unit 12 stores the work start time and the work end time in the storage unit 15. The work determination unit 12 stores the work state (whether working or non-working) by the worker in the immediately preceding frame. When the work state changes from non-working to working in the current frame, the work determination unit 12 stores the time of the current frame as the work start time in the storage unit 15. Further, when the work determination unit 12 changes the work state from working to non-work, the time of the current frame is stored in the storage unit 15 as the work end time. The work start time and work end time are stored in association with the work process.” ….store time for each worker in the work flow, Mori et al.: par 0099, “ trajectory information 80 acquired when worker Pe wearing green hat 8 moves through processes Pr(1) to Pr(6) in this order from the entry to the exit of workplace 2 is illustrated in FIG. 6. As illustrated in FIG. 7, flow line information 70 acquired from trajectory information 80 in FIG. 
6 includes the color ID (green: G) indicating the color type of hat 8 worn by worker Pe, the worker ID (“124”) corresponding to the color type, and the record corresponding to each of processes Pr(1) to Pr(6). Each record includes the set of IN time and OUT time indicating the stay time of worker Pe in monitoring area Ar of corresponding process Pr.”, par 0116, “the work time during which worker Pe performs the work is detected out of the stay time of worker Pe in process Pr. Specifically, worker Pe wears hat 8 such that the orientation of the tip of the brim of hat 8 coincides with the orientation of the front of the face, so that the orientation of the face of worker Pe can be indicated by orientation Di. Processor 21 detects the degree to which orientation Di of worker Pe staying in process Pr coincides with a predetermined orientation corresponding to the posture of the worker during the work. It is determined that time ti at which the detected degree of coincidence exceeds a predetermined value corresponds to the work time of worker Pe”, par 0122, “FIG. 16 illustrates the flow line chart of a broken line based on flow line information 70 with the worker ID of “ID1” and the flow line chart of a solid line based on flow line information 70 with the worker ID of “ID3”. The flow line chart illustrates the work time in each process Pr. In the traffic line chart, the stay time in each process Pr may be indicated instead of the work time” ……identify the worker based on color during workflow and worktime).
Regarding claim 5, Kitazumi et al. as modified by Mori et al. teach all the limitation of claim 4, and further teach wherein the instructions, when executed by the at least one processor, cause the workload analysis device to: generate a graph in which workloads in the process segment are visualized based on the worktimes of the plurality of workers and the number of the plurality of workers stored in the storage and output the generated graph to a display (Kitazumi et al.: par 0030, “The graph generation / display unit 16 generates a graph showing the workload of each work process based on the work start time and work end time stored in the storage unit 15, and displays the graph on the display unit 16. The details of the workload analyzer 10 will be described below together with the flowchart”, Fig 4, par 0055, “the graph generation / display unit 16 generates a graph representing the workload based on the time during which the worker is working and the time during which the worker is not working for each work process. Specifically, the graph generation / display unit 16 obtains the working time and the non-working time based on the work start time and the work end time for each work process stored in the storage unit 15, and works. Generate a graph showing the workload for each process”, Mori et al.: Fig 16, par 0122, “Display device 170 of analysis device 10 displays the image based on visualized data 17a acquired by the analysis processing of flow line information 70. In the flow line chart of FIG. 16 based on visualized data 17a, a horizontal axis represents time, and a vertical axis represents processes Pr(1) to Pr(6). FIG. 16 illustrates the flow line chart of a broken line based on flow line information 70 with the worker ID of “ID1” and the flow line chart of a solid line based on flow line information 70 with the worker ID of “ID3”. The flow line chart illustrates the work time in each process Pr. 
In the traffic line chart, the stay time in each process Pr may be indicated instead of the work time”).
Regarding claim 6, Kitazumi et al. as modified by Mori et al. teach all the limitation of claim 1, and further teach wherein the instructions, when executed by the at least one processor, cause the workload analysis device to determine whether the head of the work area or the rear end of the work area has reached the predetermined position by performing recognition of a shape of the vehicle body using a machine learning model (Kitazumi et al.: Fig 1, par 0025-0027, “In the example of FIG. 1, a camera 20 is installed on the ceiling of the work place, and an image is taken of a production line including a plurality of workers P1 and P2. The image captured by the camera 20 is captured by the workload analyzer 10. The workload analyzer 10 analyzes the image and determines whether or not the worker is working. By continuously determining whether or not work is in progress, the workload for each process can be obtained”, par 0040, “the work determination unit 12 determines that the work is in progress, and if it is smaller than the threshold value, the work determination unit 12 determines that the work is not in progress”, par 0036, “In step S12, the human body detection unit 13 detects an estimated region (bounding box) in which the human body exists from the camera image. Human body detection may be performed using any existing algorithm. For example, human body detection can be performed using a classifier learned by statistical machine learning based on HoG (Histogram of Gradient) features or Haar-like features. Further, for example, the human body can be detected by using a discriminator learned by a neural network such as deep learning. Further, a classifier combining a plurality of weak classifiers learned by ensemble learning such as boosting and bagging may be used”, Mori et al.: Fig 5, par 0073-0097, “FIG. 5 is a view illustrating an example of an image of the embodiment. The image in FIG. 
5 illustrates a frame of the moving image obtained by camera 50 capturing the image of workplace 2 including five processes Pr. As illustrated in FIG. 5, each frame of the moving image includes the image of workplace 2 and the image of a worker Pe when the worker exists in workplace 2. Each frame of the moving image is associated with the imaging time specified using a time synchronization server (not illustrated) …. For each process Pr, processor 21 specifies a plurality of consecutive frames in time series in which it is determined that worker Pe exists in monitoring area Ar corresponding to process Pr from the moving image. Processor 21 produces a record including the process ID identifying the process Pr for the plurality of specified frames. The record further includes a set of an IN time and an OUT time indicating the stay time. Specifically, processor 21 determines the imaging time of the first frame among the plurality of specified frames as the IN time, and determines the imaging time of the last frame among the specified frames as the OUT time. The set of the determined IN time and OUT time indicates the stay time that is time of a length from the IN time to the OUT time. Processor 21 generates flow line information 70 including the record of each process Pr produced in this way”, Figs 6-7, par 0099, “trajectory information 80 acquired when worker Pe wearing green hat 8 moves through processes Pr(1) to Pr(6) in this order from the entry to the exit of workplace 2 is illustrated in FIG. 6. As illustrated in FIG. 7, flow line information 70 acquired from trajectory information 80 in FIG. 6 includes the color ID (green: G) indicating the color type of hat 8 worn by worker Pe, the worker ID (“124”) corresponding to the color type, and the record corresponding to each of processes Pr(1) to Pr(6). Each record includes the set of IN time and OUT time indicating the stay time of worker Pe in monitoring area Ar of corresponding process Pr.”).
Regarding claim 7, Kitazumi et al. as modified by Mori et al. teach all the limitation of claim 1, and Mori et al. further teach wherein the instructions, when executed by the at least one processor, cause the workload analysis device to: repeatedly determine whether the head of the work area has reached the predetermined position until the image capturing the head of the work area reaching the predetermined position is identified; start measuring an amount of time based on the image capturing the head of the work area being identified; repeatedly determine whether the rear end of the work area has reached the predetermined position until the image capturing the rear end of the work area reaching the predetermined position is identified; stop measuring the amount of time based on the image capturing the rear end of the work area being identified; and determine a worktime for the process segment based on the measured amount of elapsed time (par 0058, “analysis device 10 may extract a time slot in which machine 40 operated for each process Pr based on control data 345, and generate the visualization data visualizing an extracted operation time slot in association with the stay time in process Pr”, par 0071, “Detection information 347 includes at least one moving image file 45, and flow line information 70 and trajectory information 80 that are associated with each moving image file 45. For example, flow line information 70 and trajectory information 80 have a format of a comma separated values (CSV) file. Moving image file 45 constitutes the moving image including a plurality of time-series frames captured by camera 50”, par 0075, “Flow line information 70 indicates information acquired by the analysis system by analyzing trajectory information 80 based on time-series elements (ti,(xi,yi),Di) of corresponding trajectory information 80. 
Flow line information 70 indicates time during which the worker stays in later-described monitoring area Ar that is a predetermined area of each process Pr “, par 0093-0097, “FIG. 5 is a view illustrating an example of an image of the embodiment. The image in FIG. 5 illustrates a frame of the moving image obtained by camera 50 capturing the image of workplace 2 including five processes Pr. As illustrated in FIG. 5, each frame of the moving image includes the image of workplace 2 and the image of a worker Pe when the worker exists in workplace 2. Each frame of the moving image is associated with the imaging time specified using a time synchronization server (not illustrated) …. Processor 21 determines whether worker Pe exists in monitoring area Ar set for each process Pr at the imaging time of each frame….. For each process Pr, processor 21 specifies a plurality of consecutive frames in time series in which it is determined that worker Pe exists in monitoring area Ar corresponding to process Pr from the moving image. Processor 21 produces a record including the process ID identifying the process Pr for the plurality of specified frames. The record further includes a set of an IN time and an OUT time indicating the stay time. Specifically, processor 21 determines the imaging time of the first frame among the plurality of specified frames as the IN time, and determines the imaging time of the last frame among the specified frames as the OUT time. The set of the determined IN time and OUT time indicates the stay time that is time of a length from the IN time to the OUT time. Processor 21 generates flow line information 70 including the record of each process Pr produced in this way”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding claim 8, Kitazumi et al. as modified by Mori et al. teach all the limitations of claim 4, and Mori et al. further teach wherein the identifier comprises one or more characters or symbols added to a helmet or clothes of each of the plurality of workers (Fig 1, par 0048, “In workplace 2, for example, a worker is uniquely identified by a color type of a hat 8 worn by the worker. The color of hat 8 is a type of color that is not used for equipment, members, or the like disposed in workplace 2, and is more preferably a bright and vivid type of color.”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding claim 9, Kitazumi et al. as modified by Mori et al. teach all the limitations of claim 6, and Kitazumi et al. further teach wherein the machine learning model comprises a convolutional neural network (CNN) or a recurrent neural network (RNN) (par 0050-0051, “The fifth method is whether or not the worker is working by using a trained model (AI model) that has been machine-learned in advance to receive information about the worker as input and output the work state of the worker. Is determined” ….. using an AI model or supervised learning model).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jin Ge whose telephone number is (571)272-5556. The examiner can normally be reached 8:00 to 5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Chan, can be reached at (571)272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
JIN GE
Examiner
Art Unit 2619
/JIN GE/ Primary Examiner, Art Unit 2619