Prosecution Insights
Last updated: April 19, 2026
Application No. 18/570,759

SYSTEM AND METHOD FOR SMART MANUFACTURING

Non-Final OA (§102 / §103)
Filed: Dec 15, 2023
Examiner: EDWARDS, TYLER B
Art Unit: 2488
Tech Center: 2400 — Computer Networks
Assignee: Cargill Incorporated
OA Round: 2 (Non-Final)

Predictions:
Grant Probability: 77% (Favorable)
Expected OA Rounds: 2-3
Expected Time to Grant: 2y 5m
Grant Probability with Interview: 91%

Examiner Intelligence

Career Allow Rate: 77%, above average (359 granted / 468 resolved; +18.7% vs TC avg)
Interview Lift: +14.5%, a moderate lift (resolved cases with vs. without an interview)
Typical Timeline: 2y 5m avg prosecution; 14 applications currently pending
Career History: 482 total applications across all art units

Statute-Specific Performance

§101: 3.3% (-36.7% vs TC avg)
§103: 44.0% (+4.0% vs TC avg)
§102: 25.4% (-14.6% vs TC avg)
§112: 13.2% (-26.8% vs TC avg)

Comparisons are against a Tech Center average estimate. Based on career data from 468 resolved cases.
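As a quick back-of-envelope check on the figures above (this recomputation is the editor's, not part of the dashboard data), the career allow rate and the implied Tech Center average follow directly from the reported counts:

```python
# Recompute the examiner statistics reported above from the raw counts.
granted, resolved = 359, 468

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")   # 76.7%, displayed rounded as 77%

# The report lists this rate as +18.7% over the Tech Center average,
# which implies a TC average of roughly:
tc_average = allow_rate - 0.187
print(f"Implied TC average: {tc_average:.1%}")  # about 58.0%
```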

Office Action

Grounds of rejection: §102, §103
DETAILED ACTION

This Office Action for U.S. Patent Application No. 18/570,759 is responsive to communications filed on 10/30/2025, in reply to the Non-Final Rejection of 07/30/2025. Currently, claims 1-7 and 14-20 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

In regard to claim 1, the Applicant submits that nothing has been identified in the cited references that describes at least stopping the conveyor table upon detecting the foreign object embedded in the product as set forth in the claim limitations. The Examiner respectfully agrees. Applicant's arguments, see Remarks, filed 10/30/2025, with respect to the rejection of claim 1 under 35 U.S.C. 102(a)(1) have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of Schmidt et al. (U.S. Publication No. 2021/0121922).

In regard to claims 2-7, these claims were also rejected under 35 U.S.C. 102(a)(1) on grounds similar to those applied to previously presented claim 1. Since claim 1 now stands rejected on a new ground under 35 U.S.C. 103 in view of Schmidt, the rejections of claims 2-7 have likewise been updated to reflect this new ground of rejection under 35 U.S.C. 103.

In regard to claim 14, the Applicant submits that the teachings of Kjaer and Matsunaga do not teach detecting an actual cycle time associated with the worker from the second image data based at least on body positions of the worker identified in the second image data and taking an action based on the condition and the cycle time, as recited in the claim limitations of claim 14. The Examiner respectfully disagrees.

The teachings of Matsunaga include determining a working hour of the worker based on position information of the worker in a work region. Paragraphs 86-87 note that "a time for which the worker stays and a time at which the worker stays can be regarded as a working hour and a stay time of a work performed in the object region" and "time information represented by a working hour or working time. In other words, data indicating when and where each worker works is acquired," and Fig. 6 shows a chart with a time axis that plots the working time of each employee, effectively showing a schedule of their working times throughout the day. Additionally, the "real-time processing" performed by the work evaluation system of Matsunaga uses these working hours as variables to perform processing. As such, monitoring the times at which the worker is present at a location and noting the working hours and working times can be described as determining cycle times associated with the worker, and using these values in the real-time processing can be described as taking actions based on these values. Applicant's arguments have therefore been found unpersuasive, and claim 14 remains rejected.

In regard to claims 15-20, these claims depend from independent claim 14. Since the arguments regarding the independent claim have been found unpersuasive, these dependent claims do not depend from an allowable base claim and shall also remain rejected.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103, which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-7 are rejected under 35 U.S.C. 103 as being unpatentable over Kjaer (WO 2020/161231 A1), hereinafter referred to as Kjaer, in view of Schmidt et al. (U.S. Publication No. 2021/0121922), hereinafter referred to as Schmidt.

In regard to claim 1, Kjaer teaches a method (Kjaer page 12, lines 27-29 noting methods for recognizing characteristic features or identifiers, such as colors, shapes, boundary shapes, and undesired objects in food items) comprising: receiving, by a controller, image data captured from a product traveling on a conveyor table in a food processing facility (Kjaer page 11, lines 14-15 noting a food processing device comprising a conveyer system with a plurality of individual conveyer lines; and Kjaer page 13, lines 12-13 noting the imaging system 9 captures pictures of the incoming food items and converts the pictures to image data. The image data is transmitted to the processor 8); determining, by the controller, at least one of a presence of a foreign object embedded within the product, a trim composition of the product, or an amount of meat on the product based on variations in texture and/or color identified from the image data (Kjaer page 3, lines 17-35 noting the processor indicator can be defined based on different predetermined identifiers in the food item, such as foreign objects, minimum or maximum fat content, totally and/or locally, unwanted objects such as membrane, fat, bones, etc., gaping structure, shape, size, color, etc.); and taking, by the controller, an action based on the determined presence of the foreign object embedded within the product, the trim composition of the product, or the amount of meat on the product (Kjaer page 7, lines 18-27 noting the processor can be configured to appoint the selected food item to a selected work station for further processing, and this could be based on specific requirements related to the need for further processing, such as ability to remove certain fragments. For example, quality characteristics may be based on determined trimming type; and Kjaer page 10, lines 4-23 noting the method of processing food items including the processor configured for receiving image data representing the food items, and taking actions based on further processing need, along with steps of processing selected food items, e.g. by trimming the food items to remove unwanted parts, e.g. fat, bone, cartilage, and/or in relation to other characteristics, e.g. identifiers mentioned herein).

However, Kjaer does not expressly disclose wherein the action comprises stopping the conveyor table upon detecting the foreign object embedded in the product.
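For illustration only (this sketch is not part of the prosecution record, and every name in it is hypothetical), the disputed limitation follows a familiar inspect-and-halt control pattern: score regions of the image by how far their texture or color deviates from their surroundings, and halt the conveyor when any region exceeds a threshold.

```python
# Hypothetical sketch of a detect-and-stop loop; all names are illustrative
# and do not reflect any implementation of record.

def detect_foreign_object(image_regions, threshold=0.8):
    """Flag any region whose texture/color deviation score exceeds a threshold."""
    # Stand-in for a real segmentation/classification step on image data.
    return any(region["deviation"] > threshold for region in image_regions)

def controller_step(image_regions, conveyor):
    """One controller pass: stop the conveyor table if a foreign object is found."""
    if detect_foreign_object(image_regions):
        conveyor["running"] = False   # stop the conveyor table
        return "alert: foreign object detected"
    return "ok"

conveyor = {"running": True}
frame = [{"deviation": 0.2}, {"deviation": 0.95}]  # one anomalous region
print(controller_step(frame, conveyor))  # alert path; conveyor stopped
print(conveyor["running"])               # False
```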
In the same field of endeavor, Schmidt teaches wherein the action comprises stopping the conveyor table upon detecting the foreign object embedded in the product (Schmidt paragraph 33 noting to stop the forward motion of the product stream (e.g., conveyor belt) upon the detection of a foreign object in the stream containing the pieces of meat. The processing of the raw data and/or image data to stop the forward motion of the product stream can result from electronic communication between the camera, the computer, the forwarding device (e.g. conveyor) and the illumination device(s)).

It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Schmidt, because both disclosures relate to the field of food processing streams that include conveyor belt systems, and both include processes to determine specifics about a meat product, such as the composition of the product or the existence of a foreign object within it, and to take actions upon the findings of the process that determine what occurs to the product and where it goes. As such, modified to incorporate the teachings of Schmidt, the teachings of Kjaer include all of the limitations presented in claim 1.

In regard to claim 2, Kjaer and Schmidt teach all of the limitations of claim 1 as discussed above. In addition, Kjaer teaches further comprising: determining, by the controller, the presence of the foreign object in the product based upon identifying portions in the product where the texture and/or color is different from the texture and/or color of surrounding portions (Kjaer page 3, lines 17-37 and page 4, lines 1-15 noting the processor indicator can be defined based on different predetermined identifiers in the food item, such as foreign objects, minimum or maximum fat content, totally and/or locally, unwanted objects such as membrane, fat, bones, etc., gaping structure, shape, size, color, etc. The process indicator may increase in number when the difference between e.g. the desired size, colour, colour variation, or shape and the actual size, colour, colour variation or shape increases). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Schmidt for the same reasons as discussed above in regard to claim 1.

In regard to claim 3, Kjaer and Schmidt teach all of the limitations of claim 2 as discussed above. In addition, Kjaer teaches further comprising raising an alert indicating detection of the foreign object (Kjaer page 6, lines 13-14 noting identification may include use of a light or similar electronically controlled identification means to illuminate or otherwise signal that a food item is selected for further processing). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Schmidt for the same reasons as discussed above in regard to claim 1.

In regard to claim 4, Kjaer and Schmidt teach all of the limitations of claim 1 as discussed above.
In addition, Kjaer teaches determining, by the controller, the trim composition of the product by determining a first area of the product having a first variation in the texture and/or color and a second area of the product having a second variation in the texture and/or color, wherein the first area is indicative of a lean content in the product and the second area is indicative of a fat content in the product (Kjaer page 3, lines 17-37 and page 4, lines 1-15 noting the processor indicator can be defined based on different predetermined identifiers in the food item, such as foreign objects, minimum or maximum fat content, totally and/or locally, unwanted objects such as membrane, fat, bones, etc., gaping structure, shape, size, color, etc. The process indicator may increase in number when the difference between e.g. the desired size, colour, colour variation, or shape and the actual size, colour, colour variation or shape increases); and activating, by the controller, a diverter gate associated with the conveyor table to sort the product into one of a plurality of combos based on the determined trim composition (Kjaer page 7, lines 1-2 noting an object diverter may be configured to separate the food items selected for further processing from the other food items; and Kjaer page 7, lines 18-27 noting the processor can be configured to appoint the selected food item to a selected work station for further processing, and this could be based on specific requirements related to the need for further processing, such as ability to remove certain fragments. For example, quality characteristics may be based on determined trimming type). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Schmidt for the same reasons as discussed above in regard to claim 1.

In regard to claim 5, Kjaer and Schmidt teach all of the limitations of claim 1 as discussed above. In addition, Kjaer teaches determining, by the controller, the trim composition of the product by determining a first area of the product having a first variation in the texture and/or color and a second area of the product having a second variation in the texture and/or color, wherein the first area is indicative of a lean content in the product and the second area is indicative of a fat content in the product (Kjaer page 3, lines 17-37 and page 4, lines 1-15 noting the processor indicator can be defined based on different predetermined identifiers in the food item, such as foreign objects, minimum or maximum fat content, totally and/or locally, unwanted objects such as membrane, fat, bones, etc., gaping structure, shape, size, color, etc. The process indicator may increase in number when the difference between e.g. the desired size, colour, colour variation, or shape and the actual size, colour, colour variation or shape increases); comparing, by the controller, the trim composition of the product with an expected trim composition of the product (Kjaer page 8, lines 11-13 noting activating the object divert for selected food items based on the processing indicator for the selected food items and the pre-defined threshold values or settings; and Kjaer page 7, lines 24-27 noting that for salmon fillets such quality characteristics may be based on determined trimming type, e.g. Trim A, Trim B, Trim C, and Trim D); and raising, by the controller, an alert upon determining that the trim composition of the product does not meet the expected trim composition of the product (Kjaer page 6, lines 13-14 noting identification may include use of a light or similar electronically controlled identification means to illuminate or otherwise signal that a food item is selected for further processing). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Schmidt for the same reasons as discussed above in regard to claim 1.

In regard to claim 6, Kjaer and Schmidt teach all of the limitations of claim 5 as discussed above. In addition, Kjaer teaches determining, by the controller, whether an alert threshold is reached upon determining that the trim composition of the product does not meet the expected trim composition of the product (Kjaer page 8, lines 11-13 noting activating the object divert for selected food items based on the processing indicator for the selected food items and the pre-defined threshold values or settings; and Kjaer page 7, lines 24-27 noting that for salmon fillets such quality characteristics may be based on determined trimming type, e.g. Trim A, Trim B, Trim C, and Trim D); and raising, by the controller, the alert upon determining that the alert threshold is reached (Kjaer page 6, lines 13-14 noting identification may include use of a light or similar electronically controlled identification means to illuminate or otherwise signal that a food item is selected for further processing). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Schmidt for the same reasons as discussed above in regard to claim 1.

In regard to claim 7, Kjaer and Schmidt teach all of the limitations of claim 1 as discussed above. In addition, Kjaer teaches determining, by the controller, the amount of meat on the product by determining a first area of the product having a first variation in the texture and/or color and a second area of the product having a second variation in the texture and/or color, wherein the first area is indicative of the amount of meat in the product and the second area is indicative of the amount of bone in the product (Kjaer page 3, lines 17-35 noting the processor indicator can be defined based on different predetermined identifiers in the food item, such as foreign objects, minimum or maximum fat content, totally and/or locally, unwanted objects such as membrane, fat, bones, etc., gaping structure, shape, size, color, etc.); and raising, by the controller, an alert upon determining that the amount of meat on the product is greater than a predetermined threshold (Kjaer page 6, lines 13-14 noting identification may include use of a light or similar electronically controlled identification means to illuminate or otherwise signal that a food item is selected for further processing). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Schmidt for the same reasons as discussed above in regard to claim 1.

Claims 14-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kjaer (WO 2020/161231 A1), hereinafter referred to as Kjaer, in view of Matsunaga (U.S. Publication No. 2021/0166180), hereinafter referred to as Matsunaga.

In regard to claim 14, Kjaer teaches a system (Kjaer page 11, lines 14-15 noting a food processing device comprising a conveyer system with a plurality of individual conveyer lines; and Kjaer page 13, lines 12-13 noting the imaging system 9 captures pictures of the incoming food items and converts the pictures to image data.
The image data is transmitted to the processor 8) comprising: a first computer vision system to capture first image data from a product travelling on a first conveyor table in a food processing facility (Kjaer page 11, lines 14-15 noting a food processing device comprising a conveyer system with a plurality of individual conveyer lines; and Kjaer page 13, lines 12-13 noting the imaging system 9 captures pictures of the incoming food items and converts the pictures to image data. The image data is transmitted to the processor 8); a processor that executes the computer-readable instructions (Kjaer page 12, lines 12-14 noting processor 8 comprises a CPU and corresponding software code configuring the CPU. The processor is configured to receive image data from the imaging system 9 located upstream relative to the workstations) to: detect a condition associated with the product from the first image data based at least on a variation in texture and/or color in the first image data (Kjaer page 3, lines 17-35 noting the processor indicator can be defined based on different predetermined identifiers in the food item, such as foreign objects, minimum or maximum fat content, totally and/or locally, unwanted objects such as membrane, fat, bones, etc., gaping structure, shape, size, color, etc.); and take action based on the condition (Kjaer page 7, lines 18-27 noting the processor can be configured to appoint the selected food item to a selected work station for further processing, and this could be based on specific requirements related to the need for further processing, such as ability to remove certain fragments. For example, quality characteristics may be based on determined trimming type; and Kjaer page 10, lines 4-23 noting the method of processing food items including the processor configured for receiving image data representing the food items, and taking actions based on further processing need, along with steps of processing selected food items, e.g. by trimming the food items to remove unwanted parts, e.g. fat, bone, cartilage, and/or in relation to other characteristics, e.g. identifiers mentioned herein) and worker workload/status (Kjaer page 7, lines 18-22 noting the processor could be configured to appoint the selected food items to a selected workstation for the further processing. The appointment could be based on workload at the individual workstations or it could be based on a combination between specific skills of a workstation or an operator at a workstation and specific requirements related to the need for further processing).

However, Kjaer does not expressly disclose a second computer vision system to capture second image data from a worker working at the first conveyor table; detect an actual cycle time associated with the worker from the second image data based at least on body positions of the worker identified from the second image data; and take action based on the cycle time.

In the same field of endeavor, Matsunaga teaches a second computer vision system to capture second image data from a worker working at the first conveyor table; a memory having computer-readable instructions stored thereon (Matsunaga paragraph 141 noting information processing apparatus 900 includes a central processing unit (CPU) 901, a read only memory (ROM) 903, and a random-access memory (RAM) 905. Further, the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925. Furthermore, the information processing apparatus 900 may include an image capturing device 933 and a sensor 935); detect an actual cycle time associated with the worker from the second image data based at least on body positions of the worker identified from the second image data (Matsunaga paragraph 8 noting a work identification unit that identifies a work content and a working hour of a worker based on time-series data of position information of the worker at least in a work region; and a quantified information generation unit that generates quantified information quantitatively expressing a work status of the worker, based on time-series data of a production amount on a work line corresponding to the identified work content, and the working hour of the worker; and Matsunaga paragraph 79 noting position information of each worker may be obtained by analyzing an image captured by a camera that captures image of an area inside the work region; and Matsunaga paragraphs 101-104 noting the work monitoring camera able to capture an image of a worker in a work area of a work line; and paragraphs 86-87 noting that "a time for which the worker stays and a time at which the worker stays can be regarded as a working hour and a stay time of a work performed in the object region" and "time information represented by a working hour or working time. In other words, data indicating when and where each worker works is acquired" and Fig. 6 showing a chart with a time axis that plots the working time of each employee, effectively showing a schedule of their working times throughout the day); and take action based on the cycle time (Matsunaga paragraph 125 noting a result of the real-time processing can be used, for example, to check whether or not the worker is correctly performing a determined routine work, or to confirm that the worker can perform the work safely.)

It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Matsunaga because both disclosures relate to camera vision systems installed in manufacturing scenarios that involve conveyer belt work lines in order to improve efficiency and monitor the process of the work line. The teachings of Kjaer include the processor of the system making decisions and being configured to perform certain processing tasks based on the workload at different workstations, or the skills/requirements/needs of different workstations or human operators, and the teachings of Matsunaga include vision systems to monitor the workload and abilities of the human operators of such conveyor work lines, and as such, would benefit the teachings of Kjaer by providing information regarding the workers on the work lines. As such, modified to incorporate the teachings of Matsunaga, the teachings of Kjaer include all of the limitations presented in claim 14.

In regard to claim 16, Kjaer and Matsunaga teach all of the limitations of claim 14 as discussed above. In addition, Kjaer teaches wherein the condition comprises an actual trim composition of the product (Kjaer page 8, lines 11-13 noting activating the object divert for selected food items based on the processing indicator for the selected food items and the pre-defined threshold values or settings; and Kjaer page 7, lines 24-27 noting that for salmon fillets such quality characteristics may be based on determined trimming type, e.g. Trim A, Trim B, Trim C, and Trim D), and wherein the action comprises raising an alert upon determining that the actual trim composition varies from an expected trim composition of the product (Kjaer page 6, lines 13-14 noting identification may include use of a light or similar electronically controlled identification means to illuminate or otherwise signal that a food item is selected for further processing).
It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Matsunaga for the same reasons as discussed above in regard to claim 14.

In regard to claim 17, Kjaer and Matsunaga teach all of the limitations of claim 14 as discussed above. In addition, Kjaer teaches a third computer vision system to capture third image data from the product travelling on a second conveyor table (Kjaer page 7 noting a first conveyor belt and a second conveyor belt) in the food processing facility, and wherein the processor further executes computer-readable instructions (Kjaer page 11, lines 14-15 noting a food processing device comprising a conveyer system with a plurality of individual conveyer lines; and Kjaer page 13, lines 12-13 noting the imaging system 9 captures pictures of the incoming food items and converts the pictures to image data. The image data is transmitted to the processor 8) to: determine an actual trim composition of the product from the third image data based at least on an additional variation in texture and/or color in the third image data (Kjaer page 3, lines 17-37 and page 4, lines 1-15 noting the processor indicator can be defined based on different predetermined identifiers in the food item, such as foreign objects, minimum or maximum fat content, totally and/or locally, unwanted objects such as membrane, fat, bones, etc., gaping structure, shape, size, color, etc. The process indicator may increase in number when the difference between e.g. the desired size, colour, colour variation, or shape and the actual size, colour, colour variation or shape increases); and activate a diverter gate to sort the product into one of a plurality of combos based on the actual trim composition (Kjaer page 7, lines 1-2 noting an object diverter may be configured to separate the food items selected for further processing from the other food items; and Kjaer page 7, lines 18-27 noting the processor can be configured to appoint the selected food item to a selected work station for further processing, and this could be based on specific requirements related to the need for further processing, such as ability to remove certain fragments. For example, quality characteristics may be based on determined trimming type). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Matsunaga for the same reasons as discussed above in regard to claim 14.

In regard to claim 18, Kjaer and Matsunaga teach all of the limitations of claim 14 as discussed above. In addition, Matsunaga teaches determine a speed at which the first conveyor table is moving; determine a throughput of the first conveyor table based upon the first image data and the speed at which the first conveyor table is moving (Matsunaga paragraph 91 noting the production amount on the work line L may be represented by a moving speed of the product P on the work line L as illustrated on the lower side of FIG. 7. It can be evaluated that the higher the moving speed of the product P, the higher the productivity; and Matsunaga paragraph 109 noting performance feature amount data can be generated by digitizing information (for example, a production amount, a moving speed of a product, or the like); and Matsunaga paragraph 11 noting a production amount acquisition device that acquires a production amount on a work line in the work region as time-series data; and Matsunaga paragraph 51 noting examples of the work result information include the number of products (that is, a production amount) processed in the work lines L1 to L3, a quality of a processed product, and the like. Such work result information of the work lines L1 to L3 can be acquired by, for example, capturing an image of a product conveyed on the line with an image capturing device, and the like); and raise an alert upon determining that the throughput differs from an expected throughput (Matsunaga paragraphs 88-95 noting that work result information is collected, and includes data such as the number of products processed on the work line to determine the production amount of the work line. This work result information can be presented to operators, and used to evaluate work and performance of workers; and Matsunaga paragraph 131 noting that degree of matching of work content can be determined, and workers or managers can be notified of abnormal states when current work status is lower than predetermined values). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Matsunaga for the same reasons as discussed above in regard to claim 14.

In regard to claim 19, Kjaer and Matsunaga teach all of the limitations of claim 14 as discussed above.
In addition, Matsunaga teaches compare the actual cycle time of the worker (Matsunaga paragraph 10 noting identifying a work content and a working hour of a worker based on time-series data of position information of the worker at least in a work region; and generating quantified information quantitatively expressing a work status of the worker, based on time-series data of a production amount on a work line corresponding to the identified work content, and the working hour of the worker) with an expected cycle time; compare the actual cycle time of the worker with a historical cycle time of the worker (Matsunaga paragraph 93 noting the quantified information generation unit 115 may generate, as the quantified information, information in which a work content and working hour of a worker identified by the work identification unit 113 and a preset work schedule of the worker are associated with each other on the same time axis. By presenting such quantified information, the user can easily check whether or not the worker works according to the determined schedule); and raise an alert upon determining that the actual cycle time of the worker varies from the expected cycle time and the historical cycle time of the worker (Matsunaga paragraph 131 noting that degree of matching of work content can be determined, and workers or managers can be notified of abnormal states when current work status is lower than predetermined values). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Matsunaga for the same reasons as discussed above in regard to claim 14. In regard to claim 20, Kjaer and Matsunaga teach all of the limitations of claim 14 as discussed above. 
In addition, Matsunaga teaches receive location data associated with the worker (Matsunaga paragraph 10 noting identifying a work content and a working hour of a worker based on time-series data of position information of the worker at least in a work region); determine a current location of the worker relative to the first conveyor table based on the location data (Matsunaga paragraph 10 noting identifying a work content and a working hour of a worker based on time-series data of position information of the worker at least in a work region; and Matsunaga paragraph 50 noting as illustrated in FIG. 1, it is assumed that a plurality of work lines L1 to L3 are arranged in a work region S in a factory. Workers currently work on the work lines L1 to L3, respectively. Such work on the work lines L1 to L3 of the factory is often performed at fixed positions. Therefore, in the work evaluation system according to the present embodiment, a work content of a worker is identified based on a work position of the worker by acquiring position information of the worker in the work region S. Further, in a case where at least the position information of the worker in the work region S is acquired as time-series data, it is possible to identify where the worker currently stays in the work region S and to where the worker moved); determine that the current location of the worker varies from the expected location of the worker relative to the first conveyor table (Matsunaga paragraph 93 noting the quantified information generation unit 115 may generate, as the quantified information, information in which a work content and working hour of a worker identified by the work identification unit 113 and a preset work schedule of the worker are associated with each other on the same time axis. 
By presenting such quantified information, the user can easily check whether or not the worker works according to the determined schedule); and raise an alert upon determining that the current location of the worker varies from the expected location of the worker (Matsunaga paragraph 131 noting that degree of matching of work content can be determined, and workers or managers can be notified of abnormal states when current work status is lower than predetermined values). It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Matsunaga for the same reasons as discussed above in regard to claim 14.

Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Kjaer (WO 2020/161231 A1), hereinafter referred to as Kjaer, in view of Matsunaga (U.S. Publication No. 2021/0166180), hereinafter referred to as Matsunaga, and further in view of Schmidt et al. (U.S. Publication No. 2021/0121922), hereinafter referred to as Schmidt.

In regard to claim 15, Kjaer and Matsunaga teach all of the limitations of claim 14 as discussed above. In addition, Kjaer teaches wherein the condition comprises a foreign object embedded within the product, and wherein upon detecting the foreign object embedded within the product (Kjaer page 3, lines 17-37 and page 4, lines 1-15 noting the processor indicator can be defined based on different predetermined identifiers in the food item, such as foreign objects, minimum or maximum fat content, totally and/or locally, unwanted objects such as membrane, fat, bones, etc., gaping structure, shape, size, color, etc. The process indicator may increase in number when the difference between e.g. the desired size, colour, colour variation, or shape and the actual size, colour, colour variation or shape increases). However, Kjaer does not expressly disclose the action comprises stopping the first conveyor table.
In the same field of endeavor, Schmidt teaches the action comprises stopping the first conveyor table (Schmidt paragraph 33 noting to stop the forward motion of the product stream (e.g., conveyor belt) upon the detection of a foreign object in the stream containing the pieces of meat. The processing of the raw data and/or image data to stop the forward motion of the product stream can result from electronic communication between the camera, the computer, the forwarding device (e.g. conveyor) and the illumination device(s)).

It would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Schmidt, because both disclosures relate to food processing streams that include conveyor belt systems, and both include processes to determine specifics about a meat product, such as the product's composition or the presence of a foreign object within it, and both take actions based on those findings that determine how the product is subsequently handled and routed. Additionally, it would have been obvious, for a person having ordinary skill in the art before the effective filing date, to combine the teachings of Kjaer with the teachings of Matsunaga for the same reasons as discussed above in regard to claim 14. As such, modified to incorporate the teachings of Matsunaga and Schmidt, the teachings of Kjaer include all of the limitations presented in claim 15.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Youngs et al. – U.S. Patent No. 12,106,468
Vogeley, Jr. – U.S. Patent No. 5,324,228

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TYLER B EDWARDS whose telephone number is (571)272-2738. The examiner can normally be reached 9:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Sathyanarayanan Perungavoor, can be reached at (571)272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TYLER B EDWARDS/
Examiner, Art Unit 2488
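The detect-and-stop behavior at issue in claim 15 — a camera, computer, and conveyor in electronic communication, halting the product stream when a foreign object is detected, per the mapping to Schmidt — can be sketched as a small control loop. A minimal sketch only: the `Frame` and `Conveyor` names and the boolean detection flag are illustrative and are not drawn from either reference.

```python
from dataclasses import dataclass


@dataclass
class Frame:
    """Illustrative stand-in for image data captured over the conveyor."""
    has_foreign_object: bool


class Conveyor:
    """Hypothetical conveyor controller; stop() models halting the stream."""
    def __init__(self):
        self.running = True

    def stop(self):
        self.running = False


def inspect_and_control(frames, conveyor):
    """Halt the conveyor as soon as a frame shows an embedded foreign object."""
    for frame in frames:
        if frame.has_foreign_object:
            conveyor.stop()   # the claimed action: stopping the conveyor table
            return True       # foreign object found, stream halted
    return False              # stream ran to completion


belt = Conveyor()
found = inspect_and_control([Frame(False), Frame(True), Frame(False)], belt)
print(found, belt.running)  # prints: True False
```

The point of the sketch is only the causal chain the rejection relies on: detection by the imaging side directly triggers the stop action on the conveying side.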

Prosecution Timeline

Dec 15, 2023
Application Filed
Jul 25, 2025
Non-Final Rejection — §102, §103
Oct 30, 2025
Response Filed
Feb 06, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12581042
EVENT RECOGNITION SYSTEMS AND METHODS
2y 5m to grant Granted Mar 17, 2026
Patent 12561983
SENSOR PROCESSING METHOD, APPARATUS, COMPUTER PROGRAM PRODUCT, AND AUTOMOTIVE SENSOR SYSTEM
2y 5m to grant Granted Feb 24, 2026
Patent 12556689
INTRA PREDICTION METHOD AND DEVICE
2y 5m to grant Granted Feb 17, 2026
Patent 12552316
VEHICULAR MIRROR CONTROL SYSTEM
2y 5m to grant Granted Feb 17, 2026
Patent 12556693
INTRA PREDICTION METHOD AND DEVICE
2y 5m to grant Granted Feb 17, 2026
Based on the 5 most recent grants.


Prosecution Projections

2-3
Expected OA Rounds
77%
Grant Probability
91%
With Interview (+14.5%)
2y 5m
Median Time to Grant
Moderate
PTA Risk
Based on 468 resolved cases by this examiner. Grant probability derived from career allow rate.
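The "With Interview" projection appears to combine the career allow rate with the interview lift. A minimal sketch, assuming a simple additive model; the tool's actual formula is not disclosed, and the displayed 91% suggests the effective lift may be closer to the "+14%" figure shown elsewhere on the page.

```python
career_allow_rate = 0.77   # examiner's career allow rate
interview_lift = 0.145     # displayed interview lift (+14.5%)

# Assumed additive model, capped at 100%.
with_interview = min(career_allow_rate + interview_lift, 1.0)
print(f"with interview: {with_interview * 100:.1f}%")
```

Treat the exact arithmetic as an assumption; the useful takeaway is only that the interview-adjusted figure is derived from the base rate plus a lift, not measured independently.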
