Prosecution Insights
Last updated: April 19, 2026
Application No. 18/077,967

BUILDING A ROBOT MISSION BASED ON MISSION INVENTORY

Final Rejection: §101, §102, Double Patenting
Filed: Dec 08, 2022
Examiner: KARWAN, SIHAR A
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Yokogawa Electric Corporation
OA Round: 4 (Final)

Grant Probability: 56% (Moderate)
OA Rounds: 5-6
To Grant: 3y 3m
With Interview: 82%

Examiner Intelligence

Career Allow Rate: 56% (grants 215 of 385 resolved cases; +3.8% vs TC avg)
Interview Lift: +25.8% for resolved cases with interview (strong)
Avg Prosecution: 3y 3m (typical timeline)
Career History: 426 total applications across all art units; 41 currently pending

Statute-Specific Performance

§101: 11.2% (-28.8% vs TC avg)
§103: 27.8% (-12.2% vs TC avg)
§102: 33.4% (-6.6% vs TC avg)
§112: 16.4% (-23.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 385 resolved cases.
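The headline figures above follow from simple arithmetic on the raw counts shown. A quick sanity check (the 82% with-interview figure is taken from the dashboard, not derived here):

```python
# Sanity-check the dashboard's derived examiner statistics.
granted, resolved = 215, 385

# Career allowance rate: granted over resolved, shown rounded to 56%.
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # → 55.8%

# Interview lift for this application: with-interview grant probability
# (82%, from the dashboard) minus the 56% baseline, matching the
# "+26% interview lift" headline.
with_interview, baseline = 0.82, 0.56
print(f"Interview lift: {with_interview - baseline:+.1%}")  # → +26.0%
```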

Office Action

Grounds: §101, §102, nonstatutory double patenting
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

Claims 1-20 are pending. Claims 1-20 are rejected. Amendments to the claims have been recorded.

Response to Amendment

Applicant's arguments are directed to the amended claims and have been addressed in the rejections and remarks on the amendments.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

On January 7, 2019, the USPTO released new examination guidelines setting forth a two-step inquiry for determining whether a claim is directed to non-statutory subject matter. According to the guidelines, a claim is directed to non-statutory subject matter if:

STEP 1: the claim does not fall within one of the four statutory categories of invention (process, machine, manufacture or composition of matter), or

STEP 2: the claim recites a judicial exception, e.g. an abstract idea, without reciting additional elements that amount to significantly more than the judicial exception, as determined using the following analysis:

STEP 2A (PRONG 1): Does the claim recite an abstract idea, law of nature, or natural phenomenon?
STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application?
STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?

Claim 1 reads:

1. A method comprising: identifying one or more target tasks to be performed; identifying a recorded mission file based on the one or more target tasks, wherein identifying the recorded mission file comprises identifying one or more recorded tasks of the recorded mission file that at least partially achieve a target goal associated with the one or more target tasks; and, based on one or more commands, the one or more recorded tasks with one or more robot devices to at least partially achieve

Using the two-step inquiry, it is clear that claim 1 is directed toward non-statutory subject matter, as shown below:

STEP 2A (PRONG 1): Is the claim directed to a law of nature, a natural phenomenon or an abstract idea? Yes, claims 1-20 are directed to an abstract idea.

With regard to STEP 2A (PRONG 1), the guidelines provide three groupings of subject matter that are considered abstract ideas:

Mathematical concepts – mathematical relationships, mathematical formulas or equations, mathematical calculations;

Certain methods of organizing human activity – fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions); and

Mental processes – concepts that are practicably performed in the human mind (including an observation, evaluation, judgment, opinion, calculating, determining).

The method of claim 1 is a mental process that can be practicably performed in the human mind and is, therefore, an abstract idea. The abstract ideas are:

identifying one or more target tasks to be performed;

identifying a recorded mission file based on the one or more target tasks, wherein identifying the recorded mission file comprises identifying one or more recorded tasks of the recorded mission file that at least partially achieve a target goal associated with the one or more target tasks;

Analyzing the abstract idea against the given examples:

identifying one or more target tasks to be performed (thinking of a task to be performed);

identifying a recorded mission file based on the one or more target tasks (thinking of whether the task has been performed in the past);

wherein identifying the recorded mission file comprises identifying one or more recorded tasks of the recorded mission file that at least partially achieve a target goal associated with the one or more target tasks (thinking of whether the task has been performed in the past and whether the task has been partially achieved).

STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application? No, the claim does not recite additional elements that integrate the judicial exception into a practical application.
With regard to STEP 2A (PRONG 2), whether the claim recites additional elements that integrate the judicial exception into a practical application, the guidelines provide the following exemplary considerations that are indicative that an additional element (or combination of elements) may have integrated the judicial exception into a practical application:

an additional element reflects an improvement in the functioning of a computer, or an improvement to other technology or technical field;

an additional element applies or uses a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition;

an additional element implements a judicial exception with, or uses a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim;

an additional element effects a transformation or reduction of a particular article to a different state or thing; and

an additional element applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.

While the guidelines further state that the exemplary considerations are not an exhaustive list and that there may be other examples of integrating the exception into a practical application, the guidelines also list examples in which a judicial exception has not been integrated into a practical application:

an additional element merely recites the words "apply it" (or an equivalent) with the judicial exception, or merely includes instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea;

an additional element adds insignificant extra-solution activity to the judicial exception [receiving data, data gathering, data output], further addressed under WURC below; and

an additional element does no more than generally link the use of a judicial exception to a particular technological environment or field of use.

Claim 1 does not recite any of the exemplary considerations that are indicative of an abstract idea having been integrated into a practical application. While the claim does recite that the method is for ", based on one or more commands, the one or more recorded tasks with one or more robot devices to at least partially achieve", these limitations are "apply it"-level limitations such as data outputting/inputting, i.e., transmitting, receiving, storing and recording.

STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No, the claim does not recite additional elements that amount to significantly more than the judicial exception.

With regard to STEP 2B, whether the claims recite additional elements that provide significantly more than the recited judicial exception, the guidelines specify that the pre-guideline procedure is still in effect.
Specifically, examiners should continue to consider whether an additional element or combination of elements:

adds a specific limitation or combination of limitations that are not well-understood, routine, conventional activity in the field, which is indicative that an inventive concept may be present; or

simply appends well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, which is indicative that an inventive concept may not be present.

Claim 1 does not recite any specific limitation or combination of limitations that are not well-understood, routine, conventional (WURC) activity in the field. Claim 1 further recites the WURC extra step of "implementing" [this limitation is insignificant extra-solution activity amounting to "apply it"-level data outputting].

Analyzing the WURC steps of the abstract idea against the given examples, we can understand that the abstract idea falls within WURC activity, MPEP 2106.05(d)(1). Evaluation: improvement consideration, MPEP 2106.05(a); WURC consideration, MPEP 2106.05(d); mere instructions to apply an exception consideration, MPEP 2106.05(f); insignificant extra-solution activity consideration, MPEP 2106.05(g): a generic computer performing merely generic computer functions, data gathering, populating tables, sending and receiving data, or performing functions "known" in the art.

CONCLUSION

Thus, since claim 1: (a) is directed toward an abstract idea, (b) does not recite additional elements that integrate the judicial exception into a practical application, and (c) does not recite additional elements that amount to significantly more than the judicial exception, it is clear that claim 1 is directed towards non-statutory subject matter.

The amendments to the claims are addressed as follows.

Claim 1:

"by one or more robot devices": "apply it" level; to robot devices such as memory.

"wherein the one or more recorded tasks and the one or more target tasks are robot-agnostic tasks that are absent an indication of a robot type": abstract idea, thinking about what is missing.

"determining a confidence value that the one or more robot devices can successfully perform the one or more target tasks by comparing the one or more recorded tasks to the one or more target tasks": abstract idea, thinking about whether the device can hold memory.

"comparing the confidence value to a threshold confidence value": abstract idea, thinking about whether there is enough memory.

"controlling the one or more robot devices to perform the one or more recorded tasks to at least partially achieve the target goal": "apply it" level, storing memory, data gathering, data transmitting, data receiving.

"when the confidence value meets or exceeds the threshold confidence value": abstract idea, thinking about whether we have enough memory.

Claim 2: thinking of a task to be performed; "implementing" is "apply it" level, i.e., data outputting. Controlling the one or more robot devices to perform both: "apply it" level, data gathering.

Claim 3: thinking about breaking the task into smaller tasks.

Claim 4: breaking the task into smaller tasks and matching them with previous tasks, i.e., mapping.

Claim 5: remembering segments relating to previous tasks.

Claim 6: remembering previous tasks and overlapping the previous tasks with current tasks.

Claim 7: remembering what task was asked of me by someone else. Receiving, at a user interface: data gathering.

Claim 8: remembering what task was asked of me by someone else. Data gathering, data transmitting, "apply it" level; abstract idea, ranking; data transmitting, "apply it"-level button, data gathering, receiving files.

Claim 9: automatically thinking about how to perform a task.

Claim 10: remembering a previous task from memory. Also "apply it" level of data output. Abstract idea, cost metric.

Claim 11: "apply it" level of data output.

Claim 12: the thoughts are consistent with the mission.

Claim 14: thinking about physically inspecting something.

Claim 16: not thinking of a robot type.

Claims 17-20: mapped and rejected the same as above, based on the §102 mapping, with the addition of "apply it"-level insignificant data outputting.

Claim 21: same as claim 8; data gathering, "apply it" level, data receiving (i.e., files); abstract idea, cost metrics.

Claim 22: same as claim 10.

Claim 23: same as claim 21.

Nonstatutory Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory obviousness-type double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the conflicting application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. Effective January 1, 1994, a registered attorney or agent of record may sign a terminal disclaimer.
A terminal disclaimer signed by the assignee must fully comply with 37 CFR 3.73(b).

The claims of this instant application are rejected on the ground of nonstatutory obviousness-type double patenting as being unpatentable over the claims of the copending application. Although the conflicting claims are not identical, they are not patentably distinct from each other. This is a provisional obviousness-type double patenting rejection because the conflicting claims have not in fact been patented. The double patenting rejections will not be revisited and will be held in abeyance until allowable subject matter is found.

Instant Application claims (compared with Chen US 20220300011):

1. A method comprising: identifying one or more target tasks to be performed; identifying a recorded mission file based on the one or more target tasks, wherein identifying the recorded mission file comprises identifying one or more recorded tasks of the recorded mission file that at least partially achieve a target goal associated with the one or more target tasks; and implementing, based on one or more commands, the one or more recorded tasks with one or more robot devices to at least partially achieve the target goal.

2. The method of claim 1, further comprising: identifying at least one second recorded mission file based on the one or more target tasks, wherein the at least one second recorded mission file comprises one or more second recorded tasks that at least partially achieve the target goal; and providing the one or more commands to the one or more robot devices in association with implementing the one or more recorded tasks and the one or more second recorded tasks, wherein implementing the one or more recorded tasks and the one or more second recorded tasks fully achieves the target goal.

3.
The method of claim 1, further comprising: segmenting one or more recorded mission files of a set of recorded mission files; mapping one or more segments of the one or more recorded mission files to one or more recorded goals, one or more recorded tasks, or both; and storing the mapping to a data record of the one or more recorded mission files. 4. (Original) The method of claim 1, further comprising: segmenting the recorded mission file in response to identifying the one or more target tasks; and identifying a mapping between one or more segments of the recorded mission file and at least one of: the target goal; and the one or more target tasks, wherein identifying the one or more recorded tasks that at least partially achieve the target goal is based on a result of the mapping. 5. (Original) The method of claim 4, wherein: each of the segments comprises at least one recorded task associated with the recorded mission file. 6. (Currently Amended) The method of claim 4, further comprising: segmenting a second recorded mission file in response to identifying that the one or more recorded tasks does not fully achieve the target goal; identifying a second mapping between one or more second segments of the second recorded mission file and at least one of: the target goal; and the one or more target tasks; and identifying one or more second recorded tasks of the second recorded mission file that at least partially achieve the target goal, based on a result of the second mapping, wherein the one or more recorded tasks and the one or more second recorded tasks: fully achieve the target goal; and completely overlap the one or more target tasks. 7. (Original) The method of claim 1, further comprising: receiving a user input indicating the one or more target tasks to be performed. 8. 
(Original) The method of claim 1, further comprising: receiving a user input comprising an indication of the target goal, wherein identifying the one or more target tasks is based on the indication of the target goal. 9. (Original) The method of claim 1, further comprising: automatically providing the one or more commands based on a confidence value associated with a comparison between the one or more recorded tasks and the one or more target tasks. 10. (Original) The method of claim 1, further comprising: providing, from a data repository, a set of recorded mission files associated with implementing the one or more recorded tasks, wherein the set of recorded mission files comprises the recorded mission file; and receiving a user input indicating the recorded mission file; wherein providing the one or more commands to the one or more robot devices is in response to the user input. 11. (Original) The method of claim 1, further comprising: providing the one or more commands the one or more robot devices in association with implementing one or more second recorded tasks associated with a second recorded mission file, wherein implementing the one or more recorded tasks and the one or more second recorded tasks fully achieves the target goal. 12. (Original) The method of claim 1, wherein the recorded mission file is associated with a recorded robot mission. 13. (Original) The method of claim 1, wherein the one or more target tasks comprise one or more robot-agnostic tasks. 14. (Original) The method of claim 1, wherein the one or more target tasks comprise one or more process automation inspection tasks. 15. (Original) The method of claim 1, wherein the one or more target tasks are absent an indication of robot type. 16. (Original) The method of claim 1, wherein the target goal is absent an indication of robot type. 17. 
(Currently Amended) A robot management system, comprising: an interface to provide commands to a plurality of robot devices of one or more robot types; and a data repository comprising a data record of mission files associated with a set of recorded robot missions, wherein the robot management system is configured to manage one or more robot missions and the plurality of robot devices by: identifying one or more target tasks to be performed; identifying a recorded mission file based on the one or more target tasks, wherein identifying the recorded mission file comprises identifying one or more recorded tasks of the recorded mission file that at least partially achieve a target goal associated with the one or more target tasks; and implementing, based on one or more commands, the one or more recorded tasks with one or more robot devices to at least partially achieve the target goal. 18. (Currently Amended) The system of claim 17, wherein the robot management system is further configured to: identify at least one second recorded mission file based on the one or more target tasks, wherein the at least one second recorded mission file comprises one or more second recorded tasks that at least partially achieve the target goal; and provide the one or more commands to the one or more robot devices in association with implementing the one or more recorded tasks and the one or more second recorded tasks, wherein implementing the one or more recorded tasks and the one or more second recorded tasks fully achieves the target goal. 19. The system of claim 17, wherein the robot management system is further configured to: segment one or more recorded mission files of a set of recorded mission files; map one or more segments of the one or more recorded mission files to one or more recorded goals, one or more recorded tasks, or both; and storing the mapping to a data record of the one or more recorded mission files. 20. 
A robot system platform comprising: an interface to provide commands to a plurality of robot devices of one or more robot types, wherein the plurality of robot devices are independent of the robot system platform; a data repository comprising a data record of mission files associated with a set of recorded robot missions; and one or more circuits to: identify one or more target tasks to be performed with respect to an environment, wherein identifying the one or more target tasks is in response to electronically receiving an indication of the one or more target tasks, an indication of a target goal, or both; identify one or more recorded mission files from the data record of mission files based on the one or more target tasks, wherein the one or more recorded mission files comprise one or more recorded tasks that at least partially achieve the target goal; and implement the one or more recorded tasks with one or more robot devices.

Chen US 20220300011 claims:

21. A computer-implemented method of managing a fleet of robots, the method comprising: receiving one or more missions, each mission from the one or more missions including a mission profile; determining robots associated with a system; determining at least one capability of each robot associated with the system; assigning the one or more missions to one or more of the robots associated with the system based on the at least one capability; and generating a mission schedule based on the assigned one or more missions to the one or more robots.

22.
The computer-implemented method of claim 21, wherein the at least one capability is stored in a memory of the robot, wherein determining the at least one capability of the robot includes transmitting the at least one capability from the robot to a memory or a processor of the system, and wherein the at least one capability includes an ability of the robot to perform a task, an availability of the robot, an amount of battery charge of the robot, one or more tools associated with the robot, or one or more materials of the robot. 23. The computer-implemented method of claim 21, wherein each mission profile includes at least one factor related to information for performance of the respective mission, and wherein a robot from the one or more robots is assigned, by a processor of the system, to a mission from the one or more missions if the at least one capability of the robot is capable of satisfying at least a portion of the at least one factor such that the robot can perform at least a portion of the respective mission. 24. The computer-implemented method of claim 21, further comprising: determining, by a processor of the system, at least one environmental data of a facility, wherein assigning the one or more missions to the one or more robots is also based on the at least one environmental data, wherein determining the at least one environmental data includes the processor of the system receiving data from at least one or more sensors or from a third party connected to the system via a wired or a wireless connection. 25. The computer-implemented method of claim 24, wherein the at least one environmental data includes at least one of a pressure, a temperature, a humidity, a weather condition, a noise level, a noise type, a vibration, a presence of a substance, or a level of a substance. 26. 
The computer-implemented method of claim 21, wherein the one or more missions include a plurality of missions capable of being assigned to a plurality of robots from the one or more robots, wherein each mission profile includes one or more priorities each having a desired level of importance to the mission, and wherein the assigning the one or more missions further includes assigning the one or more missions based on the one or more priorities. 27. The computer-implemented method of claim 26, wherein the one or more priorities includes one or more of a degree of quality of performance of each mission in a mission profile, a cost of each mission in the mission profile, and an amount of energy to be used in each mission of the mission profile, wherein the one or more priorities are stored in a memory of the system and are transmitted to a processor of the system to determine a prioritized mission. 28. The computer-implemented method of claim 26, wherein the method further comprises: determining, by a processor of the system, a plurality of mission schedules, wherein the plurality of missions are assigned by the processor of the system to different robots of the plurality of robots in each of the plurality of mission schedules; determining, by the processor of the system, a prioritized mission schedule from the plurality of mission schedules, based on the one or more priorities; and initiating, by the processor of the system or by a processor of each of the plurality of robots, the plurality of missions of the prioritized mission schedule. 29. 
The computer-implemented method of claim 21, further comprising: receiving, by one or more of a processor of the system or a processor of a robot, a feedback from a robot of the one or more robots, wherein the feedback is transmitted via a wired or a wireless connection; and modifying, by the processor of the system, the mission profile of a first mission from the one or more missions based on the feedback, wherein the feedback is received by a processor of a first robot scheduled to perform the first mission, and wherein modifying the mission profile prevents the first robot from completing the first mission. 30. A computer-implemented method of managing a fleet of robots, the method comprising: receiving a first mission; assigning the first mission to a first robot; determining a mission schedule based on the assigning of the first mission to the first robot; instructing the first robot to perform the first mission, based on the mission schedule; receiving feedback from the first robot; and assigning a second mission to a second robot, different from the first robot, based on the feedback. 31. The computer-implemented method of claim 30, wherein the first mission is defined by a first mission profile stored in one or more of a memory of the first robot or a memory of the second robot, and wherein the step of assigning further comprises: updating, by a processor of a system or a processor of the first robot, data in the first mission profile based on the feedback; and generating, by the processor of the system or the processor of the first robot, a second mission profile based on the updated first mission profile; wherein the second mission is defined by the second mission profile, wherein the second robot is capable of performing at least a portion of the second mission, and wherein the first robot is incapable of performing at least a portion of the second mission. 32. 
The computer-implemented method of claim 31, further comprising: comparing, by the processor of the system or a processor of the second robot, one or more capabilities of the second robot to the second mission profile, wherein the second robot is capable of performing at least the portion of the second mission based on the one or more capabilities of the second robot matching factors for completing the portion of the second mission. 33. The computer-implemented method of claim 31, wherein the feedback indicates a change to at least one of a factor for performing the first mission or an environmental data in an environment in which the fleet of robots is configured to operate, and wherein generating the second mission profile includes changing the at least one factor for performing the first mission based on the feedback. 34. The computer-implemented method of claim 33, wherein the environmental data includes one or more of pressure, temperature, a type of gas, presence or absence of stairs, humidity, a weather condition, a noise level, a noise type, a vibration, or a substance. 35. The computer-implemented method of claim 30, wherein a third mission, different from the first mission and the second mission, is assigned, by a processor of a system, to the first robot or the second robot, and wherein the mission schedule is updated, by the processor of the system, based on assigning the second mission and the third mission. 36. The computer-implemented method of claim 30, further comprising: determining, by a processor of a system, if the second robot is capable of completing the second mission; and assigning the second mission to at least a third robot, different from the first robot and the second robot, if capabilities of the second robot prevent the second robot from completing the second mission. 37. 
A computer-implemented method of managing a fleet of robots, the method comprising: receiving one or more missions, each mission including a mission profile, wherein the mission profile includes environmental data from one or more sensors; for each of the one or more missions, determining one or more robots capable of completing the one or more missions based on the mission profile included in the mission; scheduling, as a primary schedule, the one or more missions to be performed by the one or more robots; receiving feedback indicating a change in the environmental data, the feedback obtained by a robot from the one or more robots performing one or more of the scheduled missions; and scheduling, as a secondary schedule, the one or more missions to the one or more robots based on the feedback. 38. The computer-implemented method of claim 37, the method further comprising: determining, by a processor of a system, a robot capability of a first robot; determining, by the processor of the system, if the robot capability of the first robot satisfies features in a mission profile defining a first mission; wherein the robot capability of the first robot of the one or more robots scheduled to perform the first mission of the one or more missions is unsuitable for operating in an environment defined by the change in the environmental data, and the method further comprises assigning, by the processor of the system, the first mission to a second robot. 39. The computer-implemented method of claim 37, wherein a processor of a first robot obtains the feedback, and the feedback is obtained separate from data gathered based on a mission assigned to the first robot. 40. 
The computer-implemented method of claim 37, wherein the scheduling the one or more missions of the primary schedule includes: assigning, by a processor of a system, a first mission from the one or more missions to a first robot, wherein the scheduling of the secondary schedule based on the feedback further includes: determining, by the processor of the system, a second robot capable of performing a portion of the first mission; and assigning the second robot to the first mission, wherein the second robot is configured to assist the first robot to accomplish the mission based on the change in environmental data. Claims 1-21 of Application 18/077,964 are also applied against claims 1-23 of the instant application. A notice of allowance for the co-pending application has been mailed; however, the claims have not been patented because the issue fee has not been paid and a Request for Continued Examination has been filed. A patentee or applicant may disclaim or dedicate to the public the entire term, or any terminal part of the term, of a patent. 35 U.S.C. 253. The statute does not provide for a terminal disclaimer of only a specified claim or claims. The terminal disclaimer must operate with respect to all claims in the patent. MPEP 804.02. Claim Rejections - 35 USC § 102 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention. Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Buckland US 20230025714.
Based on the current amendments, claim 1 has become an option not selected. Claim 1 is rejected under MPEP 2111.04(II). Claims 1-16 depend on the conditional claim 1 and are also rejected. 1. A method comprising: identifying one or more target tasks to be performed by one or more robot devices; 38; this robot 100 is fully autonomous [i.e. can Identify] and can perform many tasks around the orchard. identifying a recorded mission file based on the one or more target tasks, 94; Task Manager 503 queries a local database for new operation requests; these operations can be generated by human operators, through a graphical user interface. wherein identifying the recorded mission file comprises identifying one or more recorded tasks of the recorded mission file that at least partially achieve a target goal associated with the one or more target tasks; and 83; The Task Manager 503 is in charge of breaking down large-scale Operations 507, which it receives in queries to the Almanacs wherein the one or more recorded tasks and the one or more target tasks are robot-agnostic tasks that are absent an indication of a robot type; 94; Task Manager 503 [robot-agnostic] queries a local database for new operation requests; these operations can be generated by human operators, through a graphical user interface. determining a confidence value that the one or more robot devices can successfully perform the one or more target tasks by comparing the one or more recorded tasks to the one or more target tasks; comparing the confidence value to a threshold confidence value; and 107; the swarm algorithm may utilize a more complete Almanac dataset and understanding [swarm confidence level]. Routine operations, such as daily orchard monitoring [automatically based on monitored triggers i.e. threshold values] and fertilization, are launched autonomously by the Almanac.
controlling the one or more robot devices to perform the one or more recorded tasks to at least partially achieve the target goal. 83; into single-robot Tasks, which is referred to as Task Division 503B. The Task Manager 503 then distributes [controls to perform] those Tasks to individual robots 506 (robots 506A, 506B, . . . , 506N are referred to collectively as robots 506) within a swarm, which is referred to as Task Allocation 503C. when the confidence value meets or exceeds the threshold confidence value. Regarding: “when the confidence value meets or exceeds the threshold confidence value.” Examiner takes the conditional option of “the confidence value ‘does not’ meet or exceed the threshold confidence value”. As such, claim 1 becomes an option not selected. Claim 1 is rejected under MPEP 2111.04(II). Claims 1-16 depend on the conditional claim 1 and are also rejected. MPEP 2111.04(II): Regarding method claim 1 and the limitation “when the confidence value meets or exceeds the threshold confidence value”, Examiner notes that in a method claim, the step of “controlling the one or more robot devices to perform the one or more recorded tasks to at least partially achieve the target goal” is an optional step that relies on the condition of “when the confidence value meets or exceeds the threshold confidence value”. A condition of “the confidence value ‘does not’ meet or exceed the threshold confidence value” can be selected as an option, making the limitations of claim 1 optional, as the step of “controlling the one or more robot devices to perform the one or more recorded tasks to at least partially achieve the target goal” would never need to be performed. This analysis follows MPEP 2111.04(II) and Ex parte Schulhauser. As such, Examiner will not be considering the non-selected option. 2.
The method of claim 1, further comprising: identifying at least one second recorded mission file based on the one or more target tasks, wherein the at least one second recorded mission file comprises one or more second recorded tasks that at least partially achieve the target goal; and 83; The Task Manager 503 is in charge of breaking [first, second …nth mission] down large-scale Operations […nth tasks] 507, providing the one or more commands to the one or more robot devices in association with implementing the one or more recorded tasks and the one or more second recorded tasks, wherein implementing the one or more recorded tasks and the one or more second recorded tasks fully achieves the target goal. 83; into single-robot Tasks, which is referred to as Task Division 503B. The Task Manager 503 then distributes those Tasks to individual robots 506 (robots 506A, 506B, . . . , 506N are referred to collectively as robots 506) within a swarm, which is referred to as Task Allocation 503C. 3. The method of claim 1, further comprising: segmenting one or more recorded mission files of a set of recorded mission files; 83; The Task Manager 503 is in charge of breaking down large-scale Operations 507, mapping one or more segments of the one or more recorded mission files to one or more recorded goals, one or more recorded tasks, or both; and 83; which it receives in queries to the Almanacs (i.e., global almanac 501 and local almanac 502), into single-robot Tasks [mapped to], storing the mapping to a data record of the one or more recorded mission files. 83; Manager 503 then distributes [to distribute is to store into each individual robot] those Tasks to individual robots 506 4.
The method of claim 1, further comprising: segmenting the recorded mission file in response to identifying the one or more target tasks; and 83; Manager 503 then distributes those Tasks to individual robots 506 identifying a mapping between one or more segments of the recorded mission file and at least one of: 83; The Task Manager 503 is in charge of breaking down large-scale Operations 507, which it receives in queries to the Almanacs (i.e., global almanac 501 and local almanac 502), into single-robot Tasks [mapped to each robot and mission files], the target goal; and 83; The Task Manager 503 is in charge of breaking down large-scale Operations [target goal] 507, the one or more target tasks, wherein identifying the one or more recorded tasks that at least partially achieve the target goal is based on a result of the mapping. 83; which it receives in queries to the Almanacs (i.e., global almanac 501 and local almanac 502), into single-robot Tasks [target tasks mapped to robots]. The Task Manager 503 is in charge of breaking down [partially achieve the target goals based on mapping to each robot] 5. The method of claim 4, wherein: each of the segments comprises at least one recorded task associated with the recorded mission file. 83; distributes those Tasks to individual robots 6. The method of claim 4, further comprising: segmenting a second recorded mission file in response to identifying that the one or more recorded tasks does not fully achieve the target goal; 83; distributes [first, second …nth recorded missions] those Tasks to individual robots. 
identifying a second mapping between one or more second segments of the second recorded mission file and at least one of: the target goal; and 83; Tasks [to complete a task is a target goal] the one or more target tasks; and 83; Tasks [plurality of tasks] identifying one or more second recorded tasks of the second recorded mission file that at least partially achieve the target goal, based on a result of the second mapping, wherein the one or more recorded tasks and the one or more second recorded tasks: 83; distributes [first, second …nth recorded missions] those Tasks [mapped] to individual robots. fully achieve the target goal; and 86; complete tasks completely overlap the one or more target tasks. 86; as robots [robot tasks are overlapped] 506 complete tasks, 7. The method of claim 1, further comprising: receiving a user input indicating the one or more target tasks to be performed. 94; operations can be generated by human operators, through a graphical user interface. 8. The method of claim 1, further comprising: receiving a user input comprising an indication of the target goal, wherein identifying the one or more target tasks is based on the indication of the target goal. 94; the Task Manager 503 queries a local database for new operation requests. these operations can be generated by human operators, through a graphical user interface. 9. The method of claim 1, further comprising: automatically providing the one or more commands based on a confidence value associated with a comparison between the one or more recorded tasks and the one or more target tasks. 107; the swarm algorithm may utilize a more complete Almanac dataset and understanding. Routine operations, such as daily orchard monitoring [automatically based on monitored triggers] and fertilization, are launched autonomously by the Almanac. 10. 
The method of claim 1, further comprising: providing, from a data repository, a set of recorded mission files associated with implementing the one or more recorded tasks, wherein the set of recorded mission files comprises the recorded mission file; and 107; the swarm algorithm may utilize a more complete Almanac dataset and understanding. Routine operations, such as daily orchard monitoring and fertilization, are launched autonomously by the Almanac. receiving a user input indicating the recorded mission file; wherein providing the one or more commands to the one or more robot devices is in response to the user input. 107; More complex operations, such as pruning, thinning, and harvesting, are launched by humans, but guided by insights from the Almanac. Farmers or farm managers can request operations and monitor the results through a web-based graphical user interface. 11. The method of claim 1, further comprising: providing the one or more commands to the one or more robot devices in association with implementing one or more second recorded tasks associated with a second recorded mission file, wherein implementing the one or more recorded tasks and the one or more second recorded tasks fully achieves the target goal. 107; More complex operations, such as pruning, thinning, and harvesting, [1st, 2nd, and 3rd recorded missions] are launched by humans, but guided by insights from the Almanac. Farmers or farm managers can request operations and monitor the results through a web-based graphical user interface. Also 203; second task. 12. The method of claim 1, wherein the recorded mission file is associated with a recorded robot mission. 83; Task Division 503B. The Task Manager 503 then distributes [the recorded robot mission to] those Tasks to individual robots. 13. The method of claim 1, wherein the one or more target tasks comprise one or more robot-agnostic tasks. 263; re-allocating it to the most appropriate robot 506, which may or may not be the same robot 506.
[i.e. agnostic; not tied to a specific robot] 14. The method of claim 1, wherein the one or more target tasks comprise one or more process automation inspection tasks. 284; to actuate [automation] the robot's; 3D camera data is used to visually inspect the branch and further ensure that it has been removed from the tree. 15. The method of claim 1, wherein the one or more target tasks are absent an indication of robot type. 263; re-allocating it to the most appropriate robot 506, which may or may not be the same robot 506. 16. The method of claim 1, wherein the target goal is absent an indication of robot type. 242; preference for some robot types over others. 17. is rejected using the same rejections as made to claim 12, i.e., claims 12 and 1. 18. is rejected using the same rejections as made to claim 2. 19. is rejected using the same rejections as made to claim 3. 20. is rejected using the same rejections as made to claim 17. Conclusion THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to SIHAR A KARWAN, whose telephone number is (571) 272-2747. The examiner can normally be reached M-F, 11am-7pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado can be reached on 571-270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /SIHAR A KARWAN/Examiner, Art Unit 3664

Prosecution Timeline

Dec 08, 2022
Application Filed
Jan 14, 2025
Non-Final Rejection — §101, §102, §DP
Apr 21, 2025
Response Filed
Jul 16, 2025
Final Rejection — §101, §102, §DP
Oct 14, 2025
Request for Continued Examination
Oct 22, 2025
Response after Non-Final Action
Oct 28, 2025
Non-Final Rejection — §101, §102, §DP
Jan 30, 2026
Response Filed
Feb 24, 2026
Final Rejection — §101, §102, §DP
Mar 16, 2026
Examiner Interview Summary
Mar 16, 2026
Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589502
CARGO-HANDLING APPARATUS, CONTROL DEVICE, CONTROL METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 31, 2026
Patent 12589750
VEHICULAR CONTROL SYSTEM
2y 5m to grant Granted Mar 31, 2026
Patent 12589504
SYSTEM AND METHOD FOR COGNITIVE SURVEILLANCE ROBOT FOR SECURING INDOOR SPACES
2y 5m to grant Granted Mar 31, 2026
Patent 12583100
ROBOT TO WHICH DIRECT TEACHING IS APPLIED
2y 5m to grant Granted Mar 24, 2026
Patent 12576516
HUMAN SKILL BASED PATH GENERATION
2y 5m to grant Granted Mar 17, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
56%
Grant Probability
82%
With Interview (+25.8%)
3y 3m
Median Time to Grant
High
PTA Risk
Based on 385 resolved cases by this examiner. Grant probability derived from career allow rate.
