Prosecution Insights
Last updated: April 19, 2026
Application No. 18/336,650

Agricultural Operation Mapping

Final Rejection: §101, §103
Filed: Jun 16, 2023
Examiner: GLADE, ZACHARY EDWARD FREW
Art Unit: 3664
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Agco International GmbH
OA Round: 2 (Final)
Grant Probability: 64% (Moderate)
OA Rounds: 3-4
To Grant: 2y 8m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 64% (14 granted / 22 resolved; +11.6% vs TC avg)
Interview Lift: +61.5% for resolved cases with interview (strong)
Avg Prosecution: 2y 8m (typical timeline)
Total Applications: 61 across all art units (39 currently pending)
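The headline figures above are simple ratios over the examiner's resolved cases. As a sanity check, the 64% career allow rate follows directly from the grant and resolution counts; the sketch below reproduces it (the rounding convention, and treating "+11.6% vs TC avg" as a percentage-point difference, are assumptions of this sketch, not stated by the report).

```python
# Career allowance-rate arithmetic behind the dashboard figures above.
# Assumptions: percentages are rounded to whole points, and the
# "vs TC avg" delta is a percentage-point (not relative) difference.

granted = 14
resolved = 22

allow_rate = granted / resolved            # 0.6363... fraction of resolved cases granted
display_pct = round(allow_rate * 100)      # rendered as 64%

tc_delta_pts = 11.6                        # "+11.6% vs TC avg" from the card
implied_tc_avg = display_pct - tc_delta_pts  # implied Tech Center average, ~52.4%

print(f"Career allow rate: {display_pct}% (implied TC avg ~ {implied_tc_avg:.1f}%)")
```

Under those assumptions, 14/22 rounds to the displayed 64%, and the Tech Center's average allowance rate implied by the delta is about 52.4%.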

Statute-Specific Performance

§101: 13.5% (-26.5% vs TC avg)
§103: 48.7% (+8.7% vs TC avg)
§102: 12.7% (-27.3% vs TC avg)
§112: 21.0% (-19.0% vs TC avg)
Tech Center average values are estimates • Based on career data from 22 resolved cases
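Each "vs TC avg" delta in the statute figures above is the displayed rate minus a common baseline: all four deltas are consistent with a flat 40% Tech Center average. That baseline is inferred from the displayed numbers, not stated by the report; the sketch below just reproduces the arithmetic.

```python
# Statute-specific rates from the section above, with each "vs TC avg"
# delta recomputed as a percentage-point difference from a flat baseline.
# The 40.0% Tech Center average is inferred: every displayed delta
# equals the statute rate minus 40.0.

TC_AVG = 40.0  # inferred common baseline, in percent

rates = {"101": 13.5, "103": 48.7, "102": 12.7, "112": 21.0}

for statute, rate in rates.items():
    delta = rate - TC_AVG
    print(f"§{statute}: {rate}% ({delta:+.1f}% vs TC avg)")
```

Running this reproduces the four deltas shown above (-26.5, +8.7, -27.3, -19.0), which is what makes the single 40% baseline a reasonable reading of the chart's reference line.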

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This action is in reply to the amendments and remarks filed on 6/25/2025. Claims 1, 2, 5, 7, 9-12, 14, and 15 have been amended. Claims 16 and 17 have been added. Claim 3 has been cancelled. Claims 1-2 and 4-17 are currently pending and have been examined.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statement(s) (IDS(s)) submitted on 10/05/2023 has been received and considered.

Response to Amendment

Applicant’s amendments to the Specification, Drawings, and Claims have overcome each and every objection and 112(d) rejection previously set forth in the Non-Final Office Action mailed 3/26/2025.

Response to Arguments

Applicant’s arguments, see pages 10-13, filed 6/25/2025, with respect to the rejection(s) of claim(s) 1, 14, and 15 under 35 USC 101 have been fully considered and are not persuasive. Regarding Step 2A Prong 1, the analysis and identification of agricultural vehicles within the working environment can practically be performed in the human mind; therefore, the invention is categorized as a mental process, and the additional elements enable the performance of that process. Regarding Step 2A Prong 2, the imaging sensors, agricultural machine, controllers, and detection model are described as generic components and recited at a high level of generality, the use of the controllers as generic components amounts to “apply it,” and the receipt of image data and logging of operational data amounts to insignificant extra-solution activity. For these reasons, these elements fail to integrate the mental process into a practical application, and there is no tangible control step beyond the mental process to constitute a practical application.
The analogy to Thales Visionix Inc. v. United States, 850 F.3d 1343, 1345 (Fed. Cir. 2017) is not persuasive because the generic claim elements as described are used in a conventional manner. Therefore, the rejection of claims 1, 14, 15 and their dependent claims stands, as updated to the amended claims.

Applicant’s arguments, see pages 13-16, filed 6/25/2025, with respect to the rejection(s) of claim(s) 1, 14, 15, and 9 under 35 USC 102 and 35 USC 103 have been fully considered and are persuasive regarding “an indication of an operational task being performed.” In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., indication of a specific harvesting operation, a tillage operation, a seeding operation, or a fertilizing operation) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).

Regarding the arguments that Eichhorn (US 20220026226) does not teach “log operational information associated with one or more identified working machines…the operational information compris[ing] an indication of an operational task being performed by the one or more identified working machines,” Eichhorn ¶ 0067 lines 1-7 “As would be appreciated, many agricultural vehicles 10 and implements are capable of extending and retracting structures to allow for field 6 operation and road 7 transport. The status of these structures may be monitored by the system 20, via any known or appreciated mechanism, in some implementations, to provide accurate collision warnings and prevent nuisance false alarms,” emphasis added, teaches these elements at the claimed level of generality within their broadest reasonable interpretation.
“Implements are capable of extending and retracting structures to allow for field operation” is a clear indication of an agricultural operation task, identifying field operation information from the extension or retraction of implements, and “the status of these structures is monitored” is within the broadest reasonable interpretation of identification and logging of these tasks. Furthermore, as noted in the prior rejection of Claim 6, Eichhorn ¶ 0065 “As previously noted, the maps 72, databases 58, and/or other media containing object 10 locations and other object 10 information may be used for […] automatic swath control operations. […] automatic swath control operations include the ability to turn on/off certain sections or rows of an implement, such as a planter or fertilizer/herbicide applicators,” emphasis added, teaches the combined use of a database of object information (logged operational status) with the map in order to implement fertilizer operations.

However, examiner agrees with the argument that Eichhorn does not take the further step of teaching “an indication of an operational task being performed.” In light of the amendments and the addition of claims 16 and 17, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made as necessitated by amendment in view of Eichhorn and Van Dyk et al (CN 114303613), which more explicitly teaches the claimed elements.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-15 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1

Step 1 of the Alice/Mayo framework considers whether the claims are directed to one of the four statutory classes of invention – method/process, machine/apparatus, manufacture, or composition of matter. Claims 1 and 14 are directed to machines. Claim 15 is directed to a method. Accordingly, claims 1, 14, and 15 are within at least one of the four statutory categories.

Step 2A

Step 2A of the Alice/Mayo framework considers whether claims are “directed to” an abstract idea. That is, whether the claims recite an abstract idea (Prong 1) and fail to integrate the abstract idea into a practical application (Prong 2).

Step 2A Prong 1

Regarding Prong One of Step 2A of the Alice/Mayo test (which collectively includes the guidance in the January 7, 2019 Federal Register notice and the October 2019 update issued by the USPTO as now incorporated into the MPEP, as supported by relevant case law), the claim limitations are to be analyzed to determine whether, under their broadest reasonable interpretation, they “recite” a judicial exception or in other words whether a judicial exception is “set forth” or “described” in the claims. MPEP 2106.04(III)(C)(2). An “abstract idea” judicial exception is subject matter that falls within at least one of the following groupings: a) certain methods of organizing human activity, b) mental processes, and/or c) mathematical concepts. MPEP 2106.04(a). Specifically, independent claim 1 recites the following, with the abstract idea emphasized.
(Additional elements (Prong 2, to be discussed in the subsequent section) are italicized):

A system for mapping agricultural operations within a working environment, the working environment having a plurality of objects located therein, the system comprising:
at least one imaging sensor mounted on an agricultural machine within the working environment and configured to obtain image data indicative of the working environment; and
one or more controllers, the one or more controllers being configured to:
receive the image data from the at least one imaging sensor;
analyze the image data, utilizing a detection model to classify the plurality of objects within the working environment;
identify, from the plurality of classified objects, one or more working machines within the working environment; and
log operational information associated with the one or more identified working machines within an operational map of the working environment in dependence on the identification of the plurality of classified objects;
wherein the operational information comprises an indication of an operational task being performed by the one or more identified working machines.

These limitations, as drafted, constitute “a mental process” because it is an observation/evaluation/judgment/analysis that can, at the currently claimed high level of generality, be practically performed in the human mind (e.g., with pen and paper). For instance, a person could analyze images to identify working machines mentally. Accordingly, the claim recites at least one abstract idea, and the claim does not recite anything that precludes it from the abstract idea grouping.

Step 2A Prong 2

Regarding Prong Two of Step 2A of the Alice/Mayo test, it must be determined whether the claim as a whole integrates the abstract idea into a practical application.
As noted at MPEP §2106.04(II)(A)(2), it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements such as merely using a computer to implement an abstract idea, adding insignificant extra solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.” MPEP §2106.05(I)(A).

The one or more controllers, the one or more controllers being configured to: […] utilizing a detection model merely describes how to generally “apply” the otherwise mental process using generic components.

The at least one imaging sensor mounted on an agricultural machine within the working environment and configured to obtain image data indicative of the working environment; […] one or more controllers, and utilizing a detection model are recited at a high level of generality and are merely generic components as described.

The receive the image data from the at least one imaging sensor; […] and log operational information associated with the one or more identified working machines within an operational map of the working environment in dependence on the identification of the plurality of classified objects; wherein the operational information comprises an indication of an operational task being performed by the one or more identified working machines limitations are recited at a high level of generality and amount to no more than insignificant extra-solution activity (i.e., sending and receiving data). The additional elements fail to integrate the judicial exception into a practical application (see MPEP 2106.05(g)) because they merely facilitate performing the mental process.
Thus, taken alone, the additional elements do not integrate the at least one abstract idea into a practical application. Looking at the additional limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. MPEP §2106.05(I)(A) and §2106.04(II)(A)(2). For these reasons, claims 1, 14, and 15 do not recite additional elements that integrate the judicial exception into a practical application.

Step 2B

Regarding Step 2B of the Alice/Mayo test, claims 1, 14, and 15 do not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception for reasons the same as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. As discussed above, the one or more controllers, the one or more controllers being configured to: […] utilizing a detection model additional element amounts to mere instructions to apply the exception. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. Use of a computer or other machinery in its ordinary capacity or simply adding a general-purpose computer or computer components after the fact to an abstract idea does not provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit).
As discussed above, the receive the image data from the at least one imaging sensor; […] and log operational information associated with the one or more identified working machines within an operational map of the working environment in dependence on the identification of the plurality of classified objects; wherein the operational information comprises an indication of an operational task being performed by the one or more identified working machines limitations amount to no more than insignificant extra-solution activity (i.e., sending and receiving data, gathering and analyzing information using conventional techniques), and recording analysis results is analogous to “mere automation of manual processes” (see MPEP 2106.05(d)(II)) and is recited at a high level of generality. The receipt of image data, use of a detection model, and logging of operational information as described in the claim are well-understood, routine, and conventional in the art.

Regarding the at least one imaging sensor mounted on an agricultural machine within the working environment and configured to obtain image data indicative of the working environment; […] one or more controllers, and utilizing a detection model, the recitation of claim limitations that attempt to cover any solution to an identified problem with no restriction on how the result is accomplished and no description of the mechanism for accomplishing the result does not integrate a judicial exception into a practical application or provide significantly more because this type of recitation is equivalent to the words “apply it.” See Electric Power Group, LLC v. Alstom, S.A., 830 F.3d 1350, 1356, 119 USPQ2d 1739, 1743-44 (Fed. Cir. 2016); Intellectual Ventures I v. Symantec, 838 F.3d 1307, 1327, 120 USPQ2d 1353, 1366 (Fed. Cir. 2016); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1417 (Fed. Cir. 2015).
Thus, even when viewed as an ordered combination, nothing in the claims adds significantly more (i.e., an inventive concept) to the abstract idea. The limitations of claim 1 are analogous to the limitations of claims 14 and 15, and thus the analysis of claim 1 is applied to claims 14 and 15.

Dependent Claims

The dependent claims 2-13 do not provide additional elements or a practical application to become eligible under 35 U.S.C. 101.

Claim 2: A system of claim 1, wherein the one or more controllers are configured to receive image data from at least one imaging sensor on or otherwise associated with each of a plurality of agricultural machines operating within the environment.

Claim 4: A system of claim 1, wherein the operational information includes positional information for the one or more identified working machines.

Claim 5: A system of claim 4, wherein the positional information includes one or more of: an absolute position of the one or more identified working machines; a relative position of the one or more identified working machines with respect to the at least one imaging sensor and/or with respect to the agricultural machine associated with the at least one imaging sensor from which the operational information is determined; and a direction of travel of the one or more identified working machines.

Claim 6: A system of claim 1, wherein the operational information comprises an indication of whether the working machine is or is not performing an operational task.

Claim 7: A system of claim 1, wherein the at least one imaging sensor includes: a camera; and/or a transceiver sensor.

Claim 8: A system of claim 1, wherein the detection model comprises a machine-learned model trained on one or more training datasets with known objects with respective classifications.
Claim 9: A system of claim 1, wherein the classification output by the detection model comprises a bounding box overlaid onto the image data at the location of the object as determined by the model; and wherein the position of the bounding box within the image data is utilized by the one or more controllers of the system for determining positional information for the respective one or more working machines.

Claim 10: A system of claim 1, wherein the one or more controllers are configured to receive positional data from a positioning system associated with the agricultural machine.

Claim 11: A system of claim 10, wherein the one or more controllers are configured to utilize the positional data for an agricultural machine and image data from at least one sensor associated with that machine to determine positional information for the one or more working machines in dependence thereon.

Claim 12: A system of claim 1, wherein the operational map is stored on a remote data server accessible through a suitable wireless communication link at the agricultural machine, or remotely by an operator for the working environment overseeing individual operational tasks performed therein.

Claim 13: A system of claim 1, comprising a user interface; and wherein the one or more controllers are configured to generate a representation of the operational map for display by, and optionally provide interaction via, the user interface.

Claim 16: The system of claim 1, wherein the indication of the operational task describes the type of operation being performed by the one or more working machines, the description of the type of operation being one of a harvesting operation, a tillage operation, a seeding operation, and a fertilizing operation.

Claim 17: The system of claim 16, wherein the indication of the operational task relates to performance of the operational task, the performance being one of a working height and a working depth.
These additional claim limitations recite mental processes and further narrow the abstract idea. They do not constitute a practical application of the abstract idea and do not amount to significantly more than the judicial exception. The localization and mapping, classification, and sensor limitations are recited at a high level of generality. The localization and mapping and sensor limitations are well understood, routine, and conventional within the art. Thus, the claims generally link the use of the abstract idea to a particular technological environment and do not integrate the judicial exception into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims, individually or in combination, do not include additional elements that are sufficient to amount to significantly more than the judicial exception at Step 2A or provide an inventive concept in Step 2B. For these reasons, there is no inventive concept in the claim, and thus it is ineligible.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-8 and 10-15 are rejected under 35 U.S.C. 103 as being unpatentable over Eichhorn (US 20220026226, hereinafter “Eichhorn”) in view of Van Dyk et al (CN 114303613, hereinafter “Van Dyk,” all citations and excerpts taken from the attached machine translation).

Regarding Claim 1, Eichhorn describes:

A system for mapping agricultural operations within a working environment, the working environment having a plurality of objects located therein, the system comprising:

at least one imaging sensor mounted on an agricultural machine within the working environment and configured to obtain image data indicative of the working environment; (Eichhorn ¶ 0046 lines 1-11 “In various implementations of the system 20, as shown in FIG. 3, an agricultural vehicle 22 is outfitted with one or more object detection sensors 30 to detect objects 10 that may provide a collision or navigation hazard.
In various implementations, the object detection sensors 30 may include, but are not limited to single cameras 36 such as Mobileye Supervision, stereo cameras 36 such as the Stereo Labs Zed 2i, cameras 36 utilizing structured light patterns such as the Intel RealSense 435, LIDAR 38 such as the Velodyne Puck, time of flight sensors 40 such as the IFM O3M, 1D, 2D and/or 3D distance sensors” and ¶ 0056 lines 1-11 “In certain implementations, the sensors 30 are constructed and arranged to detect objects 10 and measure the distance and direction of the object 10 from the vehicle 22 and/or the sensor(s),” the detected objects being indicative of the working environment)

and one or more controllers, the one or more controllers being configured to: receive the image data from the at least one imaging sensor; (Eichhorn ¶ 0052 lines 1-5 “In various implementations of the system 20, the operations system 50 further includes the various processing and computing components necessary for the operation of the system 20, including receiving, recording and processing the various received signals”)

analyze the image data, utilizing a detection model to classify the plurality of objects within the working environment; (Eichhorn ¶ 0059 lines 1-2 “In some implementations, the system 20 is able to classify the detected objects,” and ¶ 0061 lines 1-6 “In certain implementations, the classification process is automated and may implement various machine learning techniques, as would be readily appreciated. In various of these implementations, the system 20 may analyze sensor 30 feedback—such as but not limited to images, video,”)

identify, from the plurality of classified objects, one or more working machines within the working environment; (Eichhorn ¶ 0060 lines 1-15 “In various implementations, certain objects 10 may include unique identifiers for detection by the various sensors 30 to assist in object 10 classification. [...]
the unique identifier may be inherent to the object 10 such as a known pattern of lights, symbols, and/or other characteristics on a […] vehicle.”)

and log operational information associated with the one or more identified working machines (Eichhorn ¶ 0067 lines 1-7 “As would be appreciated, many agricultural vehicles 10 and implements are capable of extending and retracting structures to allow for field 6 operation and road 7 transport. The status of these structures may be monitored by the system 20, via any known or appreciated mechanism, in some implementations, to provide accurate collision warnings and prevent nuisance false alarms,” and ¶ 0065 “As previously noted, the maps 72, databases 58, and/or other media containing object 10 locations and other object 10 information may be used for […] automatic swath control operations,” the status of agricultural implements extending and retracting being operational information, and monitoring by the system being analogous to logging, and a database of object information teaching vehicle operational information being logged to a database)

within an operational map of the working environment in dependence on the identification of the plurality of classified objects; […] (Eichhorn ¶ 0048 lines 1-9 “In various implementations, the system's 20 object detection sensors 30 are used in conjunction with a vehicle 22 equipped with a Global Navigation Satellite System (“GNSS”) 32 to precisely locate and map obstacles 10 that may be collision or navigation hazards. That is, in certain implementations, a GNSS 32 is provided that is configured or otherwise interfaced with one or more sensors 30 to calculate and store object 10 locations in maps 72 or other databases 58 for future use and/or use by other vehicles.”)

Eichhorn does not explicitly teach: […] wherein the operational information comprises an indication of an operational task being performed by the one or more identified working machines.
Within the same field of endeavor as Eichhorn, Van Dyk teaches: […] wherein the operational information comprises an indication of an operational task being performed by the one or more identified working machines. (Van Dyk Pg 19 ¶ 3 “The processing system 338 processes the sensor data generated from the agricultural characteristic sensor 336 to generate the processed data, and some examples of which are described below. For example, agricultural characteristic sensor 336 may be an optical sensor, such as a camera or other device performing optical sensing. The optical sensor may generate an image indicative of various agricultural characteristics and related characteristics, such as non-machine characteristics or machine characteristics of the agricultural harvester 100 or another machine. processing system 338 processing one or more sensor signal (such as obtained from the optical sensor of the image) to generate identification one or more non-machine characteristics (such as field characteristics) of the processed sensor data (such as processed image data). or identifying one or more characteristics of the agricultural harvester 100 (such as machine setting, characteristic of the operation characteristic or machine performance) or the sensor data related to the processing of the characteristic,” emphasis added, teaching the identification of operational characteristics and performance of an agricultural harvester via image recognition)

Eichhorn and Van Dyk are considered analogous because they both relate to agricultural vehicle operations. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the monitoring of implements for field operation of Eichhorn with the simple substitution of Van Dyk’s identification of agricultural harvester operation characteristics or machine performance.
This modification would be made with a reasonable expectation of success as motivated by controlling the harvester to optimize the operation of the agricultural machinery (Van Dyk Pg 3 ¶ 8 – Pg 4 ¶ 1).

Regarding Claim 2, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 1 as described above. Eichhorn further describes: wherein the one or more controllers are configured to receive image data from at least one imaging sensor on or otherwise associated with each of a plurality of agricultural machines operating within the environment. (Eichhorn ¶ 0013 lines 1-14 “In Example 8 a navigation system comprising: one or more collision avoidance sensors disposed on a first vehicle configured to detect objects and object data; and an operations system in communication with the one or more collision avoidance sensors. […] and a communications link in communication with the storage medium, the communications link configured for transmitting object data to one or more second vehicles,”)

Regarding Claim 4, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 1 as described above. Eichhorn further describes: wherein the operational information includes positional information for the one or more identified working machines. (Eichhorn ¶ 0048 lines 1-9 “In various implementations, the system's 20 object detection sensors 30 are used in conjunction with a vehicle 22 equipped with a Global Navigation Satellite System (“GNSS”) 32 to precisely locate and map obstacles 10 that may be collision or navigation hazards. That is, in certain implementations, a GNSS 32 is provided that is configured or otherwise interfaced with one or more sensors 30 to calculate and store object 10 locations in maps 72 or other databases 58 for future use and/or use by other vehicles.”)

Regarding Claim 5, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 4 as described above.
Eichhorn further describes: wherein the positional information includes one or more of: an absolute position of the one or more identified working machines; a relative position of the one or more identified working machines with respect to the at least one imaging sensor and/or with respect to the agricultural machine associated with the at least one imaging sensor from which the operational information is determined; (Eichhorn ¶ 0056 “In certain implementations, the sensors 30 are constructed and arranged to detect objects 10 and measure the distance and direction of the object 10 from the vehicle 22 and/or the sensor(s) 30. In some implementations, the system 20 is constructed and arranged to work in conjunction with the GPS 32 to calculate and store the location of the object 10, as would be appreciated. That is, the system 20 is constructed and arranged to survey the location of an object 10 by offsetting the reported location of the vehicle 22 by the distance and direction of the object 10 detected by one or more sensors 30.”) and a direction of travel of the one or more identified working machines. (Eichhorn ¶ 0047 lines 11-14 “the system 20 may implement sensor fusion to include a laser rangefinder or other second sensor 30 to determine the distance and direction/orientation of the object 10 to the sensor 30.”)

Regarding Claim 6, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 1 as described above. Eichhorn does not explicitly teach: wherein the operational information comprises an indication of whether the working machine is or is not performing an operational task. Within the same field of endeavor as Eichhorn, Van Dyk teaches: wherein the operational information comprises an indication of whether the working machine is or is not performing an operational task.
(Van Dyk Pg 19 ¶ 3 “identifying one or more characteristics of the agricultural harvester 100 (such as machine setting, characteristic of the operation characteristic or machine performance) or the sensor data related to the processing of the characteristic,” emphasis added, operation characteristics and machine performance being indications of whether or not the harvester is performing a task) Eichhorn and Van Dyk are considered analogous because they both relate to agricultural vehicle operations. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the monitoring of implements for field operation of Eichhorn with the simple substitution of Van Dyk’s identification of agricultural harvester operation characteristics or machine performance. This modification would be made with a reasonable expectation of success as motivated by controlling the harvester to optimize the operation of the agricultural machinery (Van Dyk Pg 3 ¶ 8 – Pg 4 ¶ 1).

Regarding Claim 7, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 1 as described above. Eichhorn further describes: wherein the at least one imaging sensor includes: a camera; (Eichhorn ¶ 0046 lines 1-7 “In various implementations of the system 20, as shown in FIG. 3, an agricultural vehicle 22 is outfitted with one or more object detection sensors 30 to detect objects 10 that may provide a collision or navigation hazard. In various implementations, the object detection sensors 30 may include, but are not limited to single cameras 36 such as Mobileye Supervision, stereo cameras 36”) and/or a transceiver sensor.
(Eichhorn ¶ 0046 lines 8-12 “LIDAR 38 such as the Velodyne Puck, time of flight sensors 40 such as the IFM O3M, 1D, 2D and/or 3D distance sensors 42 such as but not limited to NXP TEF810x radar”)

Regarding Claim 8, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 1 as described above. Eichhorn further describes: wherein the detection model comprises a machine-learned model (Eichhorn ¶ 0061 lines 1-6 “In certain implementations, the classification process is automated and may implement various machine learning techniques, as would be readily appreciated. In various of these implementations, the system 20 may analyze sensor 30 feedback—such as but not limited to images, video, and point clouds”) trained on one or more training datasets with known objects with respective classifications. (Eichhorn ¶ 0062 lines 1-16 “In various implementations, the CNN or other artificial intelligence system is configured to classify objects 10 […] the system 20 may be configured to prompt a user to confirm or reject the classification of an object 10 […] The user responses may then be used to correctly classify the object 10, and also to train the CNN or other artificial intelligence system and improve recognition accuracy, as would be appreciated.”)

Regarding Claim 10, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 1 as described above. Eichhorn further describes: wherein the one or more controllers are configured to receive positional data from a positioning system associated with the agricultural machine. (Eichhorn ¶ 0006 lines 1-3 “In Example 1 a collision avoidance system comprising: […] a GPS disposed on the vehicle”)

Regarding Claim 11, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 10 as described above.
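The confirm-or-reject feedback loop quoted above from Eichhorn ¶¶ 0061-0062 (classify a detection, prompt the user to confirm or correct it, and fold the answer back into the training data) can be illustrated with a toy classifier. Everything here is a sketch: the class name, the nearest-neighbour stand-in for a CNN, and the feature tuples are all illustrative assumptions, not disclosure from the cited references.

```python
from dataclasses import dataclass, field

@dataclass
class FeedbackClassifier:
    """Toy stand-in for the confirm/reject training loop: classify a
    detection, ask the user to confirm or correct the label, and add
    the confirmed example to the training set."""
    training_set: list = field(default_factory=list)  # (features, label) pairs

    def classify(self, features):
        # Nearest-neighbour placeholder for a learned model (e.g. a CNN).
        if not self.training_set:
            return "unknown"
        best = min(self.training_set,
                   key=lambda ex: sum((a - b) ** 2 for a, b in zip(ex[0], features)))
        return best[1]

    def confirm(self, features, proposed_label, user_accepts, corrected_label=None):
        # Whether the user accepts or corrects the label, the outcome
        # becomes labelled training data that improves later predictions.
        label = proposed_label if user_accepts else corrected_label
        if label is not None:
            self.training_set.append((features, label))
        return label
```

The point of the sketch is only the data flow: user responses accumulate as labelled examples, so recognition accuracy can improve over time, as the quoted paragraph describes.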
Eichhorn further describes: wherein the one or more controllers are configured to utilize the positional data for an agricultural machine and image data from at least one sensor associated with that machine to determine positional information for the one or more working machines in dependence thereon. (Eichhorn ¶ 0056 “In certain implementations, the sensors 30 are constructed and arranged to detect objects 10 and measure the distance and direction of the object 10 from the vehicle 22 and/or the sensor(s) 30. In some implementations, the system 20 is constructed and arranged to work in conjunction with the GPS 32 to calculate and store the location of the object 10, as would be appreciated. That is, the system 20 is constructed and arranged to survey the location of an object 10 by offsetting the reported location of the vehicle 22 by the distance and direction of the object 10 detected by one or more sensors 30.”)

Regarding Claim 12, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 1 as described above. Eichhorn further describes: wherein the operational map is stored on a remote data server (Eichhorn ¶ 0078 lines 11-13 “In various implementations, the map 72 is stored in the cloud 70 or other remote database 58 in communication with the vehicle 22”) accessible through a suitable wireless communication link at the one or more agricultural machine, (Eichhorn ¶ 0078 lines 14-16 “via a communications component 64, such as but not limited to a cellular link, wi-fi, mobile data, radio link, or other wireless communication service as would be appreciated.”) or remotely by an operator for the working environment overseeing individual operational tasks performed therein. (Eichhorn ¶ 0054 “Continuing with FIGS. 3 and 4, in further implementations, the display 52 and/or remote cloud system 70 include a graphical user interface (“GUI”) 62 and a graphics processing unit (“GPU”) 63.
In these and other implementations, the GUI 62 and/or GPU 63 allows for the display of information to a user and optionally for a user to interact with the displayed information, as would be readily appreciated. It would be understood that various input methods are possible for user interaction including but not limited to a touch screen, various buttons, a keyboard, or the like.”)

Regarding Claim 13, the combination of Eichhorn and Van Dyk teaches the limitations of Claim 1 as described above. Eichhorn further describes: comprising a user interface; (Eichhorn ¶ 0054 lines 1-6 “Continuing with FIGS. 3 and 4, in further implementations, the display 52 and/or remote cloud system 70 include a graphical user interface (“GUI”) 62 and a graphics processing unit (“GPU”) 63. In these and other implementations, the GUI 62 and/or GPU 63 allows for the display of information to a user”) and wherein the one or more controllers are configured to generate a representation of the operational map for display by, (Eichhorn ¶ 0078 lines 16-19 “In some implementations, the map 72 and/or database 58 may be stored on physical media that is transferred between vehicles 22, 23, such a via a portable display 52.”) and optionally provide interaction via, the user interface. (Eichhorn ¶ 0054 lines 6-10 “and optionally for a user to interact with the displayed information, as would be readily appreciated.
It would be understood that various input methods are possible for user interaction including but not limited to a touch screen, various buttons, a keyboard, or the like.”)

Regarding Claim 14, Eichhorn describes: A control system for mapping one or more agricultural operations within a working environment, the working environment having a plurality of objects located therein, the control system comprising one or more controllers, and being configured to: receive image data (Eichhorn ¶ 0052 lines 1-5 “In various implementations of the system 20, the operations system 50 further includes the various processing and computing components necessary for the operation of the system 20, including receiving, recording and processing the various received signals”) from at least one imaging sensor mounted on an agricultural machine within the working environment (Eichhorn ¶ 0046 lines 1-11 “In various implementations of the system 20, as shown in FIG. 3, an agricultural vehicle 22 is outfitted with one or more object detection sensors 30 to detect objects 10 that may provide a collision or navigation hazard.
In various implementations, the object detection sensors 30 may include, but are not limited to single cameras 36 such as Mobileye Supervision, stereo cameras 36 such as the Stereo Labs Zed 2i, cameras 36 utilizing structured light patterns such as the Intel RealSense 435, LIDAR 38 such as the Velodyne Puck, time of flight sensors 40 such as the IFM O3M, 1D, 2D and/or 3D distance sensors”) and configured to obtain image data indicative of the working environment and/or one or more objects located therein; (Eichhorn ¶ 0056 lines 1-11 “In certain implementations, the sensors 30 are constructed and arranged to detect objects 10 and measure the distance and direction of the object 10 from the vehicle 22 and/or the sensor(s)”) analyze the image data, utilizing a detection model to classify one or more of the plurality of objects within the working environment; (Eichhorn ¶ 0059 lines 1-2 “In some implementations, the system 20 is able to classify the detected objects,” and ¶ 0061 lines 1-6 “In certain implementations, the classification process is automated and may implement various machine learning techniques, as would be readily appreciated. In various of these implementations, the system 20 may analyze sensor 30 feedback—such as but not limited to images, video,”) identify, from the plurality of classified objects, one or more working machines within the working environment; (Eichhorn ¶ 0060 lines 1-15 “In various implementations, certain objects 10 may include unique identifiers for detection by the various sensors 30 to assist in object 10 classification. [...]
the unique identifier may be inherent to the object 10 such as a known pattern of lights, symbols, and/or other characteristics on a […] vehicle.”) and control the logging of operational information associated with the one or more identified working machines (Eichhorn ¶ 0067 lines 1-7 “As would be appreciated, many agricultural vehicles 10 and implements are capable of extending and retracting structures to allow for field 6 operation and road 7 transport. The status of these structures may be monitored by the system 20, via any known or appreciated mechanism, in some implementations, to provide accurate collision warnings and prevent nuisance false alarms,” and ¶ 0065 “As previously noted, the maps 72, databases 58, and/or other media containing object 10 locations and other object 10 information may be used for […] automatic swath control operations,” the status of agricultural implements extending and retracting being operational information, and monitoring by the system being analogous to logging, and a database of object information teaching vehicle operational information being logged to a database) within an operational map of the working environment in dependence on the identification of the plurality of classified objects; […] (Eichhorn ¶ 0048 lines 1-9 “In various implementations, the system's 20 object detection sensors 30 are used in conjunction with a vehicle 22 equipped with a Global Navigation Satellite System (“GNSS”) 32 to precisely locate and map obstacles 10 that may be collision or navigation hazards. That is, in certain implementations, a GNSS 32 is provided that is configured or otherwise interfaced with one or more sensors 30 to calculate and store object 10 locations in maps 72 or other databases 58 for future use and/or use by other vehicles.”) Eichhorn does not explicitly teach: […] wherein the operational information comprises an indication of an operational task being performed by the one or more identified working machines. 
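The logging step mapped above (operational information recorded against identified working machines within an operational map of the environment) can be sketched as a simple keyed log. This is an illustration only; the class and field names are hypothetical and do not come from the application or from Eichhorn's maps 72 / databases 58.

```python
from collections import defaultdict
from datetime import datetime, timezone

class OperationalMap:
    """Minimal sketch of an operational map: per-machine log entries
    recording classification, position, and (optionally) the operational
    task the identified working machine is performing."""
    def __init__(self):
        self._log = defaultdict(list)  # machine_id -> list of entries

    def log(self, machine_id, classification, position, task=None):
        # Each entry pairs a timestamped position with the machine's
        # classification and any observed operational task.
        self._log[machine_id].append({
            "time": datetime.now(timezone.utc).isoformat(),
            "class": classification,
            "position": position,
            "task": task,
        })

    def entries(self, machine_id):
        return list(self._log[machine_id])
```

The optional `task` field mirrors the distinction the rejection draws: Eichhorn is read as logging object locations and implement status, while Van Dyk is relied on for the indication of the operational task being performed.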
Within the same field of endeavor as Eichhorn, Van Dyk teaches: […] wherein the operational information comprises an indication of an operational task being performed by the one or more identified working machines. (Van Dyk Pg 19 ¶ 3 “The processing system 338 processes the sensor data generated from the agricultural characteristic sensor 336 to generate the processed data, and some examples of which are described below. For example, agricultural characteristic sensor 336 may be an optical sensor, such as a camera or other device performing optical sensing. The optical sensor may generate an image indicative of various agricultural characteristics and related characteristics, such as non-machine characteristics or machine characteristics of the agricultural harvester 100 or another machine. processing system 338 processing one or more sensor signal (such as obtained from the optical sensor of the image) to generate identification one or more non-machine characteristics (such as field characteristics) of the processed sensor data (such as processed image data). or identifying one or more characteristics of the agricultural harvester 100 (such as machine setting, characteristic of the operation characteristic or machine performance) or the sensor data related to the processing of the characteristic,” emphasis added, teaching the identification of operational characteristics and performance of an agricultural harvester via image recognition) Eichhorn and Van Dyk are considered analogous because they both relate to agricultural vehicle operations. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the monitoring of implements for field operation of Eichhorn with the simple substitution of Van Dyk’s identification of agricultural harvester operation characteristics or machine performance. 
This modification would be made with a reasonable expectation of success as motivated by controlling the harvester to optimize the operation of the agricultural machinery (Van Dyk Pg 3 ¶ 8 – Pg 4 ¶ 1).

Regarding Claim 15, Eichhorn describes: A method of mapping agricultural operations within a working environment, the working environment having a plurality of objects located therein, the method comprising: receiving image data (Eichhorn ¶ 0052 lines 1-5 “In various implementations of the system 20, the operations system 50 further includes the various processing and computing components necessary for the operation of the system 20, including receiving, recording and processing the various received signals”) from at least one imaging sensor mounted on an agricultural machine within the working environment (Eichhorn ¶ 0046 lines 1-11 “In various implementations of the system 20, as shown in FIG. 3, an agricultural vehicle 22 is outfitted with one or more object detection sensors 30 to detect objects 10 that may provide a collision or navigation hazard.
In various implementations, the object detection sensors 30 may include, but are not limited to single cameras 36 such as Mobileye Supervision, stereo cameras 36 such as the Stereo Labs Zed 2i, cameras 36 utilizing structured light patterns such as the Intel RealSense 435, LIDAR 38 such as the Velodyne Puck, time of flight sensors 40 such as the IFM O3M, 1D, 2D and/or 3D distance sensors”) and configured to obtain image data indicative of the working environment and/or one or more objects located therein; (Eichhorn ¶ 0056 lines 1-11 “In certain implementations, the sensors 30 are constructed and arranged to detect objects 10 and measure the distance and direction of the object 10 from the vehicle 22 and/or the sensor(s)”) analyzing the image data, utilizing a detection model to classify the plurality of objects within the working environment; (Eichhorn ¶ 0059 lines 1-2 “In some implementations, the system 20 is able to classify the detected objects,” and ¶ 0061 lines 1-6 “In certain implementations, the classification process is automated and may implement various machine learning techniques, as would be readily appreciated. In various of these implementations, the system 20 may analyze sensor 30 feedback—such as but not limited to images, video,”) identifying, from the plurality of classified objects, one or more working machines within the working environment; (Eichhorn ¶ 0060 lines 1-15 “In various implementations, certain objects 10 may include unique identifiers for detection by the various sensors 30 to assist in object 10 classification. [...] 
the unique identifier may be inherent to the object 10 such as a known pattern of lights, symbols, and/or other characteristics on a […] vehicle.”) and controlling the logging of operational information associated with the one or more identified working machines (Eichhorn ¶ 0067 lines 1-7 “As would be appreciated, many agricultural vehicles 10 and implements are capable of extending and retracting structures to allow for field 6 operation and road 7 transport. The status of these structures may be monitored by the system 20, via any known or appreciated mechanism, in some implementations, to provide accurate collision warnings and prevent nuisance false alarms,” and ¶ 0065 “As previously noted, the maps 72, databases 58, and/or other media containing object 10 locations and other object 10 information may be used for […] automatic swath control operations,” the status of agricultural implements extending and retracting being operational information, and monitoring by the system being analogous to logging, and a database of object information teaching vehicle operational information being logged to a database) within an operational map of the working environment in dependence on the identification of the plurality of classified objects; (Eichhorn ¶ 0048 lines 1-9 “In various implementations, the system's 20 object detection sensors 30 are used in conjunction with a vehicle 22 equipped with a Global Navigation Satellite System (“GNSS”) 32 to precisely locate and map obstacles 10 that may be collision or navigation hazards. That is, in certain implementations, a GNSS 32 is provided that is configured or otherwise interfaced with one or more sensors 30 to calculate and store object 10 locations in maps 72 or other databases 58 for future use and/or use by other vehicles.”) Eichhorn does not explicitly teach: […] wherein the operational information comprises an indication of an operational task being performed by the one or more identified working machines. 
Within the same field of endeavor as Eichhorn, Van Dyk teaches: […] wherein the operational information comprises an indication of an operational task being performed by the one or more identified working machines. (Van Dyk Pg 19 ¶ 3 “The processing system 338 processes the sensor data generated from the agricultural characteristic sensor 336 to generate the processed data, and some examples of which are described below. For example, agricultural characteristic sensor 336 may be an optical sensor, such as a camera or other […]”)

Prosecution Timeline

Jun 16, 2023
Application Filed
Mar 21, 2025
Non-Final Rejection — §101, §103
Jun 25, 2025
Response Filed
Sep 17, 2025
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591246
METHOD AND DEVICE FOR REMOTELY CONTROLLING A VEHICLE
2y 5m to grant · Granted Mar 31, 2026
Patent 12559259
AIRCRAFT TEST SYSTEM
2y 5m to grant · Granted Feb 24, 2026
Patent 12515638
METHOD AND APPARATUS FOR OPTIMAL CONTROL OF DRIVING TORQUE FOR SMOOTH RIDE ON UNEVEN ROAD
2y 5m to grant · Granted Jan 06, 2026
Patent 12496711
MULTI-LEGGED ROBOT LOAD BALANCING METHOD, MULTI-LEGGED ROBOT, AND STORAGE MEDIUM
2y 5m to grant · Granted Dec 16, 2025
Patent 12468301
WORKING ROBOT SYSTEM
2y 5m to grant · Granted Nov 11, 2025
Based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
64%
Grant Probability
99%
With Interview (+61.5%)
2y 8m
Median Time to Grant
Moderate
PTA Risk
Based on 22 resolved cases by this examiner. Grant probability derived from career allow rate.
