DETAILED ACTION
This Final Office Action is in response to amendments filed 12/17/2025.
Claims 1, 6, 7, 9, 12, 14, 18, 19, 21, and 23 have been amended.
Claims 1-23 are pending.
Drawings
The amendments to the drawings filed 12/17/2025 have been entered by the Examiner.
Specification
The amendments to the specification filed 12/17/2025 have been entered by the Examiner.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 10/21/2025 has been considered by the examiner.
Response to Arguments
Claim Objections
Due to the amendments filed 12/17/2025, the objections to claims 9 and 21 have been withdrawn and replaced with rejections under 35 U.S.C. 112(b). Specifically, the independent claims recite multiple instances of “sensor data” (e.g., sensor data transmitted to the one or more processors, sensor data from a first sensor type, sensor data from a second sensor type, and sensor data from a first sensor type and a second sensor type), such that proper antecedent basis of “the sensor data” of claims 9 and 21 cannot be clearly determined.
Rejections under 35 U.S.C. 112
Due to the amendments filed 12/17/2025, the issues discussed with respect to the rejection of claim 12 under 35 U.S.C. 112(b) have been resolved; however, claims 12, 13, and 23 remain rejected under 35 U.S.C. 112(b) in the present Office Action. Specifically, claims 12 and 13 are rejected under 35 U.S.C. 112(b) for incorporating the errors of claim 9, and the dependency of claim 23 has not been changed in the amendments filed 12/17/2025 to resolve the insufficient antecedent basis discussed in paragraph 14 of the Office Action mailed 9/17/2025.
Rejections under 35 U.S.C. 102 and 103
Applicant’s arguments with respect to the claims have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Specifically, new references have been applied to the amendments filed 12/17/2025.
Examiner’s Note
To enhance clarity, claim language is underlined throughout this Office Action.
Citations to the prior art are provided in parentheses following each claim limitation, along with any necessary supplemental explanations.
While no allowable subject matter has been indicated at this time, the Examiner has provided suggested amendments within the rejections of claims 6 and 18 to assist the Applicant in reaching allowable subject matter.
Claim Objections
Claims 2 and 5 are objected to because of the following informalities:
Claims 2 and 5 recite the limitation of one or more sensors. However, independent claim 1 has been amended to recite “a plurality of sensors.” To ensure antecedent basis, claims 2 and 5 should be amended to align with the amended limitations of claim 1.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 9-13, 21, and 23 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 9 recites the limitation of the sensor data. One of ordinary skill in the art cannot clearly determine if this limitation is referencing the “sensor data” transmitted to the central processor of claim 9, the “sensor data” transmitted to the one or more processors of claim 1, the “sensor data” from the first sensor type of claim 1, the “sensor data” from the second sensor type of claim 1, or the “sensor data” from the first sensor type and the second sensor type of claim 1. Claim 21 is rejected under 35 U.S.C. 112(b) for similar reasons.
Claims 10-13 are rejected under 35 U.S.C. 112(b) for incorporating the errors of claim 9 by dependency.
Claim 23 recites the limitations of the one or more predicted objects and the central processor. There is insufficient antecedent basis for these limitations in the claim. Specifically, claim 23 depends from independent claim 14. A central processor and one or more predicted objects cannot be considered inherent features of the method.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-6, 8, 14-18, 20, and 23 are rejected under 35 U.S.C. 103 as being unpatentable over Honkanen (US 2022/0391644 A1), hereinafter Honkanen, in view of McKinnis, JR. et al. (US 2019/0327897 A1), hereinafter McKinnis.
Claim 1
Honkanen discloses the claimed system for monitoring a portion of an agricultural field during operation of an agricultural work machine (see Figure 3, depicting system 11, described as applicable to any agricultural application in ¶0111, e.g., embodiments in which the agricultural tool is a chopper or mower, as described in ¶0088), the system comprising:
one or more processors (i.e. data processing unit 150, as described in ¶0114);
a plurality of sensors configured to transmit sensor data to the one or more processors (see ¶0112-0113, regarding sensor 118, position unit 130, and imaging devices 120 providing captured images and sensed data for processing by data processing unit 150, as described in ¶0014); and
a memory device coupled to the one or more processors, the memory device including instructions that when executed by the one or more processors cause the one or more processors to perform the following claimed steps (see ¶0117, regarding data processing unit 150 includes specific software; ¶0041, regarding that the invention is operated using suitable software).
Honkanen further discloses that the processor is configured to determine a presence of one or more objects disposed in of a portion of an agricultural field disposed about a predetermined area of the work machine based on the sensor data from a first sensor type of the plurality of sensors (see ¶0115, regarding imaging devices 120 transfer their images to data interpretation unit 151 of data processing unit 150 for calculating an interpretation of any detected features such as feature 1110, depicted as a stone 1110 in agricultural field 1100 in Figure 3). A “predetermined area” may be reasonably taught by the sensor’s detection region. While Honkanen describes the system as being applicable to harvesting operations, where the agricultural working means can be any agricultural tool, such as a mower (see ¶0038), Honkanen does not explicitly recite a header. However, because mowers use header-style attachments, it would have been obvious to modify the agricultural working means to be a header, in light of McKinnis.
Specifically, McKinnis teaches that common harvesters such as mowers include a header (see ¶0001; ¶0048, with respect to Figure 1, depicting harvester 100 including header 102).
Since the systems of Honkanen and McKinnis are directed to the same purpose, i.e. agricultural machines that perform harvesting operations, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the work machine of Honkanen to include a header, in light of McKinnis, such that the one or more objects taught by Honkanen may be reasonably disposed in of a portion of an agricultural field disposed about a predetermined area of a header of the work machine, with the predictable result of incorporating a header known to be included in common harvesters such as mowers (see ¶0001 of McKinnis), applicable to an embodiment of the work machine of Honkanen (see ¶0038).
Honkanen further discloses that the processor is configured to determine the presence of the one or more objects based on sensor data from a second sensor type of the plurality of sensors (see ¶0115, regarding data interpretation unit 151 considers the sensor data of sensor 118, defined as a draft force sensor in ¶0112; ¶0038-0039, regarding information from different sensors are used in combination with imaging data, where the sensors include force sensors, defined as being used to identify certain events, e.g., hitting a stone or driving over a curb, as described in ¶0031-0032).
Honkanen further discloses that the processor is configured to combine data from the first sensor and the second sensor (see ¶0038-0039, regarding that during a working operation, data from different sensors, e.g., force sensor, are stored with imaging data). Because the limitation of “data” does not reference the “sensor data” of the preceding limitations and the combined “data” is not used in the subsequent limitations, these limitations have been accorded their broadest reasonable interpretation in view of the prior art.
Honkanen further discloses that the processor is configured to:
determine one or more features of the one or more objects based on the sensor data from the first sensor type and the second sensor type (see ¶0117, regarding that feature detection unit 152, feature location determination unit 154, and feature determination unit 156 of data processing unit 150 identifies features in the images and/or map created by the images, calculates the positions of the detected features, and identifies respective attributes of the features; ¶0038-0039, regarding that different sensors are used in combination with the imaging data to enhance the characterization of the area detected, e.g., associating the instance of draft force with a stone); and
determine whether the one or more objects are desired objects based at least in part on the one or more features of the one or more objects (see ¶0062, regarding the various applications in which the determined features are applied, e.g., “desirably” detecting uniformity of furrows or “undesirably” detecting obstacles; ¶0123, regarding that machine learning unit 158 is included in data processing unit 150, where the machine learning unit is trained for the detection and determination of different objectives, such as stones and animals, using force and optical data, as described in ¶0080).
Because the determined “desired objects” are not used in subsequent limitations, the limitations involved in the determination of desired objects have been accorded their broadest reasonable interpretation in view of the prior art. Further, the description of a determined “feature” in the Applicant’s disclosure is not based on a fusion of data from different types of sensors (see paragraph [0063] of the specification filed 4/30/2024); therefore, Honkanen discloses the claimed determination of “one or more features” to the same extent as the Applicant’s disclosure.
Honkanen further discloses that the processor is configured to:
determine one or more object identifiers of the one or more objects (see ¶0046, regarding that feature determination unit characterizes an identified feature by its attributes, e.g., it may be determined whether an identified obstacle is a stone or an animal); and
transmit instructions to one or more systems to adjust performance of the agricultural machine based at least in part on the one or more object identifiers (see ¶0076, regarding that data processing unit automatically induces a controlling response based on the determined features at the agricultural working means, which is accordingly controlled based on the controlling response).
Claims 2 and 15
Honkanen further discloses that the one or more sensors comprises an optical sensor (see ¶0112-0113, regarding imaging devices 120 providing captured images for processing by data processing unit 150, as described in ¶0014), as discussed in the rejection of claim 1.
Claims 3 and 16
Honkanen further discloses that the one or more processors are further configured to classify the one or more objects (see ¶0046, regarding that feature determination unit characterizes an identified feature by its attributes, e.g., it may be determined whether an identified obstacle is a stone or an animal).
Claims 4 and 17
Honkanen further discloses that the one or more processors are further configured to determine a predicted interaction of the one or more objects based at least in part on classification information (see ¶0013-0016, regarding that a predicted force is determined using previously collected draft force data and other data in order to predict discrete events, such as stone detection).
Claim 5
Honkanen further discloses that the one or more sensors comprise a vibration sensor and an optical sensor (see ¶0032, regarding that the system comprises an accelerometer; ¶0036, regarding that the system comprises at least one imaging device), and wherein the processor is configured to determine one or more objects based at least in part on optical data and vibration data from the vibration sensor and the optical sensor (see ¶0038-0039, regarding that information from different sensors are used in combination with imaging data to characterize the detected area, e.g., classify a geographic object using image data and drag force, where acceleration data supplements the force data to better identify events, such as hitting a stone, as described in ¶0032).
Claims 6 and 18
While Honkanen, as modified by McKinnis, further discloses the work machine as comprising a header, the combination of Honkanen and McKinnis does not explicitly disclose that the predetermined area comprises a ground portion of the agricultural field between the header and the work machine. However, the “plurality of sensors” is not defined in the claim language, and therefore, the application of Honkanen to the limitations of “first sensor type” and “second sensor type” may be transposed, such that the “first sensor type” used to determine the presence of an object in the “predetermined area” may alternatively be force sensor 20 attached to working implement 10 in the area of the hitch (see ¶0089 of Honkanen, with respect to Figure 1). Modifying the working implement 10 to be a header, in light of the combination of Honkanen and McKinnis discussed in the rejection of claim 1, teaches that the predetermined area comprises a ground portion of the agricultural field between the header and the work machine, where the “predetermined area” may be reasonably taught by the sensor’s detection region.
Upon further search and consideration, the limitations of claims 6 and 18 may overcome the identified prior art when the limitation of the “first sensor type” is limited to be an optical sensor.
Claims 8 and 20
Honkanen further discloses that the processor is configured to receive a field location of the one or more objects (see ¶0117, regarding that feature location determination unit 154 calculates the position of the detected feature, where position information is provided from position unit 130, as described in ¶0113, which is used to interpret the optical data of imaging devices 120 with consideration of sensor data of sensor 118, as described in ¶0115).
Claim 14
Honkanen, as modified by McKinnis, discloses the claimed method for monitoring a portion of an agricultural field during operation of an agricultural work machine, as discussed in the rejection of claim 1.
Claim 23
Due to the lack of clear antecedent basis and resulting ambiguity discussed in the rejection of claim 23 under 35 U.S.C. 112(b), the scope of the limitations of claim 23 has been interpreted broadly for the purposes of the prior art rejection.
Honkanen further discloses receiving the one or more predicted objects from the central processor (see ¶0038, regarding a user may access a map of detected features, e.g., a stone, that has been transferred via cloud to a remote location).
Claims 7 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Honkanen in view of McKinnis, and in further view of Zielke et al. (US 2022/0225569 A1), hereinafter Zielke.
Claims 7 and 19
Given that the imaging devices of Honkanen are designed to acquire images of an environment of the agricultural working means (see ¶0036), defined as any agricultural tool in ¶0038, Honkanen, as modified by McKinnis, further discloses that the predetermined area comprises a ground portion of the agricultural field in front of the header with respect to a direction of travel of the work machine (see Figure 3, depicting imaging devices 120 arranged to image the front and rear environment of driving means 114). To the extent the combination of Honkanen and McKinnis does not reasonably teach the “predetermined area” in front of a header, Zielke is applied to more clearly teach this known configuration.
Specifically, Zielke teaches a harvester 12 that includes a header 14 in ¶0072, with respect to Figure 10A (similar to the agricultural work machine taught by the combination of Honkanen and McKinnis) with a processor 24 (similar to the one or more processors taught by Honkanen) that determines ground conditions that include rocks (similar to the one or more objects taught by Honkanen) based on images of the field during harvest captured by video camera 40 (similar to the sensor data from a first sensor type taught by Honkanen) (see ¶0087, with respect to steps 120 and 122 of Figure 12, regarding video sensor 40 records real-time images of the field during harvest such that image classification processor can process the recorded images to detect various ground conditions, where the ground conditions may include rocks detected in real-time, as described in ¶0070), such that the area of the field captured by video camera 40 (similar to the predetermined area taught by Honkanen) comprises a ground portion of the agricultural field in front of header 14 with respect to a direction of travel of harvester 12 (see ¶0089-0090, regarding recognition of an approach of harvester 12 to a rock so as to adjust the header 14, which is known to be damaged by collision with rocks, as described in ¶0070).
Since the systems of Honkanen and Zielke are directed to the same purpose, i.e., detecting obstacles in a field using a camera installed on an agricultural machine during harvesting operations, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the first sensor type of the plurality of sensors taught by Honkanen to be positioned such that the predetermined area comprises a ground portion of the agricultural field in front of the header with respect to a direction of travel of the work machine, in the same manner that a video camera of Zielke is positioned to detect an area in front of a header, with the predictable result of allowing the camera to view rocks or similar hazards before entering the header to prevent equipment damage and loss of productivity (¶0070 of Zielke).
Claims 9, 12, 13, and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Honkanen in view of McKinnis, and in further view of Gresch et al. (US 2019/0174667 A1), hereinafter Gresch.
Claims 9 and 21
While Honkanen further discloses that the processor is configured to transmit sensor data to a central processor (see ¶0038, regarding that sensor data and optical images are transferred via cloud to a remote location for further processing), Honkanen does not further disclose that the central processor is configured to determine one or more future predicted object locations based at least in part on the sensor data transmitted to the central processor. However, because the “predicted object locations” are not defined as being associated with the “one or more objects” and the “sensor data” is not defined as being associated with particular sensor(s), prior art may be reasonably combined to teach this claimed feature.
Specifically, Gresch teaches an agricultural machine that may be defined as a harvesting machine (similar to the agricultural work machine taught by Honkanen) (see ¶0017) that includes a control unit 94 (similar to the processor taught by Honkanen) configured to transmit sensor data to a central processor (see ¶0021, regarding signals of the sensor and/or detected position of the detected foreign object is transmitted to a remote site, such as a computer center, where the sensors include optical sensors, as described in ¶0018), wherein the central processor is configured to determine one or more future predicted object locations based at least in part on the sensor data (see ¶0050, regarding the coordinate points are processed at the remote site to accumulate mapped points that can be transmitted to other machines for early indication of hazard sites, such as stones; ¶0045, regarding stone detection processes using cameras). No steps are claimed for performing a “prediction;” therefore, the “future predicted object locations” may be reasonably taught by the mapped locations of stones accessed for subsequent (“future”) operations (see ¶0017) that may have been manually collected after mapping (see ¶0003) and are thus expected (“predicted”) locations.
Since the systems of Honkanen and Gresch are directed to the same purpose, i.e. detecting obstacles by an agricultural work machine traversing a field, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the processor of Honkanen to be further configured to transmit sensor data to a central processor, wherein the central processor is configured to determine one or more future predicted object locations based at least in part on the sensor data, in the same manner that similar sensor data of Gresch is transmitted to a remote computer for mapping stone locations that can be accessed by agricultural machines for subsequent operation, with the predictable result of remembering the locations of objects that may damage agricultural machines at a later time (¶0004 of Gresch).
Claim 12
Gresch further teaches that the central processor is configured to transmit the one or more predicted objects to virtual terminal 122 controlled by control unit 94, as described in ¶0029 (similar to the one or more processors taught by Honkanen) (see ¶0050, regarding the coordinate points are processed at the remote site to accumulate mapped points that can be transmitted to virtual terminal 122 of the machine for early indication of hazard sites, such as stones).
Claim 13
Honkanen further discloses that the one or more processors are configured to provide operational instructions to one or more systems of the work machine (see ¶0076, regarding that data processing unit automatically induces a controlling response based on the determined features at the agricultural working means, which is accordingly controlled based on the controlling response).
Claims 10 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Honkanen in view of McKinnis and Gresch, and in further view of Missotten et al. (US 2017/0215330 A1), hereinafter Missotten.
Claim 22 is rejected under 35 U.S.C. 103 as being unpatentable over Honkanen in view of McKinnis, and in further view of Missotten.
Claims 10 and 22
While Honkanen discloses obtaining an enhanced map of data comprising location and attributes of the determined features (see ¶0052), Honkanen does not explicitly disclose the processor is configured to determine a predicted object based at least in part on a received field location of the work machine. However, it would have been obvious to obtain a similar map of expected features based on the location of the work machine, in light of Missotten.
Specifically, Missotten teaches an agricultural combine 2 (similar to the agricultural machine taught by Honkanen) capable of performing harvesting operations (see ¶0022) that includes a controller 14 (similar to the processor taught by Honkanen) configured to determine a predicted object (i.e., rocks) based at least in part on a received field location of the combine 2 (see ¶0037, with respect to Figure 3, regarding that residue processing system 8 determines field characteristics based on the retrieved position 3 of the combine 2 on the field 1, where the residue processing system 8 is provided on-board the combine for communication with controller 14, as described in ¶0036, with respect to Figure 2, and the position 3 of combine 2 is matched to one or more field maps to determine its position, as described in ¶0025). No steps are claimed for performing a “prediction;” therefore, the “predicted object” may be reasonably taught by the mapped locations of rocks accessed for subsequent operations (see ¶0004-0005), which are thus expected (“predicted”) locations.
Since the systems of Honkanen and Missotten are directed to the same purpose, i.e. providing field maps with associated locations of obstacles to an agricultural machine, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the processor of Honkanen to be further configured to determine a predicted object based at least in part on a received field location of the work machine, in the same manner that Missotten determines rocks on a field map based on the retrieved position of a combine, with the predictable result of utilizing field data collected over a period of time to optimize subsequent operation of an agricultural combine (¶0003-0005 of Missotten).
Claim 11
Missotten reasonably teaches that the received field location is a field boundary (see Figure 1, depicting “field boundaries,” which include zone 5 defined as a sloped part of the field in ¶0024, where the agricultural combine may be driven, as described in ¶0032). Given that Figure 1 represents an example of a field map 1 (see ¶0023) and the agricultural combine may be driven in zone 5 (see ¶0032), depicted as part of the map’s “boundary” in Figure 1, it may be reasonably gleaned that the “received field location” of the agricultural combine of Missotten may include a “field boundary.”
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Specifically, Peterson et al. (US 2011/0153168 A1) teaches the control of an implement associated with an agricultural vehicle based on a field map (see abstract) that includes obstacles such as boulders (see ¶0042), Lange et al. (US 2012/0237083 A1) teaches automatically mapping obstacle locations using images (see abstract), and Porsborg et al. (US 2025/0022382 A1) teaches the use of a machine learning mode to evaluate parameters received from an agricultural machine for the prediction of future repair or maintenance of components (see abstract).
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Sara J Lewandroski whose telephone number is (571)270-7766. The examiner can normally be reached Monday-Friday, 9 am-5 pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramya P Burgess can be reached at (571)272-6011. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SARA J LEWANDROSKI/Examiner, Art Unit 3661