DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Status of Claims
Claims 1-4, 6-14, and 16-20 are currently pending and are examined herein. Claims 5 and 15 are canceled.
Information Disclosure Statement
Three information disclosure statements were considered by the Examiner (two submitted 17 November 2025 and one submitted 15 January 2026).
Response to Amendments / Remarks
Any reference to the prior office action refers to the non-final rejection dated 2 September 2025. All rejections under 35 U.S.C. 112(b) from the prior office action are withdrawn. All rejections under 35 U.S.C. 101 from the prior office action are withdrawn.
Applicant’s arguments, filed 1 December 2025, with respect to potential rejections under 35 U.S.C. 103 of Claims 1-4, 6-14, and 16-20 under the combination of references cited in the prior office action have been considered but are moot because the new ground of rejection (necessitated by amendment) does not rely solely on the combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Joint Inventors
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim Objections
The claims are objected to because of the following informalities:
Claim 11: “each having a priority associated therewith according to a capacity to detect a plume of crop residue released by the agricultural machine” should be “each of the plurality of sensors having a priority associated therewith according to a capacity to detect a plume of crop residue released by the agricultural machine”.
Claim 11: “determining, by the computing device, one or more features of merit for the the two or more images” should be “determining, by the computing device, one or more features of merit for the [[the]] two or more images”.
Claim 11: “a spreader mechanism, a chopper, or a harvester head the agricultural machine” should be “a spreader mechanism, a chopper, or a harvester head of the agricultural machine”.
Claim 18: “two or more of” should be “two or more images of”.
Appropriate corrections are required.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-4, 6-14, and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Pub. No. 2022/0350989 (Warwick et al., hereinafter, Warwick) in view of U.S. Pub. No. 2021/0015039 (Vandike et al., hereinafter, Vandike and Gilmore), and further in view of U.S. Pub. No. 2020/0120869 (hereinafter, Vandike and Readel).
Regarding Claim 1, Warwick discloses A monitoring system for an agricultural machine (see at least [0005] and [0016]: “the present subject matter is directed to an agricultural system for monitoring surface conditions for an agricultural field. The system includes an agricultural machine configured to travel across an agricultural field, and one or more imaging devices supported relative to the agricultural machine”), comprising:
a plurality of sensors mounted to the agricultural machine generating a plurality of sensor outputs that include two or more images… (see at least [0005], [0022], FIG. 1, and FIG. 2: “the agricultural machine 10 may include one or more imaging devices 104 coupled thereto and/or supported thereon”); and
a controller (see at least [0005], [0028]-[0031], and FIG. 2: computing system 110) configured to:
receive the plurality of sensor outputs from the plurality of sensors (see at least [0005], [0028], and [0038]: “The system also includes a computing system communicatively coupled to the one or more imaging devices. The computing system is configured to receive, from the one or more imaging devices, an image of an imaged portion of the agricultural field, with the imaged portion of the agricultural field being represented by a plurality of pixels within the image.”);
determine one or more features of quality for two or more images of the plurality of sensor outputs (see at least [0005], [0017], [0022], [0032]-[0033], [0037]-[0043], [0048]-[0051], [0055]-[0058], FIG. 3, FIG. 4, and FIG. 5: “The computing system is further configured to identify at least one pixel-related parameter associated with the plurality of pixels within the image, determine whether at least one image quality metric for the image is satisfied based at least in part on the at least one pixel-related parameter”; “However, given that agricultural machines often operate in dirty/dusty environments and/or in low-lighting conditions, the images captured by the imaging device(s) (and/or the images generated based on the captured images) may often be of insufficient quality to accurately estimate the surface condition(s) associated with the imaged portion of the field. Thus, in accordance with aspects of the present subject matter, the disclosed systems and methods allow for the quality of the images to be automatically evaluated or assessed such that only images of a given quality are used to estimate the relevant surface condition(s). For instance, as will be described below, each candidate image for estimating one or more surface conditions may be analyzed in view of one or more quality metrics to assess the quality of the image. Images determined to be of “low-quality” may be disregarded (e.g., by being discarded or simply ignored) when determining the relevant surface condition(s) within the field. In addition, an operator notification may be generated when it is determined that “low-quality” images are being captured.”; “image analysis module 118 may also be configured to assess the quality of the images deriving from the imaging device(s) 104 such that only images of a given quality are used to estimate the relevant surface condition(s). 
For instance, in several embodiments, each input image (e.g., an original 2-D image captured by the cameras 106, 108 and/or a depth image generated based on a pair of 2-D images) may be analyzed in view of one or more quality metrics to assess the quality of the image. Images determined to be of “low-quality” may then be disregard for purposes of determining the relevant surface condition(s) within the field”);
determine one or more features of merit for the two or more images of the plurality of sensor outputs, the one or more features of merit corresponding to operation of the agricultural machine and having one or more confidence scores associated therewith (see at least [0005], [0016], [0028], [0032]-[0033], [0040], [0051], [0058]-[0061], and [0065]-[0067]: “The computing system is configured to…estimate a surface condition associated with the agricultural field based at least in part on the image when it is determined that the at least one image quality metric is satisfied”; “The images can then be analyzed to estimate one or more surface conditions, such as one or more conditions relating to crop residue (e.g., percent residue coverage, residue size, residue bunches, residue mat height, etc.), soil clods (e.g., clod size/volume, clod count, etc.) and/or surface irregularities (e.g., surface roughness, levelness, including the detection of ridges and/or valleys), and/or the like within the field.”; “images deemed “acceptable” may be used by the computing system 110 to estimate a surface condition(s) associated with the portion of the field depicted within each of such images”; the confidence score of “acceptable” is associated with the surface condition);
select a sensor output from the two or more images of the plurality of sensor outputs as a selected sensor output corresponding to a selected image selected…according to the one or more features of quality for the two or more images of the plurality of sensor outputs (see at least [0033] and [0037]: “Images that are identified as having sufficient quality (e.g., above a given quality threshold(s)) will be classified as acceptable images for detecting the surface condition(s) of the imaged portion(s) of the field while images that are identified as having insufficient quality (e.g., below a given quality threshold(s)) will be classified as unacceptable images and disregarded for purposes of surface condition detection (e.g., by discarding or simply ignoring the image(s) when estimating a given surface condition(s)). Such a quality-based evaluation can ensure that only higher quality images are used for detecting surface conditions, thereby increasing the reliability and accuracy of the estimated surface conditions.”); and
control the operation of the agricultural machine according to the one or more features of merit for the selected image corresponding to the selected sensor output by controlling a speed, a position, or a direction of a spreader mechanism, a chopper, or a harvester head (see at least [0017]-[0018], [0020]-[0021], [0034], [0037], and [0045]: “Images determined to be of “low-quality” may be disregarded (e.g., by being discarded or simply ignored) when determining the relevant surface condition(s) within the field”; “the control module 120 may be configured to control the operation of the agricultural machine 10 based on the monitored surface condition(s) of the field”; “the control module 120 may be configured to adjust the operating parameters (e.g., penetration depth, down force/pressure, etc.) associated with one or more of the ground-engaging tools 126 of the implement 14 (e.g., the disc blades 30, shanks 32, leveling blades 34, and/or basket assemblies 36) to proactively or reactively adjust the operation of the implement 14 in view of the monitored surface condition(s).”; “For example, when the agricultural machine includes a tillage implement configured to perform a tillage operation within the field (e.g., the implement 14 shown in FIG. 1), real-time or on-the-fly adjustments may be made to the tillage parameters associated with the ground-engaging tools of the implement, such as by adjusting the penetration depth, down pressure/force, angle-of-attack, and/or the like of one or more of the ground-engaging tools.”; the tillage implement is a chopper/spreader mechanism and adjusting the angle-of-attack is adjusting a position / direction).
Warwick does not explicitly disclose the plurality of sensors each having a priority associated therewith according to a capacity to detect a plume of crop residue released by the agricultural machine and select a sensor output from the two or more images of the plurality of sensor outputs as a selected sensor output corresponding to a selected image selected according to two or more of the priorities of the plurality of sensors.
Vandike and Gilmore, in the same field of controls for agricultural vehicles, and therefore analogous art, teach the plurality of sensors each having a priority associated therewith according to a capacity to detect …crop residue… and select a sensor output from the two or more images of the plurality of sensor outputs as a selected sensor output corresponding to a selected image selected according to two or more of the priorities of the plurality of sensors (see at least [0063]: “In some implementations, the values for different parameters associated with the crop residue may be derived from an aggregation of values derived from images from multiple cameras 424. In other words, the values may be generated using the images from all of cameras 424 or more than one camera 424. In such an implementation, controller 450 may receive input from an operator indicating which cameras 424 are to provide the images that are used for deriving the crop residue parameter values. In one implementation, controller 450 may further receive input from an operator assigning a predefined weight to each camera, the degree to which the values derived from each camera contribute to the overall value for a particular crop residue parameter. For example, for a first parameter, the predefined weights assigned to the different cameras may result in the individual values from camera 424-3 having a larger impact or effect on the aggregate value for the first parameter as compared to the individual values from a different camera, such as camera 424-2.”).
It would have been obvious, before the effective filing date of the invention, with a reasonable expectation of success, to one having ordinary skill in the art, to combine the teachings of Warwick with the teachings of Vandike and Gilmore so that an operator can indicate which cameras should be prioritized in the processing for the purpose of obtaining better results (see at least Vandike and Gilmore [0063]).
Vandike and Gilmore do not explicitly teach a capacity to detect …crop residue… is a capacity to detect a plume of crop residue released by the agricultural machine.
Vandike and Readel, in the same field of agricultural vehicles, and therefore analogous art, teach a capacity to detect …crop residue… is a capacity to detect a plume of crop residue released by the agricultural machine (see at least [0053] and FIG. 2: “The cameras 142, 144, 146, 150, 152 are digital cameras that produce digital signals representing images of the plume of residue within their fields of view”).
It would have been obvious, before the effective filing date of the invention, with a reasonable expectation of success, to one having ordinary skill in the art, to combine the Warwick / Vandike and Gilmore combination with the teachings of Vandike and Readel because it is a simple substitution for the features being detected / analyzed to become the plume-related conditions of Vandike and Readel; the motivation to combine is “to monitor the spread of the MOG behind the agricultural harvester and to control fans, vanes, and other steering devices to ensure that the MOG is properly distributed on the ground behind the combine” since “A common problem when spreading MOG (such as straw) behind the agricultural harvester is accurately monitoring the spread of the MOG, particularly when there are strong prevailing winds. The MOG is thrown to the rear of the vehicle and the wind carries it from side to side” (see at least Vandike and Readel [0005]-[0006]).
Regarding Claim 2, the Warwick / Vandike and Gilmore / Vandike and Readel combination teaches all the limitations of Claim 1. Furthermore, Warwick further discloses wherein the one or more features of quality for two or more images of the plurality of sensor outputs includes a fraction of the two or more images of the plurality of sensor outputs obscured by an obscurant (see at least [0047]-[0051] and FIG. 4: “the control logic 300 includes calculating a percent completeness of the depth image”; “in instances in which the view of one of the cameras 106, 108 is obscured with reference to a given pixel location within the image (e.g., due to dirt on the lens, airborne dust, poor lighting, out-of-view features, etc.) and/or in instances in which an imaged area includes insufficient features to perform stereo matching, one or more pixels (or, more likely, one or more areas of pixels) will not include depth information associated therewith.”).
Regarding Claim 3, the Warwick / Vandike and Gilmore / Vandike and Readel combination teaches all the limitations of Claim 1. Furthermore, Warwick further discloses wherein the one or more features of quality for the two or more images of the plurality of sensor outputs includes a characterization of lighting represented in the two or more images of the plurality of sensor outputs (see at least [0047], [0049], and [0059]-[0060]: “Specifically, images captured under low-lighting conditions typically exhibit a lower standard deviation. Thus, in several embodiments, the standard deviation threshold may correspond to a minimum standard deviation threshold at or above which the image will be considered satisfactory.”).
Regarding Claim 4, the Warwick / Vandike and Gilmore / Vandike and Readel combination teaches all the limitations of Claim 1. Furthermore, Vandike and Readel further teaches (with the same motivation to combine as Claim 1) wherein the one or more features of merit characterize the plume of crop residue as represented in the two or more images of the plurality of sensor outputs (see at least [0052]-[0053]: “The cameras 142, 144, 146, 150, 152 are digital cameras that produce digital signals representing images of the plume of residue within their fields of view. The cameras are coupled to one of more networked electronic control units (represented in FIG. 3 as ECU 302). The networked electronic control units 302 are configured to receive images of the plume of residue from each of the cameras and to transmit those images to a display unit 304 disposed in the operator cabin of the vehicle. The networked electronic control units 302 are also configured to extract physical characteristics of the plume of residue from the images.”); and
wherein the one or more features of merit include at least one of a width, a distribution, or a direction of the plume of crop residue (see at least [0052]-[0053]: “These characteristics include the width of the plume and where the plume is falling on the ground behind the agricultural harvester 100—e.g. how far to the right and/or how far to the left of the agricultural harvester 100.”; “With this information, other ECUs and actuators can steer the spreader mechanism 126 to direct the plume of residue across the ground in a more even distribution.”; “Collectively, the cameras, networked ECUs and display are a residue spread monitoring system 300.”).
Regarding Claim 6, the Warwick / Vandike and Gilmore / Vandike and Readel combination teaches all the limitations of Claim 1. Furthermore, Warwick further discloses wherein the controller is configured to control operation of the agricultural machine according to the one or more features of merit by controlling a display device (see at least [0035], [0044], and [0047]: “Such actions may also include generating a notification for display to an operator (e.g., via the associated user interface 130) that provides information associated with the estimated surface condition. For instance, when the estimated surface condition corresponds to a residue-related surface condition(s), the operator notification may provide information associated with percent residue coverage, residue length, residue bunches (e.g., the location, number, and/or height of any detected residue bunches), and/or the like. Similarly, when the estimated surface condition corresponds to a clod-related surface condition(s), the operator notification may provide information associated with the clod locations, the number of clods, the size of the clods (e.g., length, width, height, and/or volume), and/or the like. As yet another example, when the estimated surface condition corresponds to a surface condition(s) associated with surface irregularities, the operator notification may provide information associated with the degree of soil roughness/levelness, the location, number and/or size of surface irregularities (e.g., ridges and valleys), and/or the like.”).
Regarding Claim 7, the Warwick / Vandike and Gilmore / Vandike and Readel combination teaches all the limitations of Claim 1. Furthermore, Warwick further discloses wherein the controller is further configured to select the sensor output as the selected sensor output additionally based on the one or more confidence scores for two or more images of the plurality of sensor outputs (see at least [0061]: when the image confidence score is “acceptable”, the image is used for surface condition detection).
Regarding Claim 8, the Warwick / Vandike and Gilmore / Vandike and Readel combination teaches all the limitations of Claim 7. Furthermore, Warwick further discloses wherein the controller is configured to select the sensor output from the two or more images of the plurality of sensor outputs as the selected sensor output corresponding to the selected image further selected according to the one or more confidence scores for the two or more images of the plurality of sensor outputs (see at least [0061]: when the image confidence score is “acceptable”, the image is used for surface condition detection).
Regarding Claim 9, the Warwick / Vandike and Gilmore / Vandike and Readel combination teaches all the limitations of Claim 1. Furthermore, Warwick further discloses wherein the plurality of sensors comprise at least one of a light detection and ranging (LIDAR) sensor, a radio detection and ranging (RADAR) sensor, an ultrasonic sensor, an ultraviolet sensor, an infrared sensor, or a visible light camera (see at least [0025], [0040], and [0053]: the imaging devices include cameras that take RGB / color images; therefore, they are visible light cameras).
Regarding Claim 10, the Warwick / Vandike and Gilmore / Vandike and Readel combination teaches all the limitations of Claim 1. Furthermore, Warwick further discloses wherein the plurality of sensors includes two or more cameras, including at least one camera configured to detect rearward of the agricultural machine (see at least FIG. 1 and [0023]-[0025]: “a second imaging device(s) 104B may be provided at or adjacent to an aft end 40 of the implement 14 to allow the imaging device(s) 104B to capture images and related data of a section of the field disposed behind the implement 14”; “imaging device(s) 104 may correspond to a stereo camera”).
Regarding Claim 11, this claim is substantially similar to Claim 1, and the rejection of Claim 1 is referenced for those limitations. Additionally, Warwick discloses the limitation not already addressed in Claim 1: A method for monitoring an agricultural machine (see at least [0005]-[0006] and [0016]: a method for monitoring an agricultural machine is disclosed).
Regarding Claim 12, this claim is substantially similar to Claim 2 and that rejection should be referenced.
Regarding Claim 13, this claim is substantially similar to Claim 3 and that rejection should be referenced.
Regarding Claim 14, this claim is substantially similar to Claim 4 and that rejection should be referenced.
Regarding Claim 16, this claim is substantially similar to Claim 6 and that rejection should be referenced.
Regarding Claim 17, the limitations in this claim are substantially similar to limitations in Claims 1 and 7 and those rejections should be referenced.
Regarding Claim 18, this claim is substantially similar to Claim 8 and that rejection should be referenced.
Regarding Claim 19, this claim is substantially similar to Claim 9 and that rejection should be referenced.
Regarding Claim 20, the Warwick / Vandike and Gilmore / Vandike and Readel combination teaches all the limitations of Claim 11. Furthermore, Warwick further discloses wherein the plurality of sensors includes two or more cameras, including at least one camera configured to detect rearward of a spreader mechanism of the agricultural machine (see at least FIG. 1, [0018]-[0020], and [0023]-[0025]: “a second imaging device(s) 104B may be provided at or adjacent to an aft end 40 of the implement 14 to allow the imaging device(s) 104B to capture images and related data of a section of the field disposed behind the implement 14”; “imaging device(s) 104 may correspond to a stereo camera”; in FIG. 1, imaging device 104B is shown rearward of a spreader mechanism).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALEXANDRA ROBYN MORFORD whose telephone number is (571)272-6109. The examiner can normally be reached Monday - Friday 8:00 AM - 4:00 PM ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas Worden can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/A.R.M./Examiner, Art Unit 3658
/JASON HOLLOWAY/Primary Examiner, Art Unit 3658