Prosecution Insights
Last updated: April 19, 2026
Application No. 18/374,550

COMPUTER VISION SYSTEMS AND METHODS FOR AN AGRICULTURAL HEADER

Status: Non-Final Office Action (§101, §102, §103)
Filed: Sep 28, 2023
Examiner: RHEE, ROY B
Art Unit: 3664
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: CNH Industrial America LLC
OA Round: 1 (Non-Final)
Grant Probability: 68% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 3m
Grant Probability With Interview: 92%

Examiner Intelligence

Career Allow Rate: 68%, above average (98 granted / 143 resolved; +16.5% vs Tech Center average)
Interview Lift: +24.0%, a strong lift (allow rate among resolved cases with vs. without an examiner interview)
Typical Timeline: 3y 3m average prosecution; 38 applications currently pending
Career History: 181 total applications across all art units
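The headline figures reconcile with simple arithmetic. A short sketch, assuming the allow rate is granted/resolved and the with-interview figure is the base rate plus the reported lift; the page does not state its exact model, so the additive blend is an assumption:

# Sketch of the arithmetic behind the headline stats. The additive
# interview adjustment is an assumption; the page does not state its model.
granted, resolved, pending = 98, 143, 38

allow_rate = granted / resolved            # 0.685, shown as 68%
assert resolved + pending == 181           # matches the career total above

interview_lift = 0.240                     # reported lift with an interview
with_interview = allow_rate + interview_lift
print(f"{allow_rate:.1%} base, {with_interview:.1%} with interview")
# -> 68.5% base, 92.5% with interview (the page rounds to 68% and 92%)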

Statute-Specific Performance

§101: 10.8% (-29.2% vs TC avg)
§102: 19.4% (-20.6% vs TC avg)
§103: 45.7% (+5.7% vs TC avg)
§112: 23.3% (-16.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 143 resolved cases.
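Each delta reads as the examiner's rate minus the Tech Center estimate. A quick back-solve over the four rows (arithmetic only, using nothing beyond the figures shown) suggests the tool applies one flat TC baseline:

# The deltas read as examiner rate minus the Tech Center estimate.
# Back-solving all four rows recovers the same flat 40.0% TC baseline,
# consistent with the "estimate" caveat above.
rates  = {"101": 10.8, "102": 19.4, "103": 45.7, "112": 23.3}
deltas = {"101": -29.2, "102": -20.6, "103": 5.7, "112": -16.7}

for statute, rate in rates.items():
    print(f"§{statute}: implied TC average = {rate - deltas[statute]:.1f}%")
# -> 40.0% in every row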

Office Action

Grounds of rejection: §101, §102, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-15 and 17-18 are rejected under 35 U.S.C. 101 because the claimed inventions are directed to one or more abstract ideas without significantly more.

Claim 1 recites an impact detection system for a header of an agricultural system, which is in the machine category of the four statutory categories. The claim as drafted is a machine that, under its broadest reasonable interpretation, covers the manipulation or control of data along with concepts and/or mental processes that can practically be performed in the human mind. In other words, the claim may be described as the recitation of an abstract idea in conjunction with insignificant extra-solution activity. The claim covers performance of the recited limitations in the mind but for the recitation of a number of generic components. That is, other than the recitation of various generic components, nothing in the claim precludes the selection of one or more operating parameters from being practically performed in the human mind.

But for the recitation of a first camera, the step of configuring the first camera to capture imagery of at least one row unit of the header may be interpreted as mere data gathering corresponding to the transmission or reception of an image transmitted by a camera, for example, which corresponds to insignificant extra-solution activity in the form of pre-solution activity (see MPEP 2106.05(g)) in which the pre-solution activity is incidental to the primary product and corresponds to merely a nominal or tangential addition to the claim.

But for the recitation of a controller, the step of identifying a portion of a crop in the imagery may be performed in the human mind. The foregoing step is equivalent to a person identifying a portion of a crop in the image received from the camera. The step of determining a location of initial contact of the portion of the crop at the header based on the imagery may also be performed in the human mind. The foregoing step corresponds to the person determining the location on the header at which the portion contacts the header based on what is seen in the image. The mere nominal recitation of generic components, such as the first camera and the controller, does not take the claim limitations out of the mental processes grouping. The claim limitations do not require any particular level of accuracy or precision, so nothing in the claim elements precludes the recited steps from practically being performed in the mind.

This judicial exception is not integrated into a practical application because each of the limitations is recited at a high level of generality. Nothing is implemented to technologically improve the functionality of what is recited in claim 1. The claim does not recite additional elements that are sufficient to amount to significantly more, and the limitations of the claim do not integrate the abstract idea into a practical application.
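The analysis above tracks the MPEP 2106 framework: Step 1 (statutory category), Step 2A Prong 1 (does the claim recite a judicial exception?), Step 2A Prong 2 (is the exception integrated into a practical application?), and Step 2B (inventive concept). A minimal sketch of that decision flow, with hypothetical boolean inputs standing in for the legal findings; this is a schematic of the test's structure only, not any Office tool:

# Illustrative only: the MPEP 2106 eligibility flow over hypothetical findings.
def eligible_under_101(statutory_category: bool,
                       recites_judicial_exception: bool,
                       integrates_practical_application: bool,
                       adds_significantly_more: bool) -> bool:
    if not statutory_category:
        return False                  # fails Step 1 outright
    if not recites_judicial_exception:
        return True                   # eligible at Step 2A, Prong 1
    if integrates_practical_application:
        return True                   # eligible at Step 2A, Prong 2
    return adds_significantly_more    # Step 2B decides

# The findings stated above for claim 1: machine (yes), mental process (yes),
# no practical application, no inventive concept:
print(eligible_under_101(True, True, False, False))  # -> False (rejected)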
In summary, with respect to the subject matter eligibility test (see MPEP 2106), independent claim 1 falls within one of the four statutory categories of invention, which satisfies STEP 1 (i.e., a machine). Claim 1 covers performance of one or more limitations in the human mind, which constitutes a mental process, such as an observation, evaluation, and/or visualization, for example. Accordingly, the claim recites at least one abstract idea, which satisfies STEP 2A (Prong 1). Claim 1 does not recite additional elements that integrate the judicial exception into a practical application, which does not satisfy STEP 2A (Prong 2). Furthermore, with regard to STEP 2B, the recitation of insignificant extra-solution activity corresponds to well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality relative to the judicial exception, which is indicative that an inventive concept is not present. Since claim 1, under its broadest reasonable interpretation, recites limitations of a mental process without integrating the limitations into a practical application and does not amount to significantly more, it is ineligible subject matter under 35 U.S.C. 101.

Claims 2-13 are also rejected as ineligible subject matter under 35 U.S.C. 101 because these claims fall into the mental processes grouping: each of them depends on independent claim 1, and the additional limitations recited in each of these claims do not integrate the abstract idea into a practical application.

Claim 14 recites a header of an agricultural system, which is in the machine category of the four statutory categories. The claim as drafted is a machine that, under its broadest reasonable interpretation, covers the manipulation or control of data along with concepts and/or mental processes that can practically be performed in the human mind. Independent claim 14 performs the same steps recited in independent claim 1. The same argument as stated above for claim 1 applies to independent claim 14 because claim 14 covers similar mental functions and insignificant extra-solution activities, which do not integrate an abstract idea into a practical application and do not amount to significantly more.

Claims 15 and 17-18 are also rejected as ineligible subject matter under 35 U.S.C. 101 because these claims fall into the mental processes grouping: each of them depends on independent claim 14, and the additional limitations recited in each of these claims do not integrate the abstract idea into a practical application.

Examiner notes that while each of claims 6 and 16 recites one or more control signals to adjust the header, the foregoing limitation was written in the alternative for claim 6. As a consequence, claim 6 was also rejected as ineligible subject matter under 35 U.S.C. 101. Examiner further notes that an output comprising one or more alarms corresponds to a form of insignificant extra-solution activity because an output comprising one or more alarms is mere data transmission for alerting a person.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-12 and 14-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Yanke et al. (US 12,022,772).

Regarding claim 1, Yanke teaches an impact detection system for a header of an agricultural system, the impact detection system comprising: (see Yanke at col. 1 lines 34-37, which discloses that a second aspect of the present disclosure is directed to an apparatus for controlling an agricultural header based on movement of crop material at the agricultural header during harvesting; Yanke at col. 1 lines 26-29, which discloses that the method may include analyzing one or more images containing at least a portion of the agricultural header to detect crop material present in the one or more images; see Yanke at col. 4 lines 48-50, which discloses that, in some implementations, the corn header 108 includes impact sensors that detect a force or sound of EHP interacting (e.g., impacting) with the header 108. Examiner maps the apparatus for controlling an agricultural header to the impact detection system for a header of an agricultural system.);

a first camera coupled to the agricultural system and configured to capture imagery of at least one row unit of the header (see Yanke at col. 4 lines 41-44, which discloses that the harvester control system 112 includes one or more sensors 114 that sense the presence of a crop material, such as EHP (ears, heads, or pods), relative to the header 108, and that, in some implementations, the region sensors 114 are image sensors that capture images; see Yanke at col. 6 lines 11-13, which discloses that region sensors, such as region sensors 114 or 206, include image sensors, such as a camera (e.g., mono camera and stereo camera); also see Yanke at col. 5 lines 24-26, which discloses that one or more of the sensors 114 captures images of crop material moving through the row units. Examiner notes that capturing images of crop material moving through the row units corresponds to capturing imagery of at least one row unit of the header.);

and a controller configured to utilize computer vision techniques to: identify a portion of a crop in the imagery; and determine a location of initial contact of the portion of the crop at the header based on the imagery (see Yanke at col. 10 lines 1-8, which discloses that image analysis techniques may be employed to detect the presence of crop material within an image and movement of the crop material between images, and that, further, in some implementations, classification approaches using machine learning algorithms are also used to identify features, such as different types of crop material or features of a header, and movement of detected objects between images; see Yanke at col. 10 lines 12-19, which discloses that, in some implementations, neural networks, including neural networks using deep learning, may also be used to identify and classify crop material present in the image data, that example neural networks include perceptron neural networks, feed-forward neural networks, convolutional neural networks, recurrent neural networks, and autoencoders, to name only a few, and that other types of neural networks are also within the scope of the present disclosure. Examiner maps the use of neural networks with deep learning to identify and classify crop material present in image data to utilizing computer vision techniques to identify a portion of a crop in the imagery. Also, see Yanke at col. 10 lines 20-32, which discloses that the image analyzer identifies a location of the identified crop material within the images, and that, for example, the image analyzer is operable to detect whether the identified crop material is attached to a crop plant; located on a surface, such as a surface of a header; in the air; or on the ground. Examiner notes that the identifying of a location of the identified crop material by the image analyzer, such as at a surface of a header, corresponds to determining a location of initial contact of the portion of the crop at the header based on the imagery.)

Regarding claim 2, Yanke teaches the impact detection system of claim 1, wherein the portion of the crop comprises an ear of corn (see Yanke at col. 3 lines 62-65, which discloses that the present disclosure describes detecting the presence or movement of crop material, such as a crop material representing grain (e.g., ears, heads, or pods of crops ("EHP")); see Yanke at col. 10 lines 40-42, which discloses that the image analyzer has identified an ear of corn.)

Regarding claim 3, Yanke teaches the impact detection system of claim 1, wherein the first camera is coupled to the header or to a cab portion of the agricultural system (see Yanke at col. 4 lines 41-44, which discloses that the harvester control system 112 includes one or more sensors 114 that sense the presence of a crop material, such as EHP (ears, heads, or pods), relative to the header 108, and that, in some implementations, the region sensors 114 are image sensors that capture images; also see Yanke at col. 4 lines 53-55, which discloses that the sensor includes, for example, an optical sensor (e.g., camera, a stereo camera). Examiner notes that image sensors which sense the presence of crop material relative to the header correspond to the first camera coupled to the header; also, see Yanke at col. 6 lines 5-7, which discloses that a region sensor 206 may be located on a header, such as header 108 shown in Fig. 1. Alternatively, the region sensor or camera being located on a header corresponds to a camera coupled to the header. Examiner has shown a teaching based on a broadest reasonable interpretation of the claimed language.)
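As context for the claim 1 mapping above, the sketch below shows the kind of controller pipeline both the claim and the cited Yanke passages describe: detect crop portions in each frame and record where each one first touches the header. All names (Detection, detect_crops) and the tracking heuristic are invented for illustration; nothing here reproduces the application or Yanke.

# Hypothetical sketch of the claimed controller pipeline: identify crop
# portions in camera imagery and report where each first contacts the header.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str             # e.g., "ear_of_corn"
    x: float               # image coordinates of the detected crop portion
    y: float
    touching_header: bool  # whether it is seen contacting the header surface

def detect_crops(frame) -> list[Detection]:
    # Placeholder: a real system would run a trained detector (e.g., a CNN).
    return []

def initial_contact_locations(frames) -> list[tuple[float, float]]:
    """First image location at which each crop portion touches the header."""
    contacts, seen = [], set()
    for frame in frames:
        for det in detect_crops(frame):
            key = (det.label, round(det.x / 50))  # crude track-identity bin
            if det.touching_header and key not in seen:
                seen.add(key)
                contacts.append((det.x, det.y))
    return contacts

detect_crops() is deliberately a stub; the point is the shape of the loop, i.e., per-frame detection plus a first-contact latch per tracked crop portion.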
Regarding claim 4, Yanke teaches the impact detection system of claim 1, comprising a second camera coupled to the agricultural system and configured to capture additional imagery of at least one additional row unit of the header, wherein the controller is configured to utilize computer vision techniques to: identify an additional portion of an additional crop in the additional imagery; and determine an additional location of respective initial contact of the additional portion of the additional crop at the header based on the additional imagery (see Yanke at col. 5 lines 24-28, which discloses that, for example, in some instances, one or more of the sensors 114 captures images of crop material moving through the row units 110, within a trough of a cross-auger of the corn header 108, or at one or more locations contained within the confines of the corn header 108; see Yanke at col. 20 lines 36-40, which also discloses that the performance of each row unit is monitored and each row unit is adjustable independently based on the monitored performance. Examiner maps another of the one or more sensors to the second camera. Examiner maps another of the row units to the at least one additional row unit of the header.)

Regarding claim 5, Yanke teaches the impact detection system of claim 1, wherein the controller is configured to utilize computer vision techniques to: identify additional portions of additional crops in the imagery over a period of time; and determine additional locations of respective initial contacts of the additional portions of the additional crops at the header based on the imagery over the period of time (see Yanke at col. 10 lines 1-12, for example, which discloses that image analysis techniques may be employed to detect the presence of crop material within an image and movement of the crop material between images, and that, in some implementations, classification approaches using machine learning algorithms are also used to identify features, such as different types of crop material or features of a header, and movement of detected objects between images. Examiner notes that detecting movement of detected objects occurs over a period of time.); and calculate a combined impact location based on an average or a median of the location of the initial contact of the portion of the crop at the header and the additional locations of the respective initial contacts of the additional portions of the additional crops at the header (see Yanke at col. 10 lines 20-38, which discloses:

Additionally, the image analyzer identifies a location of the identified crop material within the images. For example, the image analyzer is operable to detect whether the identified crop material is attached to a crop plant; located on a surface, such as a surface of a header; in the air; or on the ground. Further, the image analyzer is operable to determine whether the crop material was present on the ground prior to harvesting or traveled to the ground as a result of harvesting. This functionality is described in greater detail below. The image analyzer also determines a position (e.g., a position vector), movement (e.g., a movement vector), or both of the crop material within an image, between multiple images, or a combination thereof. A movement vector (an example of which is shown in FIG. 18) encompasses a speed and direction of movement and is determinable, for example, within an image using, for example, motion blur of an object; between multiple images using, for example, a change in position between images; or using a combination of these techniques.

Also, see Yanke at Fig. 18, which depicts movement of the vector as a function of times T1 and T2. Examiner notes that using a movement vector that encompasses a speed and direction of movement of the crop material while encompassing change in position between images over time corresponds to calculating a combined impact location based on an average or a median of the location of the initial contact of the portion of the crop at the header and the additional locations of the respective initial contacts of the additional portions of the additional crops at the header. Examiner has shown a teaching based on a broadest reasonable interpretation in light of what is written in the specification.)

Regarding claim 6, Yanke teaches the impact detection system of claim 5, wherein the controller is configured to provide an appropriate output in response to the combined impact location being outside of a target impact region, and the appropriate output comprises one or more alarms, one or more control signals to adjust the header, or both (see Yanke at Fig. 3 element 312, which discloses adjusting a setting of the header when the measured distribution data fails to satisfy the target distribution data; see Yanke at col. 17 lines 55-67, which discloses that, at 312, one or more settings of a header are altered when a measured distribution value of the measured distribution data does not satisfy criteria contained in the target distribution data; that, for example, when a measured distribution value for a parameter meets or exceeds the defined threshold, a change is applied to a component or system of the header or agricultural vehicle coupled to the header; and, thus, when a measured distribution for a parameter contained in the measured distribution data fails to satisfy the corresponding criteria in the target distribution data, the controller, such as controller 200, generates a signal, for example, to cause a change in position of an actuator to alter a setting of the header.)

Regarding claim 7, Yanke teaches the impact detection system of claim 1, comprising a sensor configured to generate a signal indicative of vibrations due to the initial contact of the portion of the crop at the header (see Yanke at col. 4, which discloses that the harvester control system 112 is a computer-implemented device that receives information, such as in the form of sensor data, and that, in some implementations, the corn header 108 includes impact sensors that detect a force or sound of EHP interacting (e.g., impacting) with the header 108. Examiner maps sensor data to a signal generated by the recited sensor. Examiner maps force to vibration. Examiner notes that impact sensors that detect a force correspond to a sensor configured to generate a signal indicative of vibrations due to the initial contact of the portion of the crop at the header.)
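Stepping back to claim 5: the "combined impact location" is plain arithmetic over the per-crop contact locations. A sketch with invented lateral positions (cm across the header) also shows why a claim might recite both an average and a median; the median shrugs off a single stray impact.

# Illustrative arithmetic for claim 5's "combined impact location":
# aggregate per-crop initial-contact positions by average or median.
# Positions are invented lateral offsets (cm) across the header.
from statistics import mean, median

contact_locations = [12.0, 15.5, 11.2, 44.0, 13.1]  # one stray impact at 44.0

print(mean(contact_locations))    # 19.16, pulled toward the outlier
print(median(contact_locations))  # 13.1, robust to the outlier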
Regarding claim 8, Yanke teaches the impact detection system of claim 7, wherein the controller is configured to: determine a time of the initial contact of the portion of the crop at the header based on the signal; identify one or more frames in the imagery that correspond to the time; and utilize the computer vision techniques to: identify the portion of the crop in the one or more frames in the imagery; and determine the location of the initial contact of the portion of the crop at the header based on the one or more frames in the imagery (see Yanke at col. 9 lines 35-54 in conjunction with Figs. 4-7, which disclose that FIGS. 4 through 7 are a series of images 400, 500, 600, and 700, respectively, taken from a region sensor that is directed towards a region above and forward of a header 402, and that the series of images are consecutive images arranged in time order, i.e., from the earliest time shown in FIG. 4 to the latest time shown in FIG. 7. Further, see Yanke at col. 10 lines 20-32, which discloses that the image analyzer identifies a location of the identified crop material within the images, and that, for example, the image analyzer is operable to detect whether the identified crop material is attached to a crop plant; located on a surface, such as a surface of a header; in the air; or on the ground. Examiner notes that the identification of crop material located on a surface, such as a surface of a header, within the series of timed images, corresponds to the limitations recited in claim 8.)

Regarding claim 9, Yanke teaches the impact detection system of claim 7, wherein the computer vision techniques comprise machine learning algorithms (see Yanke at col. 10 lines 1-38, which discloses that image analysis techniques may be employed to detect the presence of crop material within an image and movement of the crop material between images, and that, further, in some implementations, classification approaches using machine learning algorithms are also used to identify features, such as different types of crop material or features of a header, and movement of detected objects between images. Examiner maps image analysis techniques to computer vision techniques. Yanke at col. 10 lines 1-38 further discloses that machine learning algorithms include, but are not limited to, supervised learning algorithms, unsupervised learning algorithms, semi-supervised learning algorithms, and reinforcement learning algorithms.), and the signal is utilized as feedback training data to train the machine learning algorithms to determine the location of the initial contact of the portion of the crop at the header based on the one or more frames in the imagery (see Yanke at col. 4 lines 37-40, which discloses that the harvester control system 112 is a computer-implemented device that receives information, such as in the form of sensor data; see Yanke at col. 10 lines 1-38, which discloses that, in some implementations, neural networks, including neural networks using deep learning, may also be used to identify and classify crop material present in the image data, and that example neural networks include perceptron neural networks, feed-forward neural networks, convolutional neural networks, recurrent neural networks, and autoencoders, to name only a few, and that other types of neural networks are also within the scope of the present disclosure. Examiner notes that the use of deep learning in neural networks fundamentally relies on feedback training data, such as sensor data provided by image sensors, for example. Further, see Yanke at col. 10 lines 1-38, which discloses that the image analyzer identifies a location of the identified crop material within the images, and that, for example, the image analyzer is operable to detect whether the identified crop material is attached to a crop plant; located on a surface, such as a surface of a header; in the air; or on the ground. Examiner maps crop material located on a surface such as a surface of a header to the location of the initial contact of the portion of the crop at the header.)
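The claim 8 logic mapped above reduces to a timestamp lookup: the vibration signal fixes the moment of impact, and the frames nearest that moment are the ones the vision step searches. A minimal sketch under invented assumptions (uniform 30 fps capture, a simple amplitude threshold; all names are hypothetical):

# Hypothetical sketch of claim 8's sensor-to-camera synchronization: use the
# vibration signal to timestamp the impact, then select the nearest frames.
FPS = 30.0  # assumed camera frame rate

def impact_time(vibration, sample_rate_hz=1000.0, threshold=2.5):
    """Return the time (s) of the first sample exceeding the threshold."""
    for i, amplitude in enumerate(vibration):
        if abs(amplitude) > threshold:
            return i / sample_rate_hz
    return None

def frames_near(t_impact, window=1):
    """Indices of the frame at t_impact plus `window` frames either side."""
    center = round(t_impact * FPS)
    return list(range(max(0, center - window), center + window + 1))

samples = [0.1, -0.2, 0.3, 5.8, 1.1]  # invented accelerometer trace
t = impact_time(samples)              # -> 0.003 s (sample index 3)
print(frames_near(t))                 # -> [0, 1] at 30 fps

Only the frames returned by frames_near() need the more expensive vision step, which is the efficiency rationale the claim structure suggests.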
Regarding claim 10, Yanke teaches the impact detection system of claim 1, wherein the computer vision techniques comprise machine learning algorithms (see Yanke at col. 10 lines 1-38, which discloses that image analysis techniques may be employed to detect the presence of crop material within an image and movement of the crop material between images, and that, further, in some implementations, classification approaches using machine learning algorithms are also used to identify features, such as different types of crop material or features of a header, and movement of detected objects between images. Examiner maps image analysis techniques to computer vision techniques.)

Regarding claim 11, Yanke teaches the impact detection system of claim 1, wherein the controller is configured to utilize the computer vision techniques to identify a reference feature on the header, and to determine the location of the initial contact of the portion of the crop at the header based on the reference feature on the header (see Yanke at col. 10 lines 1-38, which discloses that image analysis techniques may be employed to detect the presence of crop material within an image and movement of the crop material between images, and that, further, in some implementations, classification approaches using machine learning algorithms are also used to identify features, such as different types of crop material or features of a header, and movement of detected objects between images. Examiner notes that identifying features of a header teaches identifying a reference feature on the header. Also, see Yanke at col. 10 lines 20-32, which discloses that the image analyzer identifies a location of the identified crop material within the images, and that, for example, the image analyzer is operable to detect whether the identified crop material is attached to a crop plant; located on a surface, such as a surface of a header; in the air; or on the ground; see Yanke at col. 11 lines 37-53, which discloses that the image analyzer also identifies a reference location 406 corresponding to a part or feature of the header 408, and that, in this example, the reference location 406 is a static location corresponding to a discernable feature, such as a tip 410 of a row unit cover 412 of the header 408. Noting that Yanke teaches identifying a reference feature of a header, the Examiner further notes that the identifying of crop material that is located on a surface, such as a surface of a header, corresponds to determining the location of the initial contact of the portion of the crop at the header based on a reference feature on the header.)

Regarding claim 12, Yanke teaches the impact detection system of claim 11, wherein the reference feature comprises a patterned cover on a hood positioned between adjacent row units of the header (see Yanke at col. 9 lines 60-66, which discloses that the image analyzer may use one or more of the following image analysis techniques: two-dimensional (2D) object recognition, three-dimensional (3D) object recognition, image segmentation, motion detection (e.g., single particle tracking), video tracking, optical flow, 3D pose estimation, pattern recognition, and object recognition, to name a few. See Yanke at col. 11 lines 37-53, which discloses that the image analyzer also identifies a reference location 406 corresponding to a part or feature of the header 408, and that, in this example, the reference location 406 is a static location corresponding to a discernable feature, such as a tip 410 of a row unit cover 412 of the header 408. Yanke at col. 11 lines 45-53 further discloses that other marker types may be used to identify the reference location 406 in a presented image; that example markers include markers having different shapes, colors, patterns, symbols, and characters; that, in some instances, text or objects are used as a marker type to identify the reference location 406 on a display; and, still further, that text or objects with varying intensity are used as a marker type to identify the reference location 406 on a display. Examiner notes that a reference location corresponding to a part or feature of a header maps to a patterned cover on a hood positioned between adjacent row units of the header. Examiner has shown a teaching based on a broadest reasonable interpretation of the claimed language in light of the specification, at [0041], which states that FIG. 5 is a perspective side view of an embodiment of the hood 218 that may be employed within the header 200 of FIG. 2.)

Claim 14 is directed toward a header that performs the steps recited in the system of claim 1. The cited portions of the reference(s) used in the rejection of claim 1 teach the steps recited in the header of claim 14. Additionally, Yanke teaches a plurality of row units distributed across a width of the header (see Yanke at col. 4 lines 16-20 in conjunction with Fig. 1, which discloses that the combine harvester 100 includes a corn header 108 that includes a plurality of row units 110, with each row unit 110 aligning with a particular row 106 to harvest the crops contained in that row 106.)

Claim 15 is directed toward a header that performs the steps recited in the system of claim 4. The cited portions of the reference(s) used in the rejection of claim 4 teach the steps recited in the header of claim 15. Therefore, claim 15 is rejected under the same rationale used in the rejection of claim 4.

Claim 16 is directed toward a header that performs the steps recited in the system of claim 6. The cited portions of the reference(s) used in the rejection of claim 6 teach the steps recited in the header of claim 16. Therefore, claim 16 is rejected under the same rationale used in the rejection of claim 6.

Claim 17 is directed toward a header that performs the steps recited in the system of claim 7. The cited portions of the reference(s) used in the rejection of claim 7 teach the steps recited in the header of claim 17. Therefore, claim 17 is rejected under the same rationale used in the rejection of claim 7.

Claim 18 is directed toward a header that performs one or more of the steps recited in the system of claim 1. The cited portions of the reference(s) used in the rejection of claim 1 teach the one or more of the steps recited in the header of claim 18. Therefore, claim 18 is rejected under the same rationale used in the rejection of claim 1.
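The reference-feature approach of claims 11-12 amounts to reporting the contact point relative to a known landmark on the header, so the location survives camera vibration or remounting. A toy sketch, with an invented pixels-per-cm calibration and invented coordinates:

# Hypothetical sketch for claims 11-12: express the crop's contact point
# relative to a detected reference feature (e.g., a patterned hood cover),
# so the result is header-relative rather than raw pixels.
PX_PER_CM = 4.0  # assumed calibration: image pixels per cm on the header

def contact_relative_to_reference(contact_px, reference_px):
    """Return (dx_cm, dy_cm) of the contact point from the reference feature."""
    dx = (contact_px[0] - reference_px[0]) / PX_PER_CM
    dy = (contact_px[1] - reference_px[1]) / PX_PER_CM
    return dx, dy

# e.g., reference marker detected at (320, 400), contact seen at (352, 388):
print(contact_relative_to_reference((352, 388), (320, 400)))  # (8.0, -3.0)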
Claim 19 recites a method comprising: operating a header to harvest a plurality of crops as the header travels through a field (see Yanke at the Abstract, which discloses that systems and methods for controlling agricultural headers based on crop movement relative to a portion of the agricultural header are disclosed, and that, based on a position or movement or both of the crop material relative to the header, one or more parameters of the harvester header may be adjusted, for example, to reduce an amount of grain loss.);

generating, via a first camera, imagery of at least one row unit of the header (see Yanke at col. 4 lines 41-44, which discloses that the harvester control system 112 includes one or more sensors 114 that sense the presence of a crop material, such as EHP (ears, heads, or pods), relative to the header 108, and that, in some implementations, the region sensors 114 are image sensors that capture images; see Yanke at col. 6 lines 11-13, which discloses that region sensors, such as region sensors 114 or 206, include image sensors, such as a camera (e.g., mono camera and stereo camera); also see Yanke at col. 5 lines 24-26, which discloses that one or more of the sensors 114 captures images of crop material moving through the row units. Examiner notes that capturing images of crop material moving through the row units corresponds to capturing imagery of at least one row unit of the header.);

processing, via one or more processors, the imagery to identify a respective portion of each crop of the plurality of crops received at the at least one row unit of the header; and processing, via the one or more processors, the imagery to determine a respective location of initial contact between the respective portion of each crop of the plurality of crops received at the at least one row unit of the header and deck plates of the header (see Yanke at col. 1 lines 37-45, which discloses that the apparatus may include one or more processors and a non-transitory computer-readable storage medium coupled to the one or more processors and storing programming instructions for execution by the one or more processors, and that the programming instructions may instruct the one or more processors to analyze one or more images containing at least a portion of the agricultural header to detect crop material present in the one or more images and categorize the detected crop material detected in the one or more images; see Yanke at col. 10 lines 20-32, which discloses that the image analyzer identifies a location of the identified crop material within the images, and that, for example, the image analyzer is operable to detect whether the identified crop material is attached to a crop plant; located on a surface, such as a surface of a header; in the air; or on the ground. Also, see Yanke at Fig. 2 element 242, which depicts a deck plate actuator; see Yanke at col. 8 lines 39-48, which discloses that actuators 240 include a deck plate width actuator 242, a rotary actuator 244, a reciprocating actuator 246, a compressible component actuator 248, and a speed control actuator 250; see Yanke at col. 8 lines 60-62, which discloses controlling motion of a header or controlling transient movement of deck plates of a row unit, such as when the deck plates are moved in response to engagement with incoming crops, and that the deck plate width actuator 242 alters a spacing between deck plates of a stalk roll assembly of a corn header in response to a signal from the controller 200.)
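The deck-plate actuator passages cited for claim 19 imply a simple closed loop: compare the observed contact location against a target band and command an adjustment when it drifts out. The sketch below is a guess at that loop's shape; the bounds, step size, and names are invented, and Yanke's actual control logic is not reproduced here.

# Hypothetical control loop suggested by the deck-plate actuator passages:
# if the combined contact location drifts outside a target region, emit an
# adjustment signal. Region bounds and step size are invented.
TARGET_REGION_CM = (10.0, 20.0)   # acceptable contact band on the deck plates
ADJUST_STEP_CM = 0.5              # per-cycle actuator increment

def deck_plate_adjustment(combined_contact_cm):
    low, high = TARGET_REGION_CM
    if combined_contact_cm < low:
        return +ADJUST_STEP_CM    # shift the band toward the contact point
    if combined_contact_cm > high:
        return -ADJUST_STEP_CM
    return 0.0                    # inside the target region: no change

print(deck_plate_adjustment(23.4))  # -> -0.5 (out of region, so adjust)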
Regarding claim 20, Yanke teaches the method of claim 19, comprising processing, via the one or more processors and using computer vision techniques, the imagery to identify the respective portion of each crop of the plurality of crops received at the at least one row unit of the header and to determine the respective location of the initial contact between the respective portion of each crop of the plurality of crops received at the at least one row unit of the header and the deck plates of the header (see Yanke at col. 8 lines 60-62, which discloses controlling motion of a header or controlling transient movement of deck plates of a row unit, such as when the deck plates are moved in response to engagement with incoming crops, and that the deck plate width actuator 242 alters a spacing between deck plates of a stalk roll assembly of a corn header in response to a signal from the controller 200; see Yanke at col. 10 lines 12-19, which discloses that, in some implementations, neural networks, including neural networks using deep learning, may also be used to identify and classify crop material present in the image data, that example neural networks include perceptron neural networks, feed-forward neural networks, convolutional neural networks, recurrent neural networks, and autoencoders, to name only a few, and that other types of neural networks are also within the scope of the present disclosure. Examiner maps the use of neural networks with deep learning to identify and classify crop material present in image data to utilizing computer vision techniques to identify a portion of a crop in the imagery. Also, see Yanke at col. 10 lines 20-32, which discloses that the image analyzer identifies a location of the identified crop material within the images, and that, for example, the image analyzer is operable to detect whether the identified crop material is attached to a crop plant; located on a surface, such as a surface of a header; in the air; or on the ground. Examiner notes that the identifying of a location of the identified crop material by the image analyzer, such as at a surface of a header or at any location in which the image analyzer may identify a location of identified crop material within any of the images, corresponds to determining the respective location of the initial contact between the respective portion of each crop of the plurality of crops received at the at least one row unit of the header and the deck plates of the header.)

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Yanke et al. (US 12,022,772) in view of Weston et al. (US 2023/0410528).

Regarding claim 13, Yanke does not expressly disclose the impact detection system of claim 11, wherein the reference feature comprises one or more bolts, one or more grooves, a cover, or any combination thereof, which, in a related art, Weston teaches (see Weston at [0028], which discloses that, in some examples, machine-learning algorithms can be used to refine vehicle calibration over time to account for wear of the suspension components and similar effects, and that, in some examples disclosed herein, visually identifiable suspension features (e.g., spring seats, Panhard bolts, etc.) can be used as reference points.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Yanke to include wherein the reference feature comprises one or more bolts, one or more grooves, a cover, or any combination thereof, as taught by Weston. One would have been motivated to make such a modification to provide visually identifiable suspension features as reference points for use in machine-learning algorithms, as suggested by Weston at [0028].

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROY RHEE, whose telephone number is 313-446-6593. The examiner can normally be reached M-F 8:30 am to 5:30 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, Applicant may contact the Examiner via telephone or use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kito Robinson, can be reached at 571-270-3921. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. In addition, more information about Patent Center may be found at https://www.uspto.gov/patents/apply/patent-center. Should you have questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ROY RHEE/
Examiner, Art Unit 3664

Prosecution Timeline

Sep 28, 2023: Application Filed
Dec 27, 2025: Non-Final Rejection under §101, §102, §103 (current)

Precedent Cases

Applications granted by the same examiner involving similar technology

Patent 12589731: IN-VEHICLE APPARATUS (granted Mar 31, 2026; 2y 5m to grant)
Patent 12566022: DRONE SNOWMAKING AUTOMATION (granted Mar 03, 2026; 2y 5m to grant)
Patent 12559265: Off-Channel Unmanned Aerial Vehicle Remote ID Beaconing (granted Feb 24, 2026; 2y 5m to grant)
Patent 12550961: SYSTEMS AND METHODS OF A SMART HELMET (granted Feb 17, 2026; 2y 5m to grant)
Patent 12542065: UNMANNED AIRCRAFT VEHICLE STATE AWARENESS (granted Feb 03, 2026; 2y 5m to grant)
Study what changed in these cases to get past this examiner. Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 68%
With Interview: 92% (+24.0%)
Median Time to Grant: 3y 3m
PTA Risk: Low

Based on 143 resolved cases by this examiner. Grant probability derived from career allow rate.
