Prosecution Insights
Last updated: April 19, 2026
Application No. 18/617,938

SYSTEMS AND METHODS FOR IMAGE CAPTURE AND ANALYSIS OF AGRICULTURAL FIELDS

Non-Final OA: §103, §DP

Filed: Mar 27, 2024
Examiner: ABDI, AMARA
Art Unit: 2668
Tech Center: 2600 — Communications
Assignee: Climate LLC
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 7m
With Interview: 76%

Examiner Intelligence

Career Allow Rate: 83% (677 granted / 816 resolved; +21.0% vs Tech Center average, above average)
Interview Lift: -7.5% (minimal; based on resolved cases with interview)
Typical Timeline: 2y 7m average prosecution; 33 applications currently pending
Career History: 849 total applications across all art units
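How these headline figures relate is simple arithmetic. A minimal sketch in Python, assuming the page derives grant probability directly from the career allow rate and applies the interview lift in percentage points (both assumptions about this dashboard's methodology):

```python
# Career allow rate: granted / resolved (figures shown above).
granted, resolved = 677, 816
allow_rate = granted / resolved          # 0.8296... -> displayed as 83%

# Interview-adjusted probability, assuming the -7.5% lift is applied in
# percentage points (an assumption; 83% - 7.5 pts = 75.5%, shown as 76%).
interview_lift_pts = -7.5
with_interview = allow_rate * 100 + interview_lift_pts

print(f"allow rate: {allow_rate:.1%}")           # 83.0%
print(f"with interview: {with_interview:.1f}%")  # 75.5%, rounds to ~76%
```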

Statute-Specific Performance

§101: 9.8% (-30.2% vs TC avg)
§103: 60.7% (+20.7% vs TC avg)
§102: 10.2% (-29.8% vs TC avg)
§112: 10.0% (-30.0% vs TC avg)

Black line = Tech Center average estimate • Based on career data from 816 resolved cases
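The page does not document exactly what this chart measures, so treat the following as an assumption: if the percentages are each statute's share of this examiner's rejection grounds, they can be tallied from office-action records roughly like this (hypothetical data model, illustration only):

```python
from collections import Counter

# Hypothetical records: one list of statutory grounds per office action.
office_actions = [
    ["103", "DP"],
    ["103"],
    ["101", "103"],
    ["102", "112"],
]

# Share of all rejection grounds attributable to each statute.
grounds = [g for oa in office_actions for g in oa]
counts = Counter(grounds)
total = sum(counts.values())
for statute, n in counts.most_common():
    print(f"§{statute}: {n / total:.1%}")
```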

Office Action

§103, §DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 2-6 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-5 of U.S. Patent No. 10,438,343 (see comparison below).
Although the claims at issue are not identical, they are not patentably distinct from each other because: claim 2 of the instant application recites common subject matter with patented claim 1; claim 2 of the instant application, which recites the open-ended transitional phrase "comprising," does not preclude the additional elements recited by patented claim 1; and the elements of claim 2 of the instant application are fully anticipated by patented claim 1.

Instant Application vs. U.S. Patent No. 10,438,343:

Instant claim 2 (New): A computer system comprising: one or more processors in data communication with one or more sensors that are coupled to an agricultural machine configured to traverse a field; and one or more non-transitory computer-readable storage media storing sequences of program instructions which, when executed by the one or more processors, cause the one or more processors to: obtain, from a database that stores agricultural image data, a plurality of images of at least one stage of crop development, the plurality of images captured by a mobile image capture device while the mobile image capture device moves along a path of the field, the plurality of images including at least a first image of a crop captured with the mobile image capture device at a first angle of view at a first viewpoint along the path of the field and a second image of the crop captured at a second angle of view with the mobile image capture device at a second viewpoint along the path of the field; and analyze the plurality of images, to determine relevant images that indicate a change in at least one condition of the crop development.

Patented claim 1 (Currently Amended): A computer system comprising: one or more processors in data communication with one or more sensors that are coupled to an agricultural machine configured to interact with soil; one or more non-transitory computer-readable storage media storing sequences of program instructions which, when executed by the one or more processors, cause the one or more processors to: obtain, from a database that stores agricultural image data, a plurality of images of at least one stage of crop development, the plurality of images captured by a mobile image capture device while the mobile image capture device moves along a path of a field, the plurality of images including at least a first image of a crop captured with the mobile image capture device at a first angle of view of at least 90 degrees at a first viewpoint along the path of the field and a second image of the crop captured at a second angle of view of at least 90 degrees with the mobile image capture device at a second viewpoint along the path of the field; analyze the captured images, to determine relevant images that indicate a change in at least one condition of the crop development, and to generate a localized view map layer for viewing the field at the at least one stage of crop development based on at least the relevant captured images.

Instant claim 3 (New): The computer system of claim 2, wherein the instructions, when executed by the one or more processors, cause the one or more processors to generate yield data including a yield map to be displayed on a graphical user interface and to receive a user selection of a region of the yield map.

Patented claim 2 (Original): The computer system of claim 1, wherein the instructions, when executed by the one or more processors, cause the one or more processors to generate yield data including a yield map to be displayed on a graphical user interface and to receive a user selection of a region of the yield map.

Instant claim 4 (New): The computer system of claim 3, wherein the instructions, when executed by the one or more processors, cause the one or more processors to generate a localized view map layer that is geographically associated with a selected region of the yield map in response to the user selection.

Patented claim 3 (Currently Amended): The computer system of claim [[1]]2, wherein the instructions, when executed by the one or more processors, cause the one or more processors to generate a localized view map layer that is geographically associated with a selected region of the yield map in response to the user selection.

Instant claim 5 (New): The computer system of claim 2, wherein the instructions, when executed by the one or more processors, cause the one or more processors to generate a localized view map layer for viewing the field at the at least one stage of crop development based on at least the relevant images.

Patented claim 4 (Original): The computer system of claim 3, wherein the instructions, when executed by the one or more processors, cause the one or more processors to execute instructions to superimpose the localized view map layer with the yield map.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2, 5-6, 12, and 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over Mas et al. (US-PGPUB 2004/0264762) in view of Redden (US-PGPUB 2013/0238201), and further in view of Pickett et al. (US Patent 9,719,973).

In regards to claim 2, Mas discloses a computer system comprising: one or more processors in data communication with one or more sensors that are coupled to an agricultural machine configured to traverse a field (see at least: Fig. 1 and Para 0056, computer 14 [i.e., implicitly comprises one or more processors] in communication with the camera 12 [i.e., one or more sensors]; and from Para 0092, the camera 12 mounted on a vehicle 16 moving through a field [i.e., vehicle 16 configured to traverse a field]); one or more non-transitory computer-readable storage media storing sequences of program instructions which, when executed by the one or more processors, cause the one or more processors (implied by computer 14) to: obtain, from a database that stores agricultural image data, a plurality of images of at least one stage of crop development, the plurality of images captured by a mobile image capture device while the mobile image capture device moves along a path of the field (see at least: Para 0058, the computer 14 can save the stereo images and generate real-time disparity images from the stereo images captured by the camera 12. Further, Para 0060 discloses that the system 10, which comprises the camera 12, can be used in a wide variety of applications such as crop growth monitoring [i.e., camera 12 of the system 10 implicitly acquires a plurality of images of at least one stage of crop development]. Further, from Para 0092, the camera 12 mounted on a vehicle 16 moving through a field [i.e., the one or more camera(s) 12 implicitly capture a plurality of images while the vehicle 16 moves along a path of a field]).

Mas does not expressly disclose that the plurality of images include at least a first image of a crop captured with the mobile image capture device at a first angle of view at a first viewpoint along the path of the field and a second image of the crop captured at a second angle of view with the mobile image capture device at a second viewpoint along the path of the field; and analyzing the captured images to determine relevant images that indicate a change in at least one condition of the crop development.

Redden discloses a detector 240 (i.e., an image capture device or camera) attached to an automated crop thinning system 100 (see at least: Fig. 9), which includes a processor integrated into a moving vehicle (e.g., a mobile plant-thinning vehicle), for capturing a plurality of images of crop rows along the path of the field, each captured image being analyzed independently and/or in comparison to other images to account for changes in time (see at least: Figs. 9-10 and Para 0019, 0041, 0045, 0050-0051). The system for automated crop thinning 100 includes a detection mechanism 200 and an elimination mechanism 300. The detection mechanism 200 preferably directs radiation at the plant at an angle between the ground 12 (e.g., 90 degrees) and a normal vector to the ground (e.g., 0 degrees), and the detector 240, preferably arranged above the crop row 10, receives radiation reflected in a vector substantially parallel to the normal vector to the ground 12 or parallel to a gravity vector [which implicitly enables capturing a plurality of images at different angles of view at different viewpoints as the vehicle moves along the path of the field, as the detector 240 can have an adjustable field of view].

Mas and Redden are combinable because they are both concerned with agricultural crop imaging. Therefore, it would have been obvious to a person of ordinary skill in the art to modify Mas to include the detector 240, as taught by Redden, with the Mas system 10, in order to capture a plurality of images of the crop at different viewpoints as the vehicle moves along the path of the field (see at least: Para 0046, 0050-0051).

The combined teaching of Mas and Redden as a whole does not expressly disclose analyzing the captured images to determine relevant images that indicate a change in at least one condition of the crop development.

Pickett discloses analyzing the captured images to determine relevant images that indicate a change in at least one condition of the crop development (see at least: col. 10, lines 34-62, performing a comparison of a first selected set of crop data to a second selected set of crop data (step 325). When images during the subsequent passes are captured from the same location, angle, general lighting conditions, etc., a number of variations between the images can be detected by performing pixel-level comparison, which can be applied to determine the general health of an agricultural plant or a weed between the first crop image and the second crop image [i.e., analyzing the captured images, "performing pixel-level comparison based automated analysis between digitized images," to determine relevant images that indicate a change in at least one condition of the crop development, "implicit by performing color differentiation to distinguish plant matter between first crop image and second crop image"]).
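For context on the pixel-level comparison Pickett relies on, a minimal illustrative sketch follows (not from the record; function names and the threshold are hypothetical). It flags how much of a registered image pair changed between passes and compares a simple excess-green vegetation index, a common proxy for the color differentiation of plant matter:

```python
import numpy as np

def fraction_changed(img_a: np.ndarray, img_b: np.ndarray, threshold: int = 25) -> float:
    """Fraction of pixels whose absolute intensity change exceeds a threshold.

    Assumes img_a and img_b are same-size uint8 RGB images captured from the
    same location and angle, as Pickett's comparison presupposes.
    """
    diff = np.abs(img_a.astype(np.int16) - img_b.astype(np.int16)).max(axis=-1)
    return float((diff > threshold).mean())

def excess_green(img: np.ndarray) -> float:
    """Mean excess-green index (ExG = 2G - R - B) over the image."""
    r, g, b = [img[..., i].astype(np.float64) for i in range(3)]
    return float((2 * g - r - b).mean())

# Synthetic demo: a "later pass" image whose green channel has faded.
rng = np.random.default_rng(0)
pass1 = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
pass2 = pass1.copy()
pass2[..., 1] = (pass2[..., 1] * 0.8).astype(np.uint8)

print(f"changed pixels: {fraction_changed(pass1, pass2):.1%}")
print(f"ExG pass1 -> pass2: {excess_green(pass1):.1f} -> {excess_green(pass2):.1f}")
```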
Mas, Redden, and Pickett are combinable because they are all concerned with agricultural crop imaging. Therefore, it would have been obvious to a person of ordinary skill in the art to modify the combined teaching of Mas and Redden to perform color differentiation, as taught by Pickett, in order to distinguish plant matter between crop images and facilitate analyzing crop health over time (Pickett, col. 11, lines 7-8).

In regards to claim 5, the combined teaching of Mas, Redden, and Pickett as a whole discloses the limitations of claim 2. Mas further discloses generating a localized view map layer for viewing the field at the at least one stage of crop development based on at least the relevant images (see at least: Para 0007, disparity images can also be used to generate 3-dimensional maps of the agricultural field scene [i.e., generating a localized view map layer]. Further, from Para 0146, images obtained from a camera 12 mounted on a ground vehicle 38 can be used in a variety of applications, such as to estimate the volume of extensive crops like barley, wheat, alfalfa, etc., one method being to estimate the volume of an area of crops 48 [i.e., implicitly obtaining images relevant to the crop volume]; and from Para 0011, the 3D agricultural field scene maps can be utilized to track the state of development of vegetation, as well as sensing physical parameters important for production such as crop row spacing, tree height, or crop volume [i.e., the localized view map layer is generated for viewing the field at the at least one stage of crop development, "3D agricultural field scene maps can be utilized to track the state of development of vegetation," based on at least the relevant images, "implicitly based on the images obtained from a camera 12 relevant to the volume of extensive crops"]).

In regards to claim 6, the combined teaching of Mas, Redden, and Pickett as a whole discloses the limitations of claim 2. Pickett further discloses wherein the agricultural machine is configured to spray a pesticide, herbicide, and/or fungicide based on the change in at least one condition of the crop development (see at least: col. 4, lines 4-15, an applicator machine 145 may further comprise machinery for treating a crop following cultivation and plant germination, such as a sprayer for applying pesticides, implicitly based on the crop condition change, such as cultivation and plant germination).

Regarding claim 12, claim 12 recites substantially similar limitations as set forth in claim 2. As such, claim 12 is rejected for at least a similar rationale. The Examiner further acknowledges the following additional limitation: "a computer-implemented method." However, Mas discloses the computer-implemented method (Mas, see at least: Abstract and Para 0007).

Regarding claim 15, claim 15 recites substantially similar limitations as set forth in claim 5. As such, claim 15 is rejected for at least a similar rationale.

Regarding claim 16, claim 16 recites substantially similar limitations as set forth in claim 6. As such, claim 16 is rejected for at least a similar rationale.
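For readers unfamiliar with the disparity images Mas cites for claims 5 and 15, a short illustrative sketch follows (parameters are hypothetical, not Mas's actual implementation): stereo disparity converts to depth via the standard relation Z = f·B/d, and the resulting depths can be accumulated into a 3-D field-scene map.

```python
import numpy as np

def disparity_to_depth(disparity_px: np.ndarray,
                       focal_px: float = 800.0,    # hypothetical focal length (pixels)
                       baseline_m: float = 0.12    # hypothetical stereo baseline (meters)
                       ) -> np.ndarray:
    """Standard stereo relation: depth Z = f * B / d, valid where d > 0."""
    d = np.where(disparity_px > 0, disparity_px, np.nan)
    return focal_px * baseline_m / d

# Toy disparity map for one stereo frame (larger disparity = closer canopy).
disparity = np.array([[40.0, 42.0, 8.0],
                      [38.0, 41.0, 7.5]])
print(np.round(disparity_to_depth(disparity), 2))  # ~2.4 m crop vs ~12-13 m background
```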
Claims 3-4 and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Mas, Redden, and Pickett, as applied to claim 2, and further in view of Johnson (US-PGPUB 2014/0089045).

In regards to claim 3, the combined teaching of Mas, Redden, and Pickett as a whole discloses the limitations of claim 2. Furthermore, Mas discloses wherein the instructions, when executed by the one or more processors, cause the one or more processors to generate yield data including a yield map to be displayed on a graphical user interface (see at least: Para 0207, generating a 3-D agricultural field scene map. Further, Para 0061, the map is ready for a 3-dimensional display). The combined teaching of Mas, Redden, and Pickett as a whole does not expressly disclose receiving a user selection of a region of the yield map.

Johnson discloses receiving a user selection of a region of the yield map (see at least: Para 0127-0130, screen 800 provides a user with information regarding stand determination for a specific field along with additional information that may be helpful to the user. Along with the field image 820, there is a modifiable field view area 830 that contains controls that allow the user to alter the views of the field 820).

Mas, Redden, Pickett, and Johnson are combinable because they are all concerned with agricultural crop imaging. Therefore, it would have been obvious to a person of ordinary skill in the art to modify the combined teaching of Mas, Redden, and Pickett to use the modifiable field view area 830, as taught by Johnson, in order to provide the controls that allow the user to alter the views of the field (Johnson, Para 0127).

In regards to claim 4, the combined teaching of Mas, Redden, Pickett, and Johnson as a whole discloses the limitations of claim 2. Johnson further discloses generating a localized view map layer that is geographically associated with the selected region of the field map in response to the user selection (see at least: Para 0053, the graphical user interface (GUI) may be configured to receive data from the user concerning the agricultural crop. This data may relate to the agricultural fields (location, size, shape, ID, or name), planned events (planting and chemical application dates, types, and locations), …; and from Para 0120, 0127, field identifiers 810 as well as an image of the field 820 are provided. Along with the field image 820, there is a modifiable field view area 830 that contains controls that allow the user to alter the views of the field 820 [i.e., generating a localized view map layer, "the system implicitly provides the image of the field 820," that is geographically associated with the selected region of the field map in response to the user selection, "the provided image of the field 820 is implicitly associated with the user's preference selection parameters for receiving alerts, comprising locations in agricultural fields and the planned events"]).

Regarding claim 13, claim 13 recites substantially similar limitations as set forth in claim 3. As such, claim 13 is rejected for at least a similar rationale.

Regarding claim 14, claim 14 recites substantially similar limitations as set forth in claim 4. As such, claim 14 is rejected for at least a similar rationale.
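To make the claim 3-4 interaction concrete, here is a minimal hypothetical sketch (names and data model invented for illustration, not taken from Johnson) of mapping a user-selected yield-map region to the geotagged images that would populate a localized view layer:

```python
from dataclasses import dataclass

@dataclass
class FieldImage:
    lat: float     # capture latitude
    lon: float     # capture longitude
    path: str      # stored image file

def images_in_region(images: list[FieldImage],
                     lat_range: tuple[float, float],
                     lon_range: tuple[float, float]) -> list[FieldImage]:
    """Return the images geotagged inside the user-selected bounding box."""
    (lat0, lat1), (lon0, lon1) = lat_range, lon_range
    return [im for im in images
            if lat0 <= im.lat <= lat1 and lon0 <= im.lon <= lon1]

field = [FieldImage(41.6612, -93.7200, "img_001.jpg"),
         FieldImage(41.6630, -93.7180, "img_002.jpg"),
         FieldImage(41.6701, -93.7000, "img_003.jpg")]

# User selects a region of the yield map; build the localized layer from matches.
layer = images_in_region(field, (41.6600, 41.6650), (-93.7250, -93.7150))
print([im.path for im in layer])   # ['img_001.jpg', 'img_002.jpg']
```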
Claims 7-10 and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Mas, Redden, and Pickett, as applied to claim 2, and further in view of Cavender-Bares et al. (US-PGPUB 2015/0142250) (hereinafter "Cavender"), and further in view of Scharf et al. (US-PGPUB 2017/0039449).

In regards to claim 7, the combined teaching of Mas, Redden, and Pickett as a whole discloses the limitations of claim 2. The combined teaching of Mas, Redden, and Pickett as a whole does not expressly disclose wherein the instructions, when executed by the one or more processors, cause the one or more processors to: perform, with an apparatus, an application pass for the field and at the same time capture images of the field including crops if visible during the application pass; generate a localized view for viewing the field during the application pass based on the captured images; and automatically analyze the application pass including at least one of a planting analysis, a fertilizer analysis, a harvesting analysis, and a tillage analysis based on the captured images.

However, Cavender discloses performing, with an apparatus, an application pass for the field and at the same time capturing images of the field including crops if visible during the application pass (see at least: Para 0074, aerial vehicle 170 can include one or more cameras or sensors configured to at least capture an image of the agricultural field 102 where an autonomous vehicle platform 100 is operating; and from Para 0086, fertilizer can be applied substantially between two rows 104 of planted crops 106; in this manner the autonomous vehicle platform 100 effectively treats one-half of each row of planted crop 106 [i.e., performing an application pass for the field, "applying fertilizer between rows," and at the same time capturing images of the field including crops, "implicit by capturing an image of the agricultural field 102, by an aerial vehicle 170, where the autonomous vehicle platform 100 is operating"]); generating a localized view for viewing the field during the application pass based on the captured images (see at least: Para 0095 and 0101, implicit by creating a "base map" from which the autonomous vehicle platform 100 can navigate; such a base map can detail the precise location of individual rows 104 of planted crop 106, or even the location of individual plants 106); and automatically analyzing the application pass including at least one of a planting analysis, a fertilizer analysis, a harvesting analysis, and a tillage analysis based on the captured images (Para 0076, an operator can receive video, images, and other sensor data remotely via wireless communications; Para 0101 discloses creating a "base map" from which the autonomous vehicle platform 100 can navigate, where such a base map can detail the precise location of individual rows 104 of planted crop 106, or even the location of individual plants 106; and Para 0098 discloses that the physical sample can be analyzed at the autonomous vehicle platform 100 or cataloged or tagged for later analysis [i.e., automatically analyzing the application pass including at least one of a planting analysis, a fertilizer analysis, a harvesting analysis, and a tillage analysis based on the captured images]).

Mas, Redden, Pickett, and Cavender are combinable because they are all concerned with tracking agricultural fields. Therefore, it would have been obvious to a person of ordinary skill in the art to modify the combined teaching of Mas, Redden, and Pickett to include the performing of an application pass by the autonomous vehicle platform 100, as taught by Cavender, so that the platform 100 effectively treats two rows of planted crop 106 on each pass, thereby doubling its coverage in comparison to fertilization substantially between two rows 104 of planted crops 106 (Cavender, Para 0086).

The combined teaching of Mas, Redden, Pickett, and Cavender as a whole does not expressly disclose capturing images of the field including crops if visible during the application pass. However, Scharf discloses capturing images of the field including crops if visible during the application pass (see at least: Para 0048, control a fertilizer applicator to apply the recommended rate at each point in the field, varying the rate as it crosses the field in response to crop color as seen in the aerial image [i.e., capturing images of the field including crops if visible during the application pass based on crop color]).

Mas, Redden, Pickett, Cavender, and Scharf are combinable because they are all concerned with tracking agricultural fields. Therefore, it would have been obvious to a person of ordinary skill in the art to modify the combined teaching of Mas, Redden, Pickett, and Cavender to control a fertilizer applicator, as taught by Scharf, in order to vary the rate as it crosses the field in response to crop color as seen in the aerial image (Scharf, Para 0048).

In regards to claim 8, the combined teaching of Mas, Redden, Pickett, Cavender, and Scharf as a whole discloses the limitations of claim 7. Cavender further discloses wherein the instructions, when executed by the one or more processors, cause the one or more processors to adjust, with the apparatus or an agricultural computer system in communication with the apparatus, settings of the application pass if appropriate based on the analysis of the captured images (see at least: Para 0095, the autonomous vehicle platform 100 can adjust fertilizer output as needed based on mapped planted crop 108 conditions indicating that more or less nutrients are required, "implicitly by analyzing the captured images").
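A minimal sketch of the variable-rate idea behind the Cavender/Scharf combination (illustrative only; the rate curve and greenness score are invented for this example, not taken from either reference): greener crop signals adequate nutrients, so the applicator rate is turned down, while paler crop turns it up.

```python
def fertilizer_rate(greenness: float,
                    base_rate: float = 150.0,   # hypothetical lbs N/acre at zero greenness
                    min_rate: float = 0.0) -> float:
    """Map a 0..1 crop-greenness score to a fertilizer application rate.

    Greener crop (score near 1) receives less supplemental fertilizer.
    """
    rate = base_rate * (1.0 - greenness)
    return max(min_rate, rate)

# As the applicator crosses the field, vary the rate point by point.
for greenness in (0.2, 0.5, 0.9):
    print(f"greenness {greenness:.1f} -> {fertilizer_rate(greenness):.0f} lbs N/acre")
```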
In regards to claim 9, the combined teaching of Mas, Redden, Pickett, Cavender, and Scharf as a whole discloses the limitations of claim 7. Cavender further discloses wherein the application pass comprises a planting application pass (see at least: Para 0085, implicit by applying fertilizer substantially between two rows of planted crops); and the planting analysis includes determining current field conditions from the captured images, wherein the planting analysis causes an adjustment to parameters of the apparatus during the application pass (Para 0076-0077, user interface module 128 transmits and receives data directly from a portable computer 181, which communicates directly, or indirectly via server 180, with the autonomous vehicle platform 100, where an autonomous vehicle platform 100 can communicate a status update to an operator or team of remote operators; and from Para 0097, the autonomous vehicle platform 100 can include a soil sampling structure 154 configured to measure soil conditions as well as other parameters. For example, in areas where soil 103 conditions indicate that more or less nutrients are required, the autonomous vehicle platform 100 can adjust fertilizer output as needed [i.e., the planting analysis causes an adjustment to parameters of the apparatus during the application pass]).

In regards to claim 10, the combined teaching of Mas, Redden, Pickett, Cavender, and Scharf as a whole discloses the limitations of claim 7. Cavender further discloses wherein the application pass comprises a fertilizer application pass (Para 0086, fertilizer can be applied substantially between two rows of planted crops [i.e., the application pass implicitly comprises a fertilizer application pass]), and a remote sensor leads the apparatus to gather the captured images of the crops ahead of the apparatus (see at least: Para 0074-0075, implicit by one or more cameras or sensors of the one or more aerial vehicles 170), determines a crop health criterion based on the captured images as part of the fertilizer analysis (see at least: Para 0096, sensor 150 can observe conditions from below planted crops 108; sensor 150 can be mounted on a robotic arm 152 to observe planted crop 106 conditions above autonomous vehicle platform 100), and then adjusts settings automatically including adjusting an application rate for the fertilizer based on the crop health criterion (see at least: Para 0095, in areas where planted crop 106 conditions indicate that more or less nutrients are required, the autonomous vehicle platform 100 can adjust fertilizer output as needed, implicitly based on conditions observed by the sensor 150, which can be in communication with microprocessor 122).

Regarding claim 17, claim 17 recites substantially similar limitations as set forth in claim 7. As such, claim 17 is rejected for at least a similar rationale.

Regarding claim 18, claim 18 recites substantially similar limitations as set forth in claim 8. As such, claim 18 is rejected for at least a similar rationale.

Regarding claim 19, claim 19 recites substantially similar limitations as set forth in claim 9. As such, claim 19 is rejected for at least a similar rationale.

Regarding claim 20, claim 20 recites substantially similar limitations as set forth in claim 10. As such, claim 20 is rejected for at least a similar rationale.

Claims 11 and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Mas, Redden, Pickett, Cavender, and Scharf, as applied to claim 7, and further in view of Anderson (US-PGPUB 2015/0163992).

In regards to claim 11, the combined teaching of Mas, Redden, Pickett, Cavender, and Scharf as a whole discloses the limitations of claim 7. Cavender further discloses wherein the application pass comprises a harvesting application pass (Para 0107, periodically all or part of one of the planted crops 106 can be harvested, "implicitly the harvesting application pass"). On the other hand, Mas discloses identifying crop components in a crop processing device (see at least: Para 0054, implicit by detecting crop rows in an agricultural field scene). The combined teaching of Mas, Redden, Pickett, Cavender, and Scharf as a whole does not expressly disclose identifying size and health of the crop component.

Anderson discloses identifying size and health of the crop component (see at least: Para 0087, identifying crop data comprising crop residue size, and plant disease or pest, ear, pod, boll, fruit, berry or seed attributes, "health of the crop component").

Mas, Redden, Pickett, Cavender, Scharf, and Anderson are combinable because they are all concerned with tracking agricultural fields. Therefore, it would have been obvious to a person of ordinary skill in the art to modify the combined teaching of Mas, Redden, Pickett, Cavender, and Scharf to include the environmental sensor and crop sensor, as taught by Anderson, in order to identify the crop data (Anderson, Para 0087).

Regarding claim 21, claim 21 recites substantially similar limitations as set forth in claim 11. As such, claim 21 is rejected for at least a similar rationale.

Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMARA ABDI, whose telephone number is (571) 272-0273. The examiner can normally be reached 9:00am-5:30pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Vu Le, can be reached at (571) 272-7332. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AMARA ABDI/
Primary Examiner, Art Unit 2668
01/24/2026

Prosecution Timeline

Mar 27, 2024
Application Filed
Jan 24, 2026
Non-Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602822
METHOD DEVICE AND STORAGE MEDIUM FOR BACK-END OPTIMIZATION OF SIMULTANEOUS LOCALIZATION AND MAPPING
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12597252
METHOD OF TRACKING OBJECTS
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12576595
SYSTEMS AND METHODS FOR IMPROVED VOLUMETRIC ADDITIVE MANUFACTURING
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12574469
VIDEO SURVEILLANCE SYSTEM, VIDEO PROCESSING APPARATUS, VIDEO PROCESSING METHOD, AND VIDEO PROCESSING PROGRAM
Granted Mar 10, 2026 (2y 5m to grant)

Patent 12563154
VIDEO SURVEILLANCE SYSTEM, VIDEO PROCESSING APPARATUS, VIDEO PROCESSING METHOD, AND VIDEO PROCESSING PROGRAM
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 83%
With Interview: 76% (-7.5%)
Median Time to Grant: 2y 7m
PTA Risk: Low

Based on 816 resolved cases by this examiner. Grant probability derived from career allow rate.
