Prosecution Insights
Last updated: April 19, 2026
Application No. 19/001,794

WORK MAP PROVISION SERVER

Non-Final OA §103
Filed: Dec 26, 2024
Examiner: DUNNE, KENNETH MICHAEL
Art Unit: 3669
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Yanmar Power Technology Co. Ltd.
OA Round: 1 (Non-Final)

Grant Probability: 76% (Favorable)
Expected OA Rounds: 1-2
Median Time to Grant: 2y 7m
Grant Probability With Interview: 87%

Examiner Intelligence

Career Allow Rate: 76% (217 granted / 285 resolved), +24.1% vs TC avg — above average
Interview Lift: +11.1% for resolved cases with interview (moderate +11% lift)
Typical Timeline: 2y 7m avg prosecution; 23 applications currently pending
Career History: 308 total applications across all art units
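The headline figures on this card follow directly from the raw counts it reports. A minimal sketch of the arithmetic (illustrative only — it assumes, as the displayed numbers imply, that the interview lift is additive in percentage points; this is not the vendor's actual model):

```python
# Reproduce the dashboard's headline figures from the raw counts shown above.
granted, resolved = 217, 285          # career totals for this examiner
allow_rate = granted / resolved       # career allow rate

interview_lift = 11.1                 # percentage points, from the card above
with_interview = allow_rate * 100 + interview_lift  # assumed additive lift

print(f"Career allow rate: {allow_rate:.0%}")          # -> 76%
print(f"With interview:    {round(with_interview)}%")  # -> 87%
```

The same additive reading is consistent with the "Prosecution Projections" card further down (76% base, 87% with interview).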

Statute-Specific Performance

§101: 10.2% (-29.8% vs TC avg)
§103: 42.5% (+2.5% vs TC avg)
§102: 22.8% (-17.2% vs TC avg)
§112: 17.7% (-22.3% vs TC avg)

Comparisons are against the Tech Center average estimate • Based on career data from 285 resolved cases

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 04/11/2025 was filed before the first action on the merits of the application. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over JP 2016076123 A (Isemura et al.) in view of US 20150302305 A1 (Rupp et al.).
Regarding Claim 1, Isemura teaches “A work map provision server comprising: a memory configured to store first processor-executable code, a plurality of (“(Work Plan Registration Unit) The work plan registration unit 151 of the memory unit 135 stores work plans (information linking work date and time with work content) for all fields owned by the owner, with each field associated with it.” There are multiple work plans (maps));” and respective attribute information for each of ([0107] “When the work target value selection unit 146 receives date and time information and location information related to the work area from the remote monitoring terminal device 200 or the mobile terminal device 180, it selects a work target value from the work target value registration unit 157 that corresponds to the work plan selected by the work plan selection unit 142.” Work plans (maps) include target values);” the respective attribute information for each of the ([0079] “(Work Plan Registration Unit) The work plan registration unit 151 of the memory unit 135 stores work plans (information linking work date and time with work content) for all fields owned by the owner, with each field associated with it. In other words, when owning multiple fields, the crops produced in each field may differ, and the agricultural work required in each field (for example, plowing, rice planting, pesticide application, harvesting, etc.) may also differ. Therefore, a work plan is individually created for each field according to the crops and required agricultural work, and this information is registered in the work plan registration unit 151.
Specifically, since the same agricultural work on a single field is often carried out over several days, the work plan is registered by linking the content of the agricultural work with the period during which that work will be carried out.” Here Isemura teaches that a work plan links the content (specific work to be carried out) in the field with the timeframe of the overall plan (i.e. beginning at the creation/start of the plan the corresponding work is linked with time));” and each ([0106]-[0107] “(Work Target Value Registration Unit) The work target value registration unit 157 has work target values for each work plan pre-registered. For example, work target values such as working time, yield, and fuel consumption of agricultural machinery 110 are pre-registered for each work plan and for each field. These work target values can be entered as arbitrary values by the field owner by operating a fixed terminal 160 or a portable terminal 170. (Work Target Value Selection Unit) When the work target value selection unit 146 receives date and time information and location information related to the work area from the remote monitoring terminal device 200 or the mobile terminal device 180, it selects a work target value from the work target value registration unit 157 that corresponds to the work plan selected by the work plan selection unit 142.” Isemura’s work plans include target values/amount for the corresponding work);”the processor configured to execute the first processor-executable code to perform operations including: receiving a map acquisition request at a receiving time and date, selecting, from among the plurality of ([0082] “The work plan selection unit 142 selects a work plan from among the multiple work plans registered in the work plan registration unit 151 or the multiple work plans generated by the production management model, based on the received date and time information and the specified work area (specified by the work area specification unit 
141).” + [0083] “In other words, the work plan selection unit 142 selects (extracts) from among multiple work plans registered in the work plan registration unit 151 a work plan that matches the date and time specified by the received date and time information and matches the work area specified by the received location information.” Isemura selects a work plan which corresponds to the location and date and time information from the remote terminal, and subsequently sends the plan to the terminal (and subsequently the vehicle would be controlled to follow the plan, i.e. “for use by the work vehicle”). Isemura, however, does not teach (1) that the work plans are/include fertilizing; while it teaches various examples of “agricultural work” ([0079] “In other words, when owning multiple fields, the crops produced in each field may differ, and the agricultural work required in each field (for example, plowing, rice planting, pesticide application, harvesting, etc.) may also differ.”), it does not explicitly teach fertilizing as one of those types of work. (2) Additionally, while Isemura discloses work plans corresponding to specific fields, it does not teach that the target work varies within the field, i.e. does not teach work values specific “to each respective area of a plurality of areas in a farm field;”

Rupp et al. teaches a similar (agricultural work plan) transmission system which includes fertilizer plans/maps as a type of agricultural work ([0138] The method continues at step 136 where the host device sends the agriculture prescription to one or more of the agriculture equipment. For example, the host device transmits the agriculture prescription to a fleet of farming tractors. The sending may further include transmitting the agriculture prescription to one or more user devices associated with the geographic region.
The method continues at step 138 where the one or more of the agriculture equipment executes at least a portion of the agriculture prescription. For example, the fleet of farming tractors executes steps of the agriculture prescription.); and that each plan includes a specific amount (pattern) of fertilizer application in the corresponding fields ([0086] Having produced the agricultural prescription 80, the application processing module 34 of the application unit 16 sends, via the network 24, one or more of the analysis summary 78 and the agricultural prescription 80 to the user device 14. The application processing module 34 of the application unit 16 may further send the agricultural prescription 80, via the network 24 and the wireless communication network 1, to the user device 1-1C for utilization in performing of one or more steps of the agricultural lifecycle in accordance with the agricultural prescription 80. For example, the user device 1-1C displays a portion of the agricultural prescription 80 and sends control information of the agricultural prescription, via wireless signals 42, to the user device 1-1A to automate a portion of the execution of at least some of the steps of the agricultural lifecycle. For the example, the user device 1-1A issues control information to a set of actuators to dispense fertilizer in accordance with the agricultural prescription 80. For instance, to control dispensing a specified volume of liquid fertilizer in a specified date range in a specified geometric pattern for at least a portion of the geographic region 1-1 as the user device 1-1A versus the drive path 1-1.” As can be seen in figure 2 below the geographic regions = respective areas in a field + [0217] “An example of operation, a user device associated with farming machinery traverses the series of drive paths. 
… As another specific example, the farming machinery enters the geographic region 1-1 via the encoded zone 1-1, produces the recovered data, extracts an agricultural prescription identifier from the recovered data, and facilitates a next agricultural lifecycle step in accordance with the agricultural prescription (e.g., automatically applies a desired amount of fertilizer across desired portions of the geographic region based on the agricultural prescription).” Here Rupp teaches that the plans include applying a desired amount of fertilizer per region, and that control of the vehicle is automatic.)

[Image: media_image1.png — greyscale]

It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to modify Isemura to include the fertilizer application map(s)/plan as taught by Rupp as a form of “agricultural work” as taught in Isemura. One would be motivated to include the fertilizer application as part of the plans/maps in order to allow for increased crop yield.

Regarding Claim 2, modified Isemura teaches “The work map provision server according to claim 1, wherein the second processor-executable code causes the work vehicle to apply a first target amount of fertilizer to a first area of the farm field and a second target amount of fertilizer, that is different than the first target amount, to a second area of the farm field.” (Rupp [0217] “An example of operation, a user device associated with farming machinery traverses the series of drive paths.
… As another specific example, the farming machinery enters the geographic region 1-1 via the encoded zone 1-1, produces the recovered data, extracts an agricultural prescription identifier from the recovered data, and facilitates a next agricultural lifecycle step in accordance with the agricultural prescription (e.g., automatically applies a desired amount of fertilizer across desired portions of the geographic region based on the agricultural prescription).” Here Rupp teaches that the plans include applying a desired amount of fertilizer per region, and that control of the vehicle is automatic; as seen in figure 2 above, the amount of fertilizer changes based on the geographic region (individual area) of the field.)

Regarding Claim 3, modified Isemura teaches “The work map provision server according to claim 1, wherein the predetermined period of time is based on a growth condition of crops in the farm field associated with the corresponding fertilizer map.” (Isemura [0081] “Here, a production management model is generated by automatically correcting the necessary agricultural work based on weather conditions such as rainfall, sunshine hours, and temperature from several days prior to the present, onto a standard work plan (standard production management model) that pre-specifies the relationship between dates and agricultural work according to the type of crop, etc. In other words, when storing the work plan generated by the production management model in the work plan registration unit 151, the work plan, which is updated sequentially according to weather conditions as described above, is stored corresponding to each field.” The corresponding time period/amount of work is corrected based on the growth conditions (sunshine hours, temperature, rainfall, etc.) of the crops in the intervening time.)
Regarding Claim 9, modified Isemura teaches “The work map provision server according to claim 1, wherein the one or more fertilizer application maps are transmitted to the work vehicle.” (Rupp [0050] + [0217]: Rupp teaches that the user device can be embedded in the farm equipment (tractor/work vehicle) and includes automatically following along for fertilizer spreading.)

Claims 4-5 are rejected under 35 U.S.C. 103 as being unpatentable over modified Isemura as applied to claim 1 above, and further in view of US 20190150357 A1, “MONITORING AND CONTROL IMPLEMENT FOR CROP IMPROVEMENT”, Wu et al.

Regarding Claim 4, while Isemura does teach updating/correcting of the plan based on growth conditions ([0081]: the date/time period for a plan is linked with growth of the crops/conditions), it does not teach determining the current growth conditions via images. Wu et al. teaches an automatic fertilizer spraying system which includes the use of flying scout drones to obtain images ([0049] “In some embodiments, the modular sensor units 50 are mounted to a rigid attachment fixture 1952 or other mounting plate. This facilitates… The modular attachment fixtures 1952 may also be bolted or strapped to smaller vehicles (autonomous scout or aerial drone).”) to determine the current growth condition/state of the crops and adjust operation based on the current conditions ([0125] “FIG. 10 is an example method to generate 3D images obtained from two or more image sensor units 50, to determine distance, depth and/or height values. A virtual image grid 100 is associated with the captured image to determine the vanishing point or horizon line. …The magnitude of the change is determined based on a priori calibration that correlates known elevation changes with a scale in the image grid 100, similar to the calibration performed to gauge a distance or depth using the calibrated scale in the image grid 100.
In some method embodiments, the vehicle controller react and adjust boom height and angle to the landscape before the vehicle gets to the location of the determined depth and height. Additional sensors are optionally mounted on the boom to detect nearby distance to ground and crops. Sensor … Other embodiments include adding crop elevation (crop height distance to ground) information obtained from a previous travel pass through the field, such as based on a measured distance from a vehicle suspension ride height sensor that is mounted to the frame or suspension components of the vehicle, or such as from analyzing a previous imaging map of the field. Then, during an on-the-go pass through the field (i.e. the current pass through), the image sensor units 50 are processing the captured images and searching for changes in either the ground altitude or crop elevation from the previous pass due to new events, e.g. ground damage from a major rain, animal skirmish, and additional crop growth. In some embodiments, the search or analysis is for differences between the ground altitude or crop elevation or current image map versus the image map from previous years when the same type of crop was previously grown.”)

It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to modify Isemura to include the image capturing and subsequent growth state estimation via scout drones as taught by Wu et al. as part of the automatic work plan correction taught in Isemura. One would be motivated to implement the aerial drone imaging and analysis in order to allow the vehicle to adjust the spraying plan in real time to avoid wasting fertilizer on weeds/non-crop objects. Wu teaches this improvement in ([0103] “In some embodiments, anomalies 11 in a crop row 12 are sprayed with herbicide. To avoid needless spray (e.g. crop residue, twig), such objects are calibrated out by void pattern instructions performed on an image captured.
Alternatively, a crop row is identified and spray is applied along the entire row 12 itself. In some embodiments continuous filming is performed. Alternatively, where higher resolution is used for analysis, snapshots are taken. If an image sensor unit 50 has good peripheral or fore-aft view, the snapshot intervals are taken to reduce overlapping images and is based on a travel speed of the vehicle. Fewer images also reduces the amount of data that is uploaded for further analysis offline.”), thereby reducing the operating cost and/or environmental impacts (fertilizer seepage/runoff into the water table/nearby environment).

Regarding Claim 5, modified Isemura teaches “The work map provision server according to claim 4, wherein the one or more images of the farm field are aerial images of the farm field.” (Wu [0160] “Yet another embodiment includes placing attachment fixtures or smaller attachment fixtures having image sensor system on autonomous scouts (e.g. flying drone or ground scout). Some types of the algorithms (e.g. weed identification, small crops) lend themselves to immediate response while the air or ground scout are traveling over the area. For instance, spot spray or laser zapping can be carried out with the drones. Bigger or heavier tasks are left to the heavy duty vehicles (e.g. sprayer, crop management).” Wu as modified in claim 4 uses a flying drone to perform image capturing, ergo the images are “aerial images.”)

Claims 6-7 are rejected under 35 U.S.C. 103 as being unpatentable over modified Isemura (Isemura + Rupp) as applied to claim 1 above, and further in view of US 20190116725 A1, Hanya et al.
Regarding Claim 6, while modified Isemura teaches various fertilizer maps (“additional” fertilizer maps) (Rupp [0086] “Having produced the agricultural prescription 80, the application processing module 34 of the application unit 16 sends, via the network 24, one or more of the analysis summary 78 and the agricultural prescription 80 to the user device 14. The application processing module 34 of the application unit 16 may further send the agricultural prescription 80, via the network 24 and the wireless communication network 1, to the user device 1-1C for utilization in performing of one or more steps of the agricultural lifecycle in accordance with the agricultural prescription 80. For example, the user device 1-1C displays a portion of the agricultural prescription 80 and sends control information of the agricultural prescription, via wireless signals 42, to the user device 1-1A to automate a portion of the execution of at least some of the steps of the agricultural lifecycle.” Here Rupp teaches fertilization within the lifecycle (i.e. additional fertilization) + Rupp [0171] “This spray information is then correlated with the plant yield or crop yield data for each specific location that was gathered and uploaded for offline analysis. The correlations with the crop yield help determine best practices for the next crop cycle (e.g. whether to reduce or increase the amount of herbicide, fertilizer, and so on) for each location in the crop field.” Here Rupp teaches adjusting fertilizer through each part of the cycle (i.e. “additional” fertilizer maps)), it does not explicitly teach “basal” fertilizer application or maps thereof.

Hanya et al. teaches an agricultural work plan creation method for fertilizer spraying which specifically includes determining a basal fertilizer work plan which determines specific amounts of basal fertilizer application at specific areas of the field ([0057] In Step 14, the basal fertilization plan map MP shown in FIG.
6 is displayed on the display 40 and is stored in a not-shown memory. [0058] The basal fertilization plan map MP displayed on the display device 40 shows an area having good growth and an area having poor growth in the field E, and also shows an area having considerably large growth fluctuation depending on year and season and an area having significantly poor growth. The map MP also shows the amount of fertilization for each area. Accordingly, the condition of the field E can be confirmed even before the field is seeded. [0059] The data of the basal fertilization plan map MP stored in a memory is sent to the tractor 20 by the transmission/reception part 36. The data may be wirelessly sent or may be sent with signal lines. [0060] When the transmission/reception part 26 of the tractor 20 receives the data of the basal fertilization plan map MP, the control part 25 stores the data of the basal fertilization plan map MP in a not-shown memory. [0061] The basal fertilization plan map MP is displayed on the display part 24 based on the data of the basal fertilization plan map MP stored in the memory. The position of the tractor 20 is shown on the basal fertilization plan map MP of the display part 24 based on the positional information measured by the GPS device 22. An operator confirms the position of the tractor 20 on the basal fertilization plan map MP displayed on the display part 24, so as to recognize the amount of basal fertilizer for basal fertilization before seeding the area in that position. The operator thereby fertilizes that area as much as necessary.) It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application to modify Isemura to include the Basal fertilizer application plan/map creation and use as taught by Hanya as part of the agricultural work plans taught in Isemura. 
One would be motivated to implement basal fertilizer work plans to allow the system to identify areas of good or poor growth earlier in the planting cycle, allowing for more efficient fertilizer application spread out through the full crop lifecycle as opposed to responding only to current growing conditions (Hanya [0062] “Namely, even in the field E before being seeded, the area having good growth and the area having poor growth are clearly distinguished, and an area having considerably large growth fluctuation depending on year and season and an area having significantly poor growth are also clearly distinguished. Moreover, the appropriate amount of fertilization can be obtained in each area in the field E. The field E is thereby prevented from being excessively fertilized. As a result, the fertilizer can be saved.”).

Regarding Claim 7, modified Isemura teaches “The work map provision server according to claim 6, wherein the basal fertilizer map includes a target amount of fertilizer for application to each respective area of a plurality of areas in a farm field prior to sowing or planting,” (Hanya [0060] When the transmission/reception part 26 of the tractor 20 receives the data of the basal fertilization plan map MP, the control part 25 stores the data of the basal fertilization plan map MP in a not-shown memory. [0061] The basal fertilization plan map MP is displayed on the display part 24 based on the data of the basal fertilization plan map MP stored in the memory. The position of the tractor 20 is shown on the basal fertilization plan map MP of the display part 24 based on the positional information measured by the GPS device 22. An operator confirms the position of the tractor 20 on the basal fertilization plan map MP displayed on the display part 24, so as to recognize the amount of basal fertilizer for basal fertilization before seeding the area in that position.
The operator thereby fertilizes that area as much as necessary.);” and wherein the additional fertilizer map includes a target amount of fertilizer for application to each respective area of the plurality of areas in a farm field subsequent to sowing or planting.” (Rupp [0171] “This spray information is then correlated with the plant yield or crop yield data for each specific location that was gathered and uploaded for offline analysis. The correlations with the crop yield help determine best practices for the next crop cycle (e.g. whether to reduce or increase the amount of herbicide, fertilizer, and so on) for each location in the crop field.” Here Rupp teaches adjusting fertilizer through each part of the cycle (i.e. “additional” fertilizer maps).)

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over modified Isemura (Isemura + Rupp + Hanya) as applied to claim 6 above, and further in view of Wu.

Regarding Claim 8, modified Isemura (Isemura + Rupp + Hanya) does not teach creating the additional fertilizer maps/plans based on aerial imagery. Wu et al. teaches an automatic fertilizer spraying system which includes the use of flying scout drones to obtain images ([0049] “In some embodiments, the modular sensor units 50 are mounted to a rigid attachment fixture 1952 or other mounting plate. This facilitates… The modular attachment fixtures 1952 may also be bolted or strapped to smaller vehicles (autonomous scout or aerial drone).”) to determine the current growth condition/state of the crops and adjust operation based on the current conditions ([0125] “FIG. 10 is an example method to generate 3D images obtained from two or more image sensor units 50, to determine distance, depth and/or height values. A virtual image grid 100 is associated with the captured image to determine the vanishing point or horizon line.
…The magnitude of the change is determined based on a priori calibration that correlates known elevation changes with a scale in the image grid 100, similar to the calibration performed to gauge a distance or depth using the calibrated scale in the image grid 100. In some method embodiments, the vehicle controller react and adjust boom height and angle to the landscape before the vehicle gets to the location of the determined depth and height. Additional sensors are optionally mounted on the boom to detect nearby distance to ground and crops. Sensor … Other embodiments include adding crop elevation (crop height distance to ground) information obtained from a previous travel pass through the field, such as based on a measured distance from a vehicle suspension ride height sensor that is mounted to the frame or suspension components of the vehicle, or such as from analyzing a previous imaging map of the field. Then, during an on-the-go pass through the field (i.e. the current pass through), the image sensor units 50 are processing the captured images and searching for changes in either the ground altitude or crop elevation from the previous pass due to new events, e.g. ground damage from a major rain, animal skirmish, and additional crop growth. In some embodiments, the search or analysis is for differences between the ground altitude or crop elevation or current image map versus the image map from previous years when the same type of crop was previously grown.”)

It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to modify Isemura to include the image capturing and subsequent growth state estimation via scout drones as taught by Wu et al. as part of the automatic work plan correction taught in Isemura. One would be motivated to implement the aerial drone imaging and analysis in order to allow the vehicle to adjust the spraying plan in real time to avoid wasting fertilizer on weeds/non-crop objects.
Wu teaches this improvement in ([0103] “In some embodiments, anomalies 11 in a crop row 12 are sprayed with herbicide. To avoid needless spray (e.g. crop residue, twig), such objects are calibrated out by void pattern instructions performed on an image captured. Alternatively, a crop row is identified and spray is applied along the entire row 12 itself. In some embodiments continuous filming is performed. Alternatively, where higher resolution is used for analysis, snapshots are taken. If an image sensor unit 50 has good peripheral or fore-aft view, the snapshot intervals are taken to reduce overlapping images and is based on a travel speed of the vehicle. Fewer images also reduces the amount of data that is uploaded for further analysis offline.”), thereby reducing the operating cost and/or environmental impacts (fertilizer seepage/runoff into the water table/nearby environment).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: US 20120083907 A1; US 20130066666 A1; US 20160302351 A1; US 20180132422 A1; US 20190139158 A1; US 20190141883 A1; US 20200329632 A1.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KENNETH MICHAEL DUNNE, whose telephone number is (571) 270-7392. The examiner can normally be reached Mon-Thurs 8:30-6:30.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Navid Z Mehdizadeh, can be reached at (571) 272-7691. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KENNETH M DUNNE/
Primary Examiner, Art Unit 3669

Prosecution Timeline

Dec 26, 2024
Application Filed
Mar 18, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600262
VEHICLE MANAGING ENERGY AT A LOCATION DURING AN EVENT
2y 5m to grant Granted Apr 14, 2026
Patent 12596290
DAY/NIGHT FILTER GLASS FOR AIRCRAFT CAMERA SYSTEMS
2y 5m to grant Granted Apr 07, 2026
Patent 12594956
METHOD FOR PROVIDING INFORMATION ON RAINY ENVIRONMENT BY REFERRING TO POINT DATA ACQUIRED FROM A LIDAR SENSOR AND COMPUTING DEVICE USING THE SAME
2y 5m to grant Granted Apr 07, 2026
Patent 12590815
INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND COMPUTER PROGRAM PRODUCT
2y 5m to grant Granted Mar 31, 2026
Patent 12582041
A FORAGE HARVESTER EQUIPPED WITH A CROP PICK-UP HEADER
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
76%
Grant Probability
87%
With Interview (+11.1%)
2y 7m
Median Time to Grant
Low
PTA Risk
Based on 285 resolved cases by this examiner. Grant probability derived from career allow rate.
