Prosecution Insights
Last updated: April 18, 2026
Application No. 18/725,681

APPARATUS AND METHOD FOR AGRICULTURAL MECHANIZATION

Final Rejection §103
Filed: Jul 01, 2024
Examiner: DUNNE, KENNETH MICHAEL
Art Unit: 3669
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Beagle Technology Inc.
OA Round: 2 (Final)
Grant Probability: 76% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 7m
With Interview: 87%

Examiner Intelligence

Career Allow Rate: 76% — above average (217 granted / 285 resolved; +24.1% vs TC avg)
Interview Lift: +11.1% (moderate), among resolved cases with interview
Avg Prosecution: 2y 7m (typical timeline)
Career History: 308 total applications across all art units; 23 currently pending
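The headline figures above follow directly from the raw counts in the panel. A minimal sketch of that arithmetic (counts taken from the panel; the implied Tech Center baseline is an inference from the stated +24.1-point delta, not a published figure):

```python
# Reproduce the examiner panel's headline figures from the raw counts above.
granted = 217
resolved = 285

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.0%}")  # 76%

# The panel reports +24.1 points vs the Tech Center average,
# which implies a TC baseline near 52%.
implied_tc_avg = allow_rate - 0.241
print(f"Implied TC average: {implied_tc_avg:.1%}")
```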

Statute-Specific Performance

§101: 10.2% (-29.8% vs TC avg)
§103: 42.5% (+2.5% vs TC avg)
§102: 22.8% (-17.2% vs TC avg)
§112: 17.7% (-22.3% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 285 resolved cases
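Each "vs TC avg" delta above is consistent with a single Tech Center baseline of roughly 40% (the black line). A quick consistency check, with the rates copied from the panel and the 40% baseline treated as an assumption inferred from the deltas:

```python
# Statute-specific overcome rates vs. an assumed ~40% Tech Center baseline.
# Rates copied from the panel above; deltas are examiner minus baseline.
examiner_rate = {"101": 0.102, "103": 0.425, "102": 0.228, "112": 0.177}
tc_avg = 0.40

for statute, rate in examiner_rate.items():
    delta = rate - tc_avg
    print(f"§{statute}: {rate:.1%} ({delta:+.1%} vs TC avg)")
```

All four computed deltas match the panel's figures, which is why a single baseline estimate is plausible.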

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 03/24/2026 was filed after the mailing date of the non-final rejection on 10/24/2025. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Response to Arguments

The arguments concerning the previous 112(b) rejections concerning relative terminology are persuasive, and the previous 112(b) rejections concerning the camera and tool being located "in close proximity" are withdrawn. The other 112(b) rejections have been overcome via the amendments.

Regarding claim 8 as amended, the previously cited prior art does not teach or render obvious the detecting/harvesting of celery with the protected portions as recited in the claim.

Regarding amended claim 1, specifically regarding the teachings of Koselka, Applicant's arguments filed 03/24/2026 have been fully considered but they are not persuasive. While Koselka does teach some embodiments where the scouting and harvesting functions are performed by separate robots and/or the camera and tool are located separately, from the general teachings of Koselka it is clear that a single robot can perform both of these functions and that there are embodiments with a singular arm (i.e., that the tool and camera are located in proximity/fixed position relative to each other as they move). While certain specific embodiments of Koselka may not read on the claims, that does not detract from the general teachings and/or other embodiments of Koselka anticipating the claim language.

From [0012]: "Embodiments of the invention enable an agricultural robot system and method of harvesting, pruning, thinning, spraying, culling, weeding, measuring and managing of agricultural crops. One approach for automated harvesting of fresh fruits and vegetables, pruning of vines, culling fruit, thinning of growth or fruit buds, selective spraying and or fertilizing, weeding, measuring and managing of agricultural resources is to use a robot comprising a machine-vision system containing cameras such as rugged solid-state digital cameras. … A robot employed in these embodiments may additionally comprise a GPS sensor or other external navigational aids to simplify the mapping process. The function of taking data which is used to create the map(s) may also be called scouting. In this case, if a robot performs primarily this function, it may be called a Scout robot. If the function is performed on a robot as part of a more complex series of functions, then this function may be called the Scouting function or Scouting part of the robot. The following terms may be used interchangeably and their usage is not meant to limit the intent of the specific design feature: plants, vines and trees; fruits and vegetables; and fields, orchards and groves." Here in [0012] Koselka is generally teaching embodiments in which the scouting function is just one part of the robot, a robot which can perform a more complex series of functions (i.e., the pruning and harvesting).

Further, in [0013]: "Once the map(s) are prepared, the robot or another robot or server can create an action plan that the robot or another robot can then implement generally by moving and using articulated arms or other task-specific actuators, such as a selective sprayer to implement an agricultural function under the direction of a processor system." Here Koselka is explicitly teaching that the robot which creates the map is the same robot which then uses its arm(s) for subsequent agricultural functions (harvesting/pruning of vines). Additionally, [0014]: "In one embodiment of the invention, an agricultural robot gathers data and then determines an action plan in advance of picking, pruning, thinning, spraying, or culling a tree or a vine." And [0016]: "An agricultural robot may comprise zero or more actuators or articulating arms coupled with a self-propelled automated platform or coupled with a tractor, trailer or boom." This teaches that a given robot can include zero or more (i.e., would include only one) arms, thus teaching a general embodiment where one arm (with camera and harvesting/pruning tool) performs both the scouting (mapping) and the harvesting functions.

Applicant then cites to [0123], which teaches an embodiment in which the camera and pruning actuator are separate. This is merely a singular embodiment which was not cited to or relied on in the previous rejection; as such, pointing to it as not teaching the claim language is not persuasive, as that embodiment was never said to have anticipated the claim(s).

Additionally, see [0117]: "Processor system 1006 communicates with tractor 1002 via an electronic tether 1007. Tractor 1002 may be equipped with a hydrostatic or other drive system whose speed can be automatically controlled by processor system 1006. In this embodiment, a driver steers tractor 1002, although the processor system 1006 may control the speed of tractor 1002 and therefore trailer to allow robotic arms 1004 to adequately perform assigned tasks according to an action plan in the shortest time possible. Robotic arms 1004 may be mounted on trailer 1003 and configured to harvest, prune, scout, measure or perform any other agricultural task desired." This teaches that the arms themselves can perform the scouting function (i.e., a camera on the arm/hand, as seen in Figure 3, can perform the scouting function).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: "A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

Claims 1-3, 5-6, and 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over US 20060213167 A1 (Koselka et al.) in view of US 20160150729 A1, "Agriculture Methods" (Moore), and further in view of US 20220183208 A1, "AUTONOMOUS DETECTION AND CONTROL OF VEGETATION" (Sibley et al.).

Regarding Claim 1, Koselka teaches "A tool carrier apparatus, comprising: a tool for working on a plant planted in the ground" (Abstract: "An agricultural robot system and method of harvesting, pruning, culling, weeding, measuring and managing of agricultural crops.
Uses autonomous and semi-autonomous robot(s) comprising machine-vision using cameras that identify and locate the fruit on each tree, points on a vine to prune." "Agriculture" to one of ordinary skill in the art indicates that the crops are planted in the ground under the plain meaning of the term (as opposed to more specialized terms such as hydroponics or aeroponics).);

"an adjustable carrier configured to hold the tool and move the tool in a horizontal direction and a vertical direction with respect to the ground" ([0115]: "FIG. 7 shows an alternate embodiment of a harvest robot. This embodiment comprises an eight arm harvester. A rear mounted 'Boom' comprises multiple arms that are mounted higher than front mounted Booms. Each Boom may be raised or lowered which in turn moves any arms coupled to the Boom up or down simultaneously. Other embodiments are configured to allow arms to move along the booms. During harvesting, the bins are placed in the rows approximately as they are expected to be consumed. … harvester robot showing a vertical boom and coupled arm and FIG. 9 shows a side view of an embodiment of a harvester robot showing horizontally mounted booms and vertical booms with coupled arms.");

"and configured to mount to a vehicle" ([0101]: "Alternate embodiments of an agricultural robot may comprise semi-autonomous robot(s) that may be coupled with a tractor, boom or trailer for example coupled with an extension link to allow for movement along or about the axis of tractor travel at a velocity other than that of the tractor. Robots are mounted on a tractor, boom or trailer in one or more embodiments of the invention which eliminates or minimizes the drive mechanisms in the robots in opposition to autonomous self-propelled platforms. Robots that are not self-propelled are generally smaller and cheaper. In addition, most farms have tractors that may be augmented with robots, allowing for easy adoption of robots while minimizing capital expenditures. Some farms may require a driver to physically move robots for safety or other concerns.");

"a camera configured to capture an image of a plant, the plant having a …" ([0125]: "FIG. 3 illustrates an embodiment of a robotic hand. The hand-type actuator includes a camera and light system to locate and track each piece of fruit as it is picked even the fruit located inside the dark interior of some plants. The grabbing mechanism labeled as 'Suction Grabber' may either be a suction cup with an internal vacuum pump as shown or any other grabbing mechanism capable of picking fruit. For fruit whose stems must be cut rather than being pulled off the plant, the hand linkage may comprise an extendable cutter shown as 'Stem Cutting Tool'.");

"a memory having a program stored therein; a processor that when executing the program implements: an algorithm … to identify …";

"a robotic controller configured to control the adjustable carrier based on the control command to position the tool to work on the plant" ([0043]: "Embodiments of the invention enable an agricultural robot system and method of harvesting, pruning, thinning, spraying culling, weeding, measuring and managing of agricultural crops. One approach for automated harvesting of fresh fruits and vegetables, pruning of vines, culling fruit, weeding, measuring and managing of agricultural resources, etc. is to use a robot comprising a machine-vision system containing cameras such as rugged solid-state digital cameras. … Alternatively, a robot may map the cordons and canes of grape vines. In such a case, the map would consist of the location of each cordon, cane, and sucker as well as the location and orientation of buds on each cane. The function of the map is to allow the robot to make intelligent decisions and perform tasks based on what the vision system or other attached sensors detect along with rules or algorithms in the robots software. For instance the robot may choose to pick only fruit meeting a certain size criteria and may optimize the picking order for those fruit. Or the robot may use the map of a grape vine along with rules embodied in its software to prune the vine to the 8 best canes per cordon with 2 buds left on each of those canes. Alternatively, or in addition, the map may be used for other purposes other than functional decisions by the robot. For example, data from the map may be used by the grower to track crop performance and make intelligent decisions about when to harvest or when to prune. Other embodiments gather data applicable to thinning, spraying culling, weeding and crop management. A robot employed in these embodiments …");

"wherein the plant is a grape vine …" (Koselka [0043]: "… One approach for automated harvesting of fresh fruits and vegetables, pruning of vines, culling fruit, weeding, measuring and managing of agricultural resources, etc. is to use a robot comprising a machine-vision system containing cameras such as rugged solid-state digital cameras. The cameras may be utilized to identify and locate the fruit on each tree, points on a vine to prune, weeds around plants. … the number and size of fruit on the plants and the approximate positions of the fruit on each plant. Alternatively, a robot may map the cordons and canes of grape vines. In such a case, the map would consist of the location of each cordon, cane, and sucker as well as the location and orientation of buds on each cane.
The function of the map is to allow the robot to make intelligent decisions and perform tasks based on what the vision system or other attached sensors detect along with rules or algorithms in the robots software. For instance the robot may choose to pick only fruit meeting a certain size criteria and may optimize the picking order for those fruit. Or the robot may use the map of a grape vine along with rules embodied in its software to prune the vine to the 8 best canes per cordon with 2 buds left on each of those canes. Alternatively, or in addition, the map may be used for other purposes other than functional decisions by the robot." Koselka teaches trimming/pruning of grape vines/cordons; from the modification for excluding zones, the logic naturally flows that the unharvested portion of the grape plant is its cordon.);

"the camera is positioned in close proximity to the tool, configured to move together with the tool, and a relative position of the tool with respect to the camera is fixed" ([0016]: "An agricultural robot may comprise zero or more actuators or articulating arms coupled with a self-propelled automated platform or coupled with a tractor, trailer or boom." This teaches that a given robot can include zero or more (i.e., would include only one) arms, thus teaching a general embodiment where one arm (with camera/pruning tool) performs both the scouting (mapping) and the harvesting functions. Read in light of [0012]-[0013], this teaches a singular robot (with one arm) which performs both the scouting and harvesting/picking functions, and from Fig. 3/[0125] it can be seen that the harvesting tool and camera are both equipped on the end of the arm/on the hand (i.e., in close proximity), move together as the arm is articulated, and are fixed in position relative to each other.)

Koselka, however, does not teach that the identifying of the plant portions is performed via a "trained" AI; instead, Koselka teaches a predetermined algorithm ([0043]: "… Alternatively, a robot may map the cordons and canes of grape vines. In such a case, the map would consist of the location of each cordon, cane, and sucker as well as the location and orientation of buds on each cane. The function of the map is to allow the robot to make intelligent decisions and perform tasks based on what the vision system or other attached sensors detect along with rules or algorithms in the robots software. For instance the robot may choose to pick only fruit meeting a certain size criteria and may optimize the picking order for those fruit. Or the robot may use the map of a grape vine along with rules embodied in its software to prune the vine to the 8 best canes per cordon with 2 buds left on each of those canes. Alternatively, or in addition, the map may be used for other purposes other than functional decisions by the robot.").

Further, while Koselka does identify portions of the plant, it does not identify protected portions and create control commands for the tool to avoid these protected portions. Moore teaches a similar robotic agricultural harvesting system which includes using a camera on the robot to identify the target plant and to create protected portions ("Exclusion Areas") of the plant which are subsequently avoided during the operation of the harvesting implement. (Moore [0037]: "The CBCS 14 FIG. 2 will start pruning algorithms with the selection of an auto-pruning operation. The front robotic arms 10 will first make several passes using the Power Pruner Assembly 26 FIG. 2 to clear all limbs below the lower profile set for the trees. The DIS cameras 18a,b,c,d (FIG. 5) will process a number of dual digital images of the tree, and the superimposition of these images starting at the trunk will provide the data to develop a vector-based-stick-image of the tree trunk, limbs, and branches. The vector-based-image is represented by the image provided in FIG. 13. The CBCS 14 (FIG. 5) will be assigned a profile algorithm that will consist of about one-third of the tree to each robotic arm. The front robotic arm 10 will be assigned a profile for the lower one-third of the tree, the middle robotic arm 11 will be assigned a profile for the middle one-third of the tree, and the rear robotic arm 12 will be assigned a profile for the top one-third of the tree, and the top profile. The DIS 18 will be imaging the tree as it is pruned with the DIS cameras 18a,b,c,d (FIG. 5). The tubular paths around the stick images of the tree branches represent the exclusion areas for the pruning profiles for the half of the tree being pruned depicted in FIG. 13. When the rear robotic arm 12 completes its pruning profile, the rear MS camera 18e will store and identify the final pruning image. This image data is stored in the GIS database so the data can be retrieved based on the location of the tree. The process is continued to the next tree in the row when the front robotic arm 10 completes running the profile assigned to it, and continues by pruning all limbs below the lower profile set for the trees. The process repeats itself for the next tree in each row as described above.")

[Image: media_image1.png]

It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to modify Koselka to include the "Exclusion Zone" detection/creation of Moore as part of the command generation and control for the arm(s) of Koselka. One would be motivated to implement the exclusion zones of Moore into Koselka to avoid damaging the health (of the main body) of the plants during the pruning operation. (Implicit to Moore [0037]: "The CBCS 14 FIG. 2 will start pruning algorithms with the selection of an auto-pruning operation. The front robotic arms 10 will first make several passes using the Power Pruner Assembly 26 FIG. 2 to clear all limbs below the lower profile set for the trees. ... The DIS 18 will be imaging the tree as it is pruned with the DIS cameras 18a,b,c,d (FIG. 5). The tubular paths around the stick images of the tree branches represent the exclusion areas for the pruning profiles for the half of the tree being pruned depicted in FIG. 13. When the rear robotic arm 12 completes its pruning profile, the rear MS camera 18e will store and identify the final pruning image. This image data is stored in the GIS database so the data can be retrieved based on the location of the tree. The process is continued to the next tree in the row when the front robotic arm 10 completes running the profile assigned to it, and continues by pruning all limbs below the lower profile set for the trees. The process repeats itself for the next tree in each row as described above.")

The combination, however, would still lack teachings for a "trained AI engine" for identifying the protected portions of the plant. Sibley et al. teaches an automated plant harvesting system which includes using a trained AI engine to identify various portions of the detected plants (Sibley [0164]: "In one example, VSLAM can be performed by detecting objects themselves, rather than arbitrary points and patterns detected in an image frame. For example, a whether referred to as keypoints or not, the system can ingest a frame, perform feature extraction and object detection and detect specific known objects with a machine learning model" — from "machine learning model" it is known that the AI engine (learning model) is "trained" in some fashion; [0396]-[0398] give more details as to the implementation/learning of the learning model of [0164]; see also Figure 12A below, which shows the representation of the trained AI engine identifying portions of the plant) and, based on the identified portions, determining the treatment plan (where/what to harvest from the plant) (Sibley [0164]: "… For example, referring to diagram 1300a of FIG. 12A, the agricultural observation and treatment system can detect a plurality of fruitlets, buds, and landmarks in a single frame using a machine learning detector embedded on board the system. From frame to frame, as the treatment system scans the orchard, the image sensors of each of the component treatment modules configured to detect individual objects and landmarks themselves, can detect objects, and match the object detections from frame to frame for the purposes of SLAM and pose estimation of the sensor sensing the object, in addition to determining whether to track an object to perform a treatment action.")

It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to further modify Koselka to substitute the predetermined harvesting/pruning algorithms of Koselka with the neural network (machine learning) model for identifying portions of the target plants as taught by Sibley et al. One would be motivated to implement AI learning to allow for a system which can improve (learn) over time, increasing efficiency of operation. This motivation is implicit to the use of AI learning models as compared to predetermined algorithms.

As modified above, Koselka then renders obvious "wherein the plant is a grape vine and the protected portion of the plant is a cordon of the grape vine." (Koselka [0043]: "Embodiments of the invention enable an agricultural robot system and method of harvesting, pruning, thinning, spraying culling, weeding, measuring and managing of agricultural crops. One approach for automated harvesting of fresh fruits and vegetables, pruning of vines, culling fruit, weeding, measuring and managing of agricultural resources, etc. is to use a robot comprising a machine-vision system containing cameras such as rugged solid-state digital cameras. The cameras may be utilized to identify and locate the fruit on each tree, points on a vine to prune, weeds around plants. In addition, the cameras may be utilized in measuring agricultural parameters or otherwise aid in managing agricultural resources. … the number and size of fruit on the plants and the approximate positions of the fruit on each plant. Alternatively, a robot may map the cordons and canes of grape vines. In such a case, … the picking order for those fruit. Or the robot may use the map of a grape vine along with rules embodied in its software to prune the vine to the 8 best canes per cordon with 2 buds left on each of those canes. Alternatively, or in addition, the map may be used for other purposes other than functional decisions by the robot." Koselka teaches trimming/pruning of grape vines/cordons; from the modification for excluding zones, the logic naturally flows that the unharvested portion of the grape plant is its cordon.)

[Image: media_image2.png]

Regarding Claim 2, modified Koselka teaches "The tool carrier apparatus according to claim 1, wherein the camera is mounted on the adjustable carrier." (Koselka [0125]: "FIG. 3 illustrates an embodiment of a robotic hand. The hand-type actuator includes a camera and light system to locate and track each piece of fruit as it is picked even the fruit located inside the dark interior of some plants. The grabbing mechanism labeled as 'Suction Grabber' may either be a suction cup with an internal vacuum pump as shown or any other grabbing mechanism capable of picking fruit. For fruit whose stems must be cut rather than being pulled off the plant, the hand linkage may comprise an extendable cutter shown as 'Stem Cutting Tool'." As seen in Figure 3 below, a camera is mounted onto/part of the robotic hand/cutting tool, which from [0115] is known to be mounted on the boom(s) (i.e.,
the adjustable carrier); as such, the camera is by extension also mounted on the adjustable carrier.)

[Image: media_image3.png]

Regarding Claim 3, modified Koselka teaches "The tool carrier apparatus according to claim 2, wherein the tool is a cutting tool to work on the plant by cutting a portion of the plant." (Koselka [0125]: "FIG. 3 illustrates an embodiment of a robotic hand. The hand-type actuator includes a camera and light system to locate and track each piece of fruit as it is picked even the fruit located inside the dark interior of some plants. The grabbing mechanism labeled as 'Suction Grabber' may either be a suction cup with an internal vacuum pump as shown or any other grabbing mechanism capable of picking fruit. For fruit whose stems must be cut rather than being pulled off the plant, the hand linkage may comprise an extendable cutter shown as 'Stem Cutting Tool'.")

Regarding Claim 5, modified Koselka teaches "The tool carrier apparatus according to claim 3, wherein the adjustable carrier comprises an adjustable horizontal arm moveable in the horizontal direction, an adjustable vertical arm moveable in the vertical direction with respect to the ground" (Koselka [0115]: "… Because of the size of the robot base, this harvester model has lower front arms. This model also shows the concept of embedded arms, where two arms are mounted on each boom as described above. FIG. 8 shows a front view of an embodiment of a harvester robot showing a vertical boom and coupled arm and FIG. 9 shows a side view of an embodiment of a harvester robot showing horizontally mounted booms and vertical booms with coupled arms." There are embodiments that have horizontal and vertical booms (adjustable arms), as seen in Figure 9 below.) "and an end effector attached to one of the adjustable horizontal arm and the adjustable vertical arm and configured to hold the tool." (Koselka [0125]: "FIG. 3 illustrates an embodiment of a robotic hand. The hand-type actuator includes a camera and light system to locate and track each piece of fruit as it is picked even the fruit located inside the dark interior of some plants. The grabbing mechanism labeled as 'Suction Grabber' may either be a suction cup with an internal vacuum pump as shown or any other grabbing mechanism capable of picking fruit. For fruit whose stems must be cut rather than being pulled off the plant, the hand linkage may comprise an extendable cutter shown as 'Stem Cutting Tool'." At the ends of the booms are the picking arms, which include the cutting tool/head as seen in Figure 3 above.)

[Image: media_image4.png]

Regarding Claim 6, modified Koselka teaches "The tool carrier apparatus according to claim 5, wherein the camera is attached to the end effector by a rigid support" (Koselka [0125]: "FIG. 3 illustrates an embodiment of a robotic hand. The hand-type actuator includes a camera and light system to locate and track each piece of fruit as it is picked even the fruit located inside the dark interior of some plants. The grabbing mechanism labeled as 'Suction Grabber' may either be a suction cup with an internal vacuum pump as shown or any other grabbing mechanism capable of picking fruit. For fruit whose stems must be cut rather than being pulled off the plant, the hand linkage may comprise an extendable cutter shown as 'Stem Cutting Tool'." As can be seen in Figure 3 above, a camera is attached to the end effector in close proximity to the tool.)

Regarding Claim 10, modified Koselka teaches "The tool carrier apparatus according to claim 2, wherein the plant is a crop planted in one of a plurality of rows of the crop" (Koselka Abstract: "An agricultural robot system and method of harvesting, pruning, culling, weeding, measuring and managing of agricultural crops. Uses autonomous and semi-autonomous robot(s) comprising machine-vision using cameras that identify and locate the fruit on each tree, points on a vine to prune, etc." From the term "agricultural crops," that they are planted in rows is implicit, as for many crops row planting is the well-understood, routine, and conventional (WURC) layout for agriculture/mass plantings.) "and the tool is configured to extract a weed disposed between the rows of crops while avoiding damaging the plant." (Koselka [0003]: "Embodiments of the invention described herein pertain to the field of robots. More particularly, but not by way of limitation, embodiments of the invention enable an agricultural robot system and method of robotic harvesting, pruning, culling, weeding, measuring and managing of agricultural crops." In some embodiments the robot of Koselka weeds, which under the plain meaning of the term is the extracting of unwanted plants (weeds) from among groups of wanted plants (crops). Additionally, [0012]: "Embodiments of the invention enable an agricultural robot system and method of harvesting, pruning, thinning, spraying, culling, weeding, measuring and managing of agricultural crops. … The cameras may be utilized to identify and locate the fruit on each tree, points on a vine to prune, weeds around plants" — the phrase "weeds around plants" teaches that the robot is not harming the wanted crops/plants during the weeding operation.)

Regarding Claim 11, modified Koselka teaches "The tool carrier apparatus according to claim 2, wherein the vehicle is a tractor." (Koselka [0017]: "Alternate embodiments of an agricultural robot may comprise semi-autonomous robot(s) that may be coupled with a tractor, boom or trailer for example coupled with an extension link to allow for movement along or about the axis of tractor travel at a velocity other than that of the tractor. Robots are mounted on a tractor, boom or trailer in one or more embodiments of the invention which eliminates or minimizes the drive mechanisms in the robots used in autonomous self-propelled platforms.")

Allowable Subject Matter

Claims 8-9 and 12-18 are allowed. The following is an examiner's statement of reasons for allowance: Regarding Claim 8, no prior art was found to teach or render obvious the tool carrier apparatus with the components and functions as recited in claim 8. As a comparison to the previously cited prior art, the Artificial Intelligence Engine/protected portions of a plant correspond to a grape vine/cordons, not celery. While a celery harvesting robot is taught in the art, in CN 108990531 A, combining the additionally claimed features of the camera and AI engine and their corresponding functions with the celery harvesting robot of CN 108990531 A would require improper hindsight bias.

Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled "Comments on Statement of Reasons for Allowance."

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: CN 108990531 A.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KENNETH MICHAEL DUNNE, whose telephone number is (571) 270-7392. The examiner can normally be reached Mon-Thurs 8:30-6:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Navid Z Mehdizadeh, can be reached at (571) 272-7691. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KENNETH M DUNNE/
Primary Examiner, Art Unit 3669

Prosecution Timeline

Jul 01, 2024: Application Filed
Oct 22, 2025: Non-Final Rejection — §103
Mar 24, 2026: Response Filed
Apr 07, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600262: VEHICLE MANAGING ENERGY AT A LOCATION DURING AN EVENT (granted Apr 14, 2026; 2y 5m to grant)
Patent 12596290: DAY/NIGHT FILTER GLASS FOR AIRCRAFT CAMERA SYSTEMS (granted Apr 07, 2026; 2y 5m to grant)
Patent 12594956: METHOD FOR PROVIDING INFORMATION ON RAINY ENVIRONMENT BY REFERRING TO POINT DATA ACQUIRED FROM A LIDAR SENSOR AND COMPUTING DEVICE USING THE SAME (granted Apr 07, 2026; 2y 5m to grant)
Patent 12590815: INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND COMPUTER PROGRAM PRODUCT (granted Mar 31, 2026; 2y 5m to grant)
Patent 12582041: A FORAGE HARVESTER EQUIPPED WITH A CROP PICK-UP HEADER (granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 76%
With Interview: 87% (+11.1%)
Median Time to Grant: 2y 7m
PTA Risk: Moderate
Based on 285 resolved cases by this examiner. Grant probability derived from career allow rate.
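The with-interview figure is simply the baseline grant probability plus the interview lift. A sketch of that arithmetic (treating the lift as additive, which matches the displayed figures; a real model may not compose this simply):

```python
# Projection arithmetic as displayed in the panel above.
baseline = 0.76          # grant probability (career allow rate)
interview_lift = 0.111   # +11.1 point interview lift

with_interview = baseline + interview_lift
print(f"With interview: {with_interview:.0%}")  # 87%
```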
