Prosecution Insights
Last updated: April 19, 2026
Application No. 18/976,400

WORK VEHICLE AND METHOD FOR CONTROLLING WORK VEHICLE

Non-Final OA • §102 §112
Filed: Dec 11, 2024
Examiner: HAN, CHARLES J
Art Unit: 3665
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Kubota Corporation
OA Round: 1 (Non-Final)
Grant Probability: 68% (Favorable)
OA Rounds: 1-2
To Grant: 3y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 68% (above average)
293 granted / 428 resolved • +16.5% vs TC avg

Interview Lift: +42.9% (strong)
allowance rate with vs. without interview, among resolved cases
Typical Timeline: 3y 4m avg prosecution • 26 currently pending
Career History: 454 total applications across all art units

Statute-Specific Performance

§101: 6.5% (-33.5% vs TC avg)
§103: 38.2% (-1.8% vs TC avg)
§102: 20.8% (-19.2% vs TC avg)
§112: 32.1% (-7.9% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 428 resolved cases

Office Action

§102 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This is a first office action for application Serial No. 18/976,400 filed on 12/11/2024. Claims 1-16 have been examined.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA, the applicant) regards as the invention.
Claim 1 recites: "A work vehicle to perform self-traveling among a plurality of crop rows, the work vehicle comprising: an exterior sensor to output sensor data indicating a distribution of geographic features around the work vehicle; and a controller configured or programmed to control self-traveling of the work vehicle; wherein the controller is configured or programmed to: detect two crop rows existing on opposite sides of the work vehicle based on the sensor data; cause the work vehicle to travel along a path between the two crop rows; during travel, if an end of at least a crop row that corresponds to a turning direction between the two crop rows is detected based on the sensor data, set a coordinate system for turning travel that is fixed to a ground surface and a target point for the turning travel; and control the turning travel toward the target point based on the coordinate system."

This language is vague and indefinite for at least the following reasons:

Means-Plus-Function Language: The following claim limitations invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: “a controller configured or programmed to control …” and “the controller is configured or programmed to: detect … cause … set … control …” However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. Therefore, the claim is indefinite and is rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.

Applicant may:
(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;
(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).

If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:
(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function.

For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.

Intended Use: Alternatively, it is unclear whether the language of the claim is intended to affirmatively require specific performance or whether this language is deliberately articulated as an expression of intended use. For example, it is unclear whether the following language is directed to specifically limiting or intended use limitations: “a controller configured or programmed to control …”; “the controller is configured or programmed to: detect … cause … set … control …”; “an exterior sensor to output sensor data …”; “a coordinate system for turning travel …”; “a target point for the turning travel”. Accordingly, this language does not serve to patentably distinguish the claimed structure over that of the reference.
See In re Pearson, 181 USPQ 641; In re Yanush, 177 USPQ 705; In re Finsterwalder, 168 USPQ 530; In re Casey, 152 USPQ 235; In re Otto, 136 USPQ 458; Ex parte Masham, 2 USPQ2d 1647.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "A work vehicle to perform self-traveling among a plurality of crop rows, the work vehicle comprising: an exterior sensor configured to output sensor data indicating a distribution of geographic features around the work vehicle; and a processor programmed to control self-traveling of the work vehicle; wherein the processor is programmed to: detect two crop rows existing on opposite sides of the work vehicle based on the sensor data; cause the work vehicle to travel along a path between the two crop rows; during travel, when an end of at least a crop row that corresponds to a turning direction between the two crop rows is detected based on the sensor data, set a coordinate system of turning travel of the work vehicle that is fixed to a ground surface and a target point of the turning travel of the work vehicle; and control the turning travel of the work vehicle toward the target point based on the coordinate system."

Claims 2-15 are further rejected as depending on this claim.

Claim 2 recites: "The work vehicle of claim 1, wherein the controller is configured or programmed to: while the work vehicle is traveling between the two crop rows, consecutively generate an obstacle map having a predetermined length and width based on the sensor data; based on the obstacle map, estimate a length, in the obstacle map, of the crop row that corresponds to the turning direction between the two crop rows; and based on a difference between a length of the obstacle map and the length of the crop row in the obstacle map, detect the end of the crop row."
This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claim 1 above.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 1, wherein the processor is programmed to: while the work vehicle is traveling between the two crop rows, consecutively generate an obstacle map having a predetermined length and width based on the sensor data; based on the obstacle map, estimate a length, in the obstacle map, of the crop row that corresponds to the turning direction between the two crop rows; and based on a difference between a length of the obstacle map and the length of the crop row in the obstacle map, detect the end of the crop row."

Claim 3 is further rejected as depending on this claim.

Claim 3 recites: "The work vehicle of claim 2, wherein, if the difference between the length of the obstacle map and the length of the crop row in the obstacle map exceeds a threshold, the controller is configured or programmed to determine that the end of the crop row has been detected."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claim 1 above. Moreover, this language is further rejected as vague and indefinite for at least the following reasons:

Generally Unclear: The expression “if the difference between the length of the obstacle map and the length of the crop row in the obstacle map exceeds a threshold, the controller is configured or programmed to” as used in the claim is vague and indefinite and leaves the reader in doubt as to the meaning of the technical features to which it refers, thereby rendering the definition and scope of the subject-matter of said claim unclear. Namely, it is unclear whether the controller is programmed to conditionally perform an action (i.e.
perform a determination if the difference between the length of the obstacle map and the length of the crop row in the obstacle map exceeds a threshold) or whether the controller is conditionally programmed (i.e. program the controller if the difference between the length of the obstacle map and the length of the crop row in the obstacle map exceeds a threshold).

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 2, wherein the processor is programmed to: when the difference between the length of the obstacle map and the length of the crop row in the obstacle map exceeds a threshold, determine that the end of the crop row has been detected."

Claim 4 recites: "The work vehicle of claim 1, wherein the controller is configured or programmed to determine an origin of the coordinate system based on a position of the work vehicle when the end of the crop row has been detected."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claim 1 above.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 1, wherein the processor is programmed to determine an origin of the coordinate system based on a position of the work vehicle when the end of the crop row has been detected."

Claims 5-7 are further rejected as depending on this claim.

Claim 5 recites: "The work vehicle of claim 4, wherein the coordinate system is defined by a y axis that extends from the origin in a traveling direction of the work vehicle traveling between the two crop rows and by an x axis that extends in a direction that is parallel to the horizontal plane and perpendicular to the y axis; the controller is configured or programmed to estimate an interval between the two crop rows based on the sensor data; and set an integer multiple of the interval as an x coordinate value of the target point."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claims 1 and 3 above.
Moreover, this language is further rejected as vague and indefinite for at least the following reasons:

Generally Unclear: The expression “set an integer multiple of the interval as an x coordinate value of the target point” as used in the claim is vague and indefinite and leaves the reader in doubt as to the meaning of the technical features to which it refers, thereby rendering the definition and scope of the subject-matter of said claim unclear. Namely, it is unclear what is performing the function of setting (e.g. the controller).

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 4, wherein the coordinate system is defined by a y axis that extends from the origin in a traveling direction of the work vehicle traveling between the two crop rows and by an x axis that extends in a direction that is parallel to the horizontal plane and perpendicular to the y axis; and the processor is programmed to: estimate an interval between the two crop rows based on the sensor data; and set an integer multiple of the interval as an x coordinate value of the target point."

Claim 7 is further rejected as depending on this claim.

Claim 6 recites: "The work vehicle of claim 4, wherein the coordinate system is defined by a y axis that extends from the origin in a traveling direction of the work vehicle traveling in between the two crop rows and by an x axis that extends in a direction that is parallel to the horizontal plane and perpendicular to the y axis; and the controller is configured or programmed to set a y coordinate value of the end of the crop row in the coordinate system as a y coordinate value of the target point."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claim 1 above.
Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 4, wherein the coordinate system is defined by a y axis that extends from the origin in a traveling direction of the work vehicle traveling in between the two crop rows and by an x axis that extends in a direction that is parallel to the horizontal plane and perpendicular to the y axis; and the processor is programmed to set a y coordinate value of the end of the crop row in the coordinate system as a y coordinate value of the target point."

Claim 7 recites: "The work vehicle of claim 5, wherein the controller is configured or programmed to set a y coordinate value of the end of the crop row in the coordinate system as a y coordinate value of the target point."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claim 1 above.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 5, wherein the processor is programmed to set a y coordinate value of the end of the crop row in the coordinate system as a y coordinate value of the target point."

Claim 8 recites: "The work vehicle of claim 1, wherein, after setting the target point, if an end of another crop row is detected based on the sensor data, the controller is configured or programmed to modify a position of the target point in accordance with a position relationship between the end of the other crop row and the target point."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claims 1 and 3 above.
Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 1, wherein the processor is programmed to: after setting the target point, when an end of another crop row is detected based on the sensor data, modify a position of the target point in accordance with a position relationship between the end of the other crop row and the target point."

Claim 9 is further rejected as depending on this claim.

Claim 9 recites: "The work vehicle of claim 8, wherein, after setting the target point, the controller is configured or programmed to detect an end of another crop row based on the sensor data, and, if an x coordinate value of the end of the other crop row is smaller than an x coordinate value of the target point and a y coordinate value of the end of the other crop row is greater than a y coordinate value of the target point in the coordinate system, update the y coordinate value of the target point to the y coordinate value of the end of the other crop row."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claim 1 above.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 8, wherein the processor is programmed to: after setting the target point, when an x coordinate value of the end of the other crop row is smaller than an x coordinate value of the target point and a y coordinate value of the end of the other crop row is greater than a y coordinate value of the target point in the coordinate system, update the y coordinate value of the target point to the y coordinate value of the end of the other crop row."
Claim 10 recites: "The work vehicle of claim 1, wherein the controller is configured or programmed to: operate in an inter-row travel mode to cause the work vehicle to travel along a path between the two crop rows and in a turning travel mode to cause the work vehicle to turn in a headland; in the inter-row travel mode, based on the sensor data being consecutively output from the exterior sensor, cause the work vehicle to travel along the path while setting the path in between two crop rows by detecting the two crop rows; and in the turning travel mode, set a turning path on the coordinate system, and cause the work vehicle to travel along the turning path while estimating its own position in the coordinate system based on the sensor data being consecutively output from the exterior sensor."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claim 1 above.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 1, wherein the processor is programmed to: operate in an inter-row travel mode, causing the work vehicle to travel along a path between the two crop rows, and in a turning travel mode, causing the work vehicle to turn in a headland; in the inter-row travel mode, based on the sensor data being consecutively output from the exterior sensor, cause the work vehicle to travel along the path while setting the path in between two crop rows by detecting the two crop rows; and in the turning travel mode, set a turning path on the coordinate system, and cause the work vehicle to travel along the turning path while estimating its own position in the coordinate system based on the sensor data being consecutively output from the exterior sensor."

Claims 11-14 are further rejected as depending on this claim.
Claim 11 recites: "The work vehicle of claim 10, wherein, after setting the coordinate system and the target point in the inter-row travel mode, the controller is configured or programmed to switch to the turning travel mode."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claims 1 and 3 above.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 10, wherein the processor is programmed to: after setting the coordinate system and the target point in the inter-row travel mode, switch to the turning travel mode."

Claims 12-13 are further rejected as depending on this claim.

Claim 12 recites: "The work vehicle of claim 11, wherein, after setting the coordinate system and the target point in the inter-row travel mode, the controller is configured or programmed to determine whether turning is possible or not based on the sensor data, and if it is determined that turning is possible, switch to the turning travel mode."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claims 1 and 3 above.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 11, wherein the processor is programmed to: after setting the coordinate system and the target point in the inter-row travel mode, when it is determined that turning is possible, switch to the turning travel mode."
Claim 13 recites: "The work vehicle of claim 11, wherein, after setting the coordinate system and the target point in the inter-row travel mode, if it is determined based on the sensor data that a space needed for the turning exists and that the work vehicle has passed the end of the crop row, the controller is configured or programmed to switch to the turning travel mode."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claims 1 and 3 above.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 11, wherein the processor is programmed to: after setting the coordinate system and the target point in the inter-row travel mode, when it is determined based on the sensor data that a space needed for the turning exists and that the work vehicle has passed the end of the crop row, switch to the turning travel mode."

Claim 14 recites: "The work vehicle of claim 10, wherein, after setting the target point, if another crop row is detected based on the sensor data, the controller is configured or programmed to modify the turning path in accordance with a position relationship between the other crop row and the turning path."

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claims 1 and 3 above.
Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: "The work vehicle of claim 10, wherein the processor is programmed to: after setting the target point, when another crop row is detected based on the sensor data, modify the turning path in accordance with a position relationship between the other crop row and the turning path."

Claim 15 recites: “The work vehicle of claim 1, wherein the exterior sensor includes one or more LiDAR sensors to output point cloud data as the sensor data.”

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claim 1 above.

Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: “The work vehicle of claim 1, wherein the exterior sensor includes one or more LiDAR sensors configured to output point cloud data as the sensor data.”

Claim 16 recites: “A control method for a work vehicle to perform self-traveling among a plurality of crop rows, the control method comprising: detecting two crop rows existing on opposite sides of the work vehicle based on sensor data that is output from an exterior sensor mounted on the work vehicle; causing the work vehicle to travel along a path between the two crop rows; during travel, if an end of at least a crop row that corresponds to a turning direction between the two crop rows is detected based on the sensor data, setting a coordinate system for turning travel that is fixed to a ground surface and a target point for the turning travel; and controlling the turning travel toward the target point based on the coordinate system.”

This language is also rejected as vague and indefinite for the same reasons discussed in the rejection of claim 1 above.
Moreover, this language is further rejected for at least the following reasons:

Conditional use: The claim contains the following language that is vague and indefinite, as it is unclear whether the scope of this language is intended to affirmatively require specific performance or whether this language is deliberately articulated as a conditional use limitation: “if an end of at least a crop row that corresponds to a turning direction between the two crop rows is detected based on the sensor data, setting a coordinate system for turning travel that is fixed to a ground surface and a target point for the turning travel”.

MPEP 2111.04 states that, “The broadest reasonable interpretation of a method (or process) claim having contingent limitations requires only those steps that must be performed and does not include steps that are not required to be performed because the condition(s) precedent are not met.” See also Ex parte Schulhauser, Appeal 2013-007847 (PTAB April 28, 2016 at pg. 10) (holding in a precedential opinion that “if the condition for performing a contingent step is not satisfied, the performance recited by the step need not be carried out in order for the claimed method to be performed.”).
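The practical effect of this contingent-limitation doctrine on claim 16 can be sketched in code (an editorial illustration only, not part of the Office Action; the step labels are hypothetical):

```python
def claim_16_method(end_of_row_detected: bool) -> list:
    """Under MPEP 2111.04 and Ex parte Schulhauser, the broadest reasonable
    interpretation of a method claim requires a contingent step only when
    its condition precedent is actually met."""
    steps = ["detect two crop rows", "travel between the rows"]
    if end_of_row_detected:  # the contingent limitation of claim 16
        steps.append("set ground-fixed coordinate system and target point")
        steps.append("control turning travel toward the target point")
    return steps

# A traversal in which no row end is ever detected still performs the
# claimed method under this interpretation:
print(claim_16_method(False))  # ['detect two crop rows', 'travel between the rows']
```

This is why, for prior-art purposes, the Examiner can read the conditional "setting" and "controlling" steps out of the method claim while still mapping them for completeness.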
Although the following language does not necessarily cure the issues discussed above, for purposes of examination under 35 USC 102 and 103, Examiner will interpret this language as reading: “A control method for a work vehicle to perform self-traveling among a plurality of crop rows, the control method comprising: detecting two crop rows existing on opposite sides of the work vehicle based on sensor data that is output from an exterior sensor mounted on the work vehicle; causing the work vehicle to travel along a path between the two crop rows; during travel, when an end of at least a crop row that corresponds to a turning direction between the two crop rows is detected based on the sensor data, setting a coordinate system of turning travel of the work vehicle that is fixed to a ground surface and a target point of the turning travel of the work vehicle; and controlling the turning travel of the work vehicle toward the target point based on the coordinate system.”

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bergerman (Bergerman, M., et al., Robot Farmers: Autonomous Orchard Vehicles Help Tree Fruit Production,
IEEE Robotics & Automation Magazine, March 2015). Regarding claim 1, Bergerman discloses a work vehicle to perform self-traveling among a plurality of crop rows (see e.g. at least pg. 54, Introduction and “Agricultural Robotics”, Fig. 1, and related text), the work vehicle comprising: an exterior sensor configured to output sensor data indicating a distribution of geographic features around the work vehicle (e.g. at least Sick LMS111 planar laser scanner, see e.g. at least pg. 55, “Base Autonomous Platform”); and a processor programmed to control self-traveling of the work vehicle (e.g. at least SmallPC SC240ML fanless computer, id.); wherein the processor is programmed (id.) to: detect two crop rows existing on opposite sides of the work vehicle based on the sensor data (see e.g. at least pg. 56-57, “Perception System”, Fig. 4, and related text, identifying tree rows on the left and right sides of the vehicle); cause the work vehicle to travel along a path between the two crop rows (id., see also e.g. at least pg. 55, Table 1., describing control modes that each include “row following”); during travel, when an end of at least a crop row that corresponds to a turning direction between the two crop rows is detected based on the sensor data (id., describing “end-of-row detection”), set a coordinate system of turning travel of the work vehicle that is fixed to a ground surface and a target point of the turning travel of the work vehicle (id., see also e.g. at least pg. 57-58, Fig. 5, and related text, performing pose estimation in two dimensions in order to turn the vehicle at the end of a row, see also e.g. at least pg. 59-61, “Navigation System”, determining the control variables used to navigate turns from a current row to a target row, e.g. steering angle, distance to row center line, lateral offset); and control the turning travel of the work vehicle toward the target point based on the coordinate system (id.). 
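For orientation, the ground-fixed turning coordinate system recited in claims 1 and 4-5 (origin at the vehicle's position when the row end is detected, y axis along the traveling direction, x axis perpendicular in the horizontal plane) can be sketched as follows. This is an editorial sketch with an assumed sign convention (x positive to the vehicle's right); it is neither the applicant's disclosed implementation nor Bergerman's code:

```python
import math

def make_turn_frame(origin_x, origin_y, heading_rad):
    """Fix a coordinate system to the ground at the vehicle's current
    position: y axis along the traveling direction, x axis perpendicular
    to it in the horizontal plane (assumed positive to the right)."""
    def to_frame(px, py):
        dx, dy = px - origin_x, py - origin_y
        y = dx * math.cos(heading_rad) + dy * math.sin(heading_rad)  # forward
        x = dx * math.sin(heading_rad) - dy * math.cos(heading_rad)  # lateral
        return (round(x, 6) + 0.0, round(y, 6) + 0.0)
    return to_frame

# Vehicle at world (10, 5), heading along world +Y, when the row end is detected:
to_frame = make_turn_frame(10.0, 5.0, math.pi / 2)
print(to_frame(10.0, 9.0))  # (0.0, 4.0): a point 4 m straight ahead
print(to_frame(12.5, 5.0))  # (2.5, 0.0): a point 2.5 m to the right
```

Once such a frame is fixed, the target point and turning path of the claims can be expressed in it regardless of how the vehicle subsequently moves.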
Regarding claim 2, Bergerman discloses that the processor is programmed to: while the work vehicle is traveling between the two crop rows, consecutively generate an obstacle map having a predetermined length and width based on the sensor data (see e.g. at least pg. 56-59, “Perception System”, Fig. 4-5, and related text); based on the obstacle map, estimate a length, in the obstacle map, of the crop row that corresponds to the turning direction between the two crop rows (id., see also e.g. at least pg. 59-61, “Navigation System”, detecting a gap with a length larger than a predefined value as an end-of-row condition); and based on a difference between a length of the obstacle map and the length of the crop row in the obstacle map, detect the end of the crop row (id.).

Regarding claim 3, Bergerman discloses that the processor is programmed to: when the difference between the length of the obstacle map and the length of the crop row in the obstacle map exceeds a threshold, determine that the end of the crop row has been detected (see e.g. at least pg. 59-61, “Navigation System”).

Regarding claim 4, Bergerman discloses that the processor is programmed to determine an origin of the coordinate system based on a position of the work vehicle when the end of the crop row has been detected (see e.g. at least pg. 56-59, “Perception System”).

Regarding claim 5, Bergerman discloses that the coordinate system is defined by a y axis that extends from the origin in a traveling direction of the work vehicle traveling between the two crop rows and by an x axis that extends in a direction that is parallel to the horizontal plane and perpendicular to the y axis (see e.g. at least pg. 56-59, “Perception System”); and the processor is programmed to: estimate an interval between the two crop rows based on the sensor data (id., see also e.g. at least pg. 59-61, “Navigation System”); and set an integer multiple of the interval as an x coordinate value of the target point (id.).
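The end-of-row test that the Examiner maps to claims 2-3 reduces to a simple length comparison; a minimal editorial sketch (the numeric values are illustrative, not taken from Bergerman or the application):

```python
def end_of_row_detected(map_length_m, row_length_m, threshold_m):
    """Claims 2-3 as interpreted: the row end is deemed detected when the
    difference between the obstacle map's length and the crop row's
    length inside that map exceeds a threshold."""
    return (map_length_m - row_length_m) > threshold_m

# A 10 m obstacle map in which the detected row extends only 6.5 m:
print(end_of_row_detected(10.0, 6.5, 2.0))  # True
# The row still fills most of the map, so no end is detected yet:
print(end_of_row_detected(10.0, 9.5, 2.0))  # False
```

Bergerman's gap-based end-of-row condition (a gap longer than a predefined value) is the same kind of thresholded length comparison, which is the crux of the anticipation mapping.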
Regarding claim 6, Bergerman discloses that the coordinate system is defined by a y axis that extends from the origin in a traveling direction of the work vehicle traveling in between the two crop rows and by an x axis that extends in a direction that is parallel to the horizontal plane and perpendicular to the y axis (see e.g. at least pg. 56-59, “Perception System”, see also e.g. at least Fig. 6, 8, 14, and related text); and the processor is programmed to set a y coordinate value of the end of the crop row in the coordinate system as a y coordinate value of the target point (id.).

Regarding claim 7, Bergerman discloses that the processor is programmed to set a y coordinate value of the end of the crop row in the coordinate system as a y coordinate value of the target point (see e.g. at least pg. 56-59, “Perception System”, see also e.g. at least Fig. 6, 8, 14, and related text).

Regarding claim 8, Bergerman discloses that the processor is programmed to: after setting the target point, when an end of another crop row is detected based on the sensor data, modify a position of the target point in accordance with a position relationship between the end of the other crop row and the target point (see e.g. at least pg. 56-59, “Perception System”, see also e.g. at least Fig. 6, 8, 14, and related text).

Regarding claim 9, Bergerman discloses that the processor is programmed to: after setting the target point, detect an end of another crop row based on the sensor data, and, when an x coordinate value of the end of the other crop row is smaller than an x coordinate value of the target point and a y coordinate value of the end of the other crop row is greater than a y coordinate value of the target point in the coordinate system, update the y coordinate value of the target point to the y coordinate value of the end of the other crop row (see e.g. at least pg. 56-59, “Perception System”, see also e.g. at least Fig. 6, 8, 14, and related text).
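Likewise, the target-point geometry of claims 5-7 and the update rule of claim 9 can be written in a few lines; an editorial sketch assuming (x, y) tuples in the ground-fixed frame (x lateral, y forward), with illustrative numbers:

```python
def set_target_point(row_interval_m, rows_over, row_end_y_m):
    """Claims 5-7 as interpreted: x of the target is an integer multiple
    of the estimated row interval; y is the y coordinate of the detected
    crop-row end in the ground-fixed frame."""
    return (rows_over * row_interval_m, row_end_y_m)

def update_target(target, other_row_end):
    """Claim 9: if another row's end has a smaller x but a greater y than
    the target, update the target's y to that row end's y."""
    tx, ty = target
    ex, ey = other_row_end
    return (tx, ey) if (ex < tx and ey > ty) else target

target = set_target_point(2.5, 1, 4.0)    # turn into the adjacent row
print(target)                              # (2.5, 4.0)
print(update_target(target, (1.2, 5.1)))   # (2.5, 5.1): y pulled outward
print(update_target(target, (3.0, 5.1)))   # (2.5, 4.0): unchanged
```

The update rule simply ensures the turn clears any neighboring row that protrudes farther than the row whose end triggered the turn.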
Regarding claim 10, Bergerman discloses that the processor is programmed to: operate in an inter-row travel mode, causing the work vehicle to travel along a path between the two crop rows, and in a turning travel mode, causing the work vehicle to turn in a headland (see e.g. at least pg. 59-61, “Navigation System”, see also e.g. at least Fig. 6, 8, 14, and related text, describing at least “Pace Mode”); in the inter-row travel mode, based on the sensor data being consecutively output from the exterior sensor, cause the work vehicle to travel along the path while setting the path in between two crop rows by detecting the two crop rows (id.); and in the turning travel mode, set a turning path on the coordinate system, and cause the work vehicle to travel along the turning path while estimating its own position in the coordinate system based on the sensor data being consecutively output from the exterior sensor (id.).

Regarding claim 11, Bergerman discloses that the processor is programmed to: after setting the coordinate system and the target point in the inter-row travel mode, switch to the turning travel mode (see e.g. at least pg. 59-61, “Navigation System”, see also e.g. at least Fig. 6, 8, 14, and related text, describing at least “Pace Mode”).

Regarding claim 12, Bergerman discloses that the processor is programmed to: after setting the coordinate system and the target point in the inter-row travel mode, determine whether turning is possible or not based on the sensor data, and when it is determined that turning is possible, switch to the turning travel mode (see e.g. at least pg. 59-61, “Navigation System”).
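Claims 10-12 describe a two-mode controller with a guarded transition: the switch to turning happens only after the coordinate system and target point are set and turning is determined to be possible. One way to sketch that state machine; all identifiers are hypothetical:

```python
class RowNavigator:
    # Claims 10-12 (as mapped above): the controller runs in an
    # inter-row travel mode and a turning travel mode, and switches to
    # turning only after the target point is set and turning is
    # determined to be possible from the sensor data.
    def __init__(self):
        self.mode = "inter_row"
        self.target_set = False

    def set_target(self):
        # Stands in for setting the coordinate system and target point.
        self.target_set = True

    def try_switch_to_turning(self, turning_possible):
        if self.target_set and turning_possible:
            self.mode = "turning"
        return self.mode

nav = RowNavigator()
print(nav.try_switch_to_turning(True))   # "inter_row" - no target yet
nav.set_target()
print(nav.try_switch_to_turning(True))   # "turning"
```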
Regarding claim 13, Bergerman discloses that the processor is programmed to: after setting the coordinate system and the target point in the inter-row travel mode, when it is determined based on the sensor data that a space needed for the turning exists and that the work vehicle has passed the end of the crop row, switch to the turning travel mode (see e.g. at least pg. 59-61, “Navigation System”).

Regarding claim 14, Bergerman discloses that the processor is programmed to: after setting the target point, when another crop row is detected based on the sensor data, modify the turning path in accordance with a position relationship between the other crop row and the turning path (see e.g. at least pg. 59-61, “Navigation System”).

Regarding claim 15, Bergerman discloses that the exterior sensor includes one or more LiDAR sensors configured to output point cloud data as the sensor data (see e.g. at least pg. 55, “Base Autonomous Platform”).

Regarding claim 16, Bergerman discloses a control method for a work vehicle to perform self-traveling among a plurality of crop rows (see e.g. at least pg. 54, Introduction and “Agricultural Robotics”, Table 1, and related text), the control method comprising: detecting two crop rows existing on opposite sides of the work vehicle based on sensor data that is output from an exterior sensor mounted on the work vehicle (see e.g. at least pg. 56-57, “Perception System”, Fig. 4, and related text, identifying tree rows on the left and right sides of the vehicle); causing the work vehicle to travel along a path between the two crop rows (id., see also e.g. at least pg. 55, Table 1, describing control modes that each include “row following”); during travel, when an end of at least a crop row that corresponds to a turning direction between the two crop rows is detected based on the sensor data (id., describing “end-of-row detection”), setting a coordinate system of turning travel of the work vehicle that is fixed to a ground surface and a target point of the turning travel of the work vehicle (id., see also e.g. at least pg. 57-58, Fig. 5, and related text, performing pose estimation in two dimensions in order to turn the vehicle at the end of a row, see also e.g. at least pg. 59-61, “Navigation System”, determining the control variables used to navigate turns from a current row to a target row, e.g. steering angle, distance to row center line, lateral offset); and controlling the turning travel of the work vehicle toward the target point based on the coordinate system (id.).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHARLES J HAN whose telephone number is (571)270-3980. The examiner can normally be reached on M-Th and every other F (7:30 AM - 5 PM). If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christian Chace, can be reached on 571-272-4190. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHARLES J HAN/
Primary Examiner, Art Unit 3662

Prosecution Timeline

Dec 11, 2024
Application Filed
Feb 18, 2026
Non-Final Rejection — §102, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596009
VEHICLE DATA SERVICES CONFIGURABLE DEPLOYMENT
2y 5m to grant Granted Apr 07, 2026
Patent 12594928
ELECTRIC WORK VEHICLE
2y 5m to grant Granted Apr 07, 2026
Patent 12584293
SYSTEMS AND METHODS FOR CONTROL OF EXCAVATORS AND OTHER POWER MACHINES
2y 5m to grant Granted Mar 24, 2026
Patent 12576710
TRUCK CABIN
2y 5m to grant Granted Mar 17, 2026
Patent 12565162
VEHICLE POWER OUTLET ADVISORY SYSTEM
2y 5m to grant Granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
68%
Grant Probability
99%
With Interview (+42.9%)
3y 4m
Median Time to Grant
Low
PTA Risk
Based on 428 resolved cases by this examiner. Grant probability derived from career allow rate.
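As the caption notes, the headline grant probability is the examiner's career allow rate, i.e. granted cases over resolved cases from the figures shown above:

```python
# Figures taken from the examiner statistics on this page.
granted, resolved = 293, 428
allow_rate = granted / resolved
print(round(allow_rate * 100))  # 68
```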
