Prosecution Insights
Last updated: April 19, 2026
Application No. 18/572,322

CONTROL SYSTEM OF WORK MACHINE, WORK MACHINE, AND METHOD OF CONTROLLING WORK MACHINE

Final Rejection (§103)
Filed: Dec 20, 2023
Examiner: HINTON, HENRY R
Art Unit: 3665
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Komatsu Ltd.
OA Round: 2 (Final)
Grant Probability: 76% (Favorable)
OA Rounds: 3-4
To Grant: 2y 11m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 76% (35 granted / 46 resolved; +24.1% vs TC avg), above average
Interview Lift: +33.7% (resolved cases with interview), a strong lift
Typical Timeline: 2y 11m average prosecution (24 currently pending)
Career History: 70 total applications across all art units
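
The headline numbers in this card are internally consistent. Below is a minimal sketch of the arithmetic, assuming the dashboard simply divides grants by resolved cases, adds resolved and pending cases for the career total, and reports the allow-rate delta against a Tech Center baseline; the variable names are illustrative, not the tool's actual code.

# Illustrative check of the Examiner Intelligence card; not the dashboard's code.
granted, resolved, pending = 35, 46, 24

career_allow_rate = granted / resolved        # 35 / 46 = 0.7609 -> shown as 76%
total_applications = resolved + pending       # 46 + 24 = 70 total applications

# "+24.1% vs TC avg" read as (examiner rate - TC average), which would imply a
# Tech Center average allow rate of roughly 52% -- an inference, not a stated figure.
implied_tc_avg = career_allow_rate - 0.241

print(f"Career allow rate: {career_allow_rate:.1%}")            # 76.1%
print(f"Total applications: {total_applications}")              # 70
print(f"Implied TC average allow rate: {implied_tc_avg:.1%}")   # ~52.0%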

Statute-Specific Performance

§101: 12.9% (-27.1% vs TC avg)
§103: 54.8% (+14.8% vs TC avg)
§102: 16.3% (-23.7% vs TC avg)
§112: 13.7% (-26.3% vs TC avg)
Tech Center averages are estimates. Based on career data from 46 resolved cases.
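
How to read the "vs TC avg" deltas: under the assumption that each delta is simply the examiner's rate minus the Tech Center average estimate, reversing the subtraction puts the baseline at roughly 40% for every statute shown. That is an inference from the numbers above, not a figure stated in the report; a small sketch:

# Hypothetical reading of the statute-specific figures; values copied from above.
rates  = {"101": 0.129, "103": 0.548, "102": 0.163, "112": 0.137}
deltas = {"101": -0.271, "103": 0.148, "102": -0.237, "112": -0.263}

for statute, rate in rates.items():
    implied_tc_avg = rate - deltas[statute]   # delta assumed to be (rate - TC avg)
    print(f"§{statute}: examiner {rate:.1%}, implied TC average {implied_tc_avg:.1%}")
# Each statute works out to an implied TC average of about 40.0%.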

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment/Arguments

The 11.12.2025 amendments to the claims are entered. Claims 1, 3-4, 7-10, 12, and 14 are amended. Claims 2, 5-6, 11, 13, 15-16 are canceled. Claims 1, 3-4, 7-10, 12, 14, and 17-20 are now pending.

Regarding the Claim Objections: Claims 8 and 12 having been amended to correct the minor informalities therein, the rejections thereof are withdrawn.

Regarding the 35 U.S.C. 112 Rejections: Claim 11 having been canceled, the rejection thereof under 35 U.S.C. 112(d) is withdrawn.

Regarding the 35 U.S.C. 103 Rejections: Claim 1 was amended to include the subject matter of claims 2 and 5-6 and further to include the limitation “wherein the radiation mark includes a plurality of lines extending in the radiation direction from the reference point of the target.” The additional limitation narrows the scope of claim 1 beyond the subject matter considered in the prior Office Action by requiring the radiation mark to include lines and not just any shape or character. This in turn requires recitation in the prior art of not any radial mark, but one including lines. It also requires the processor to calculate a two-dimensional position based on a radial mark including lines. To address the scope change, the examiner has introduced new prior art in the cited portions below. The examiner notes the same scope change applies similarly to the other claims. Therefore, Applicant’s arguments made in the 11.12.2025 Remarks have been fully considered but are moot because the new grounds of rejection do not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3-4, 7-10, 12, 14, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over US 20220389685 A1 to Kovanen, Tuomas et al. (Kovanen), in view of US 20210381197 A1 to Otani, Masaki et al. (Masaki) and “Robust Circular Fiducials Tracking And Camera Pose Estimation Using Particle Filtering” to Ababsa, Fakhreddine et al. (“Ababsa”), further in view of the machine translation available on Google Patents of WO 2018143151 A1 to Tokura, Kentaro et al. (“Tokura”). To aid in finding the relevant passages, the examiner notes that the paragraph numbers marked by “()” are the approximate location of the passages in this translation, provided along with this Action.
Regarding claim 1, Kovanen teaches a control system of a work machine (FIG.1: Excavator 1; [0080]: “When the position determination unit PDU resides in the machine, it may for example be implemented in the control unit 11 of the machine.”) including a traveling body and a turning body (FIG.1: Undercarriage 2a and upper carriage 2b.), the control system comprising a processor (Kovanen [0079]). Kovanen does not appear to expressly teach the processor being configured to: calculate a position and azimuth angle of the turning body based on an image of the plurality of targets and an inclination angle of the turning body.

However, Otani teaches a camera (Otani FIG. 1: Camera 11.) disposed in a cab of an excavator (Otani FIG. 1: Cab 7.) that is in turn mounted on the turning body of the excavator (Otani FIG. 1: Camera 11.). It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have combined the excavator pose determination system that uses cameras for the pose determination taught by the above combination of Kovanen in light of Ababsa with the camera disposed in the cab taught by Otani. Doing so would have improved the reliability of the system by keeping the camera protected from harsh worksite conditions (weather, heat, etc.) in the cab.

A person of ordinary skill in the art would have recognized that this combination of Kovanen and Otani further teaches the processor is configured to calculate a position and an azimuth angle of the turning body based on an image of a plurality of targets ([0121]; FIG.10: “ . . . tracking the machine with the tracking apparatus TA by determining location of at least one reference point RP in the worksite 13 with respect to the tracking apparatus TA . . . and determining by the position determination unit PDU based at least in part on the data received from the tracking apparatus TA the location and orientation of the machine in the worksite 13. . . . there is a visual connection TD_RP between a tracking device TD of the tracking apparatus TA and the reference point RP . . ..” See [0069], where it is explained that the tracking device is a camera, inherently requiring an image of the targets.) installed on an outside of the work machine (FIGS. 10, 11) and an inclination angle of the turning body ([0092]: “The one or more sensors in the machine and/or in the tracking apparatus TA may be at least one of: . . . an inclinometer, . . . camera sensors . . . ”; Claim 20: “ . . . wherein determination by the at least one position determination unit, of the location and orientation of the machine in the worksite is additionally based at least in part on data received from one or more sensors installed on the upper carriage of . . . the machine . . . wherein the sensors comprise . . . inclination . . . of . . . the machine.” APOSITA would have understood in the above combination that Kovanen is relied upon to teach position/orientation determination based on image and inclination data, while Otani is relied upon to teach where the camera is located on the machine. Because both the inclination angle sensor and the camera are disposed on the upper carriage/slewing body of the excavator, APOSITA would have understood that the calculated position/orientation of Kovanen would correspond to the calculated position/orientation of the turning body.); store a three-dimensional position of each of the plurality of targets (Kovanen [0116]: “The location information regarding each reference point RP in the worksite 13 may be input using a wireless or wired I/O device and/or may be retrievable from any known location such as worksite computer, cloud service and/or any computer or memory medium reachable by any wired or wireless network.”; Kovanen [0087]: Reference points located in the WCS, shown in FIG. 1 to be a three-dimensional coordinate system.).

While the above combination of Kovanen and Otani broadly teaches that the stored 3D position of a reference point may be used in combination with a tracked position of the reference point relative to a camera to estimate the location and orientation of an excavator (Kovanen [0066]: Position of RP relative to camera (tracking device) and known position of RP in the WCS calculates the MCS with respect to the WCS, taken as location and orientation.), it does not appear to explicitly teach the processor is configured to calculate a position and an azimuth angle of the turning body based on a three-dimensional position of the target, a two-dimensional position of the target in the image.

However, Ababsa teaches the position/azimuth calculation unit calculates a position and an azimuth angle of the turning body based on a three-dimensional position of the target (Ababsa §I: “In this paper, we . . . estimate the camera pose using circular fiducials . . . The measurements are based on inlier/outlier counts of correspondence matches for a set of known 3D circular targets.”), a two-dimensional position of the target in the image (Ababsa §II, Fig.1: In order to calculate a camera’s pose based on a known 3D fiducial location, the conversion between the fiducial projected onto the image plane and the 3D fiducial location must be calculated (§IV).). In light of Ababsa, one of ordinary skill in the art before the effective filing date of the present invention would have understood that for the pose of a camera based on a fiducial with known 3D pose to be converted into a coordinate system other than its own, such a conversion would be based upon the location of the fiducial in the 2D image taken by the camera and the location of the fiducial in the 3D coordinate system in which the camera pose is desired to be converted. Thus, a person of ordinary skill in the art would have understood in the light of Ababsa, the above combination of Kovanen and Otani teaches the processor is configured to calculate a position and an azimuth angle of the turning body based on a three-dimensional position of the target (Kovanen [0066]), a two-dimensional position of the target in the image (Ababsa: Teaches determining a pose from a fiducial based on a known 3D and imaged 2D position of the fiducial.), and an inclination angle of the turning body (Kovanen Claim 20: Additionally, inclination angle may be used to assist in the pose determination. Furthermore, a camera image of a fiducial inherently has an inclination angle with respect to the world frame as shown in FIG. 1 of Ababsa. Thus, this angle would have also been used to calculate camera pose and subsequently excavator pose.), wherein each of the three-dimensional position and the two-dimensional position of the target includes a three-dimensional position and a two-dimensional position of a reference point defined in the target (Ababsa Fig. 1: Fiducial Fi has centerpoint in three dimensions in the world frame and in two dimensions in the image.).

The above combination of Kovanen, Otani, and Ababsa does not appear to expressly teach wherein the target includes a radiation mark extending in a radiation direction from a reference point of the target, and the processor calculates a two-dimensional position of the reference point based on the radiation mark, and wherein the radiation mark includes a plurality of lines extending in the radiation direction from the reference point of the target.

However, Tokura teaches wherein the target includes a radiation mark extending in a radiation direction from a reference point of the target (Tokura FIG. 1: Target 10a formed at the touch point of dark corner portions that point to the target.), and the processor calculates a two-dimensional position of the reference point based on the radiation mark (Tokura (32): “ . . . the display of the target 10a is performed in a light and shade pattern (that is, by arranging so that a plurality of corner portions of a dark portion such as black points one point) . . . the target 10a is specified at one point . . . ”; Tokura FIG. 2, (38): “Target position calculation means 60 for calculating the position of the target 10a based on the image taken by the imaging means 3;” Understood that the means 60 uses the touch point of the dark corners to find target 10a since the means 60 uses an image of the target display area.), and wherein the radiation mark includes a plurality of lines extending in the radiation direction from the reference point of the target (Tokura FIG. 1: Edges of the dark corner portions 10b, 10c taken as the plurality of lines extending in the radiation direction from target 10a.).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have combined the fiducial whose centerpoint must be calculated to calculate its position taught by the above combination of Kovanen, Otani, and Ababsa with the target comprising a radiation mark including a plurality of edge lines radiating from the centerpoint target 10a of the target display area and processor for calculating the location of the target 10a using at least those lines taught by Tokura. Doing so would have improved the reliability of the calculation of the location of the fiducial by providing another means by which the centerpoint can be acquired.

[Figure 1 of Tokura reproduced in the original Office Action.]

Regarding claim 3, the above combination of Kovanen, Otani, Ababsa, and Tokura further teaches the control system of the work machine according to claim 1, comprising an imaging device that is disposed in the turning body and images the target (Otani [0072]: “ . . . are images that are captured by the main camera 11 disposed in the cab 7 . . .” Camera being disposed in the cab taken as being disposed in the turning body of the excavator. See also FIG. 1 of Otani. APOSITA would have understood in the above combination that the camera located in the cab taught by Otani would have tracked the reference points in the manner taught by Kovanen.), wherein the processor acquires the image from the imaging device (APOSITA would have understood in the above combination that the image from the camera tracking the target in the tracking device of Kovanen would have been acquired from the camera at the position in the cab as taught by Otani.).

Regarding claim 4, the above combination of Kovanen, Otani, Ababsa, and Tokura further teaches the control system of the work machine according to claim 3, wherein the processor calculates a position and an azimuth angle of the imaging device in a site coordinate system based on the three-dimensional position of the target, the two-dimensional position of the target, and the inclination angle of the turning body (Kovanen [0152]: “ . . . the data related to the at least one tracking apparatus TA comprises at least one of: . . . location and orientation of the tracking apparatus TA in at least one of: . . . a worksite coordinate system WCS . . . ” APOSITA would have understood in light of Kovanen and Ababsa that the pose of the tracking apparatus would have been calculated in the worksite coordinate system based on the above data as discussed in claim 1.) and calculates a position and an azimuth angle of the turning body based on the position and the azimuth angle of the imaging device (APOSITA would have understood in light of the above combination that because the camera and inclination angle sensors are mounted on the upper/slewing carriage of the excavator as taught by Otani and Kovanen, respectively, the position and azimuth angle of the turning body is inherently based on the position and azimuth angle of the imaging device.).

Regarding claim 7, the above combination of Kovanen, Otani, Ababsa, and Tokura further teaches the control system of the work machine according to claim 1, wherein the target includes an identification mark (Kovanen [0062]: “The reference marker RM arranged in the worksite 13 may for example be aruco marker, . . .. Each reference marker RM provides at least one reference point RP . . ..” Understood that the reference point is identified using the aruco marker.), the processor stores correlation data indicating a relation between identification data defined by the identification mark and a three-dimensional position of the target (Kovanen [0063]: “Each of the reference points RP are identifiable and the locations of the reference points RP are determined in the worksite coordinate system WCS.”) and the processor acquires a three-dimensional position of the target from the storage unit based on the identification mark in the image (Kovanen [0063]: “Thus, having an identification data of a reference point RP, the location of the reference point RP in the worksite coordinate system WCS may be determined.”).

Regarding claim 8, the above combination of Kovanen, Otani, Ababsa, and Tokura further teaches the control system of the work machine according to claim 1, comprising: an inclination sensor disposed in the turning body ([0154]; Claim 20: “ . . . one or more sensors installed on the upper carriage of . . . the machine . . . compris[ing] . . . inclination . . . ”), wherein the processor calculates an inclination angle of the turning body based on detection data of the inclination sensor (Claim 20: Inclination of the upper carriage inherently calculated from inclination sensor data.), and acquires the inclination angle of the turning body (Claim 20: “ . . . wherein determination, by the at least one position determination unit, of the location and orientation of the machine in the worksite is additionally based in part on data received from one or more sensors . . . ”).

Regarding claim 9, the above combination of Kovanen, Otani, Ababsa, and Tokura further teaches the control system of the work machine according to claim 1, wherein, after calculating the position and the azimuth angle of the turning body, when the turning body performs a turning motion, the processor calculates the position and the azimuth angle of the turning body based on an image of at least one target ([0151]: “On the validity of the determined location and orientation of the machine in the worksite it may be affected by acquiring the data by the at least one environment modelling apparatus and by the at least one tracking apparatus sufficient frequently, in case of the moving machine preferably substantially continuously, so that the determined location and orientation will not be based on very old data.” Understood that if the location/orientation of the excavator is calculated continuously, it will be calculated after the turning body performs a turning motion or any other motion is performed.).

Regarding claim 10, the above combination of Kovanen, Otani, Ababsa, and Tokura further teaches the control system of the work machine according to claim 1, wherein, after calculating the position and the azimuth angle of the turning body, when the turning body performs a turning motion ([0151]: Determination of location/orientation performed continuously, understood this means it will be calculated whenever the turning body turns.), the processor calculates the position and the azimuth angle of the turning body based on detection data of a turning sensor that detects turning of the turning body (Claim 20: “ . . . wherein determination, by the at least one position determination unit, of the location and orientation of the machine in the worksite is additionally based . . . on data received from . . . sensors compris[ing] . . . orientation . . . of the upper carriage . . . of: the machine.”).

Claim 12 is rejected over similar reasons to claim 1, applied to a method. Claim 14 is rejected over being equivalent in scope to claims 3 and 4, applied to a method. Claim 17 is rejected over similar reasons to claim 7, applied to a method. Claim 18 is rejected over similar reasons to claim 8, applied to a method. Claim 19 is rejected over similar reasons to claim 9, applied to a method. Claim 20 is rejected over similar reasons to claim 10, applied to a method.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HENRY RICHARD HINTON whose telephone number is (703)756-1051. The examiner can normally be reached Monday-Friday 7:30-4:30.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hunter Lonsberry, can be reached at (571) 272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HENRY R HINTON/
Examiner, Art Unit 3665

/HUNTER B LONSBERRY/
Supervisory Patent Examiner, Art Unit 3665
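
For orientation on the technology at issue: the claim 1 computation (a position and an azimuth angle of the turning body derived from stored three-dimensional target positions, the targets' detected two-dimensional positions in a camera image, and an inclination angle) is, at bottom, a camera pose estimation from known fiducials. The sketch below illustrates that general kind of calculation with OpenCV's generic solvePnP; it is not the applicant's claimed implementation nor any cited reference's method, and every coordinate, the camera intrinsics, and the yaw extraction are made-up assumptions.

import numpy as np
import cv2

# Known 3D positions of targets in a site coordinate system (meters) and the
# detected 2D reference points in the camera image (pixels). All values below,
# including the camera intrinsics, are illustrative placeholders only.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [6.0, 0.0, 0.0],
    [6.0, 4.0, 0.0],
    [0.0, 4.0, 0.0],
    [3.0, 0.0, 1.5],
    [3.0, 4.0, 1.5],
], dtype=np.float64)
image_points = np.array([
    [410.0, 520.0],
    [860.0, 540.0],
    [905.0, 305.0],
    [385.0, 290.0],
    [640.0, 585.0],
    [650.0, 250.0],
], dtype=np.float64)

camera_matrix = np.array([
    [1200.0,    0.0, 960.0],
    [   0.0, 1200.0, 540.0],
    [   0.0,    0.0,   1.0],
], dtype=np.float64)
dist_coeffs = np.zeros(5)  # assume an undistorted image

# Solve for the camera pose relative to the site coordinate system.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)

R, _ = cv2.Rodrigues(rvec)       # rotation taking site coordinates to camera coordinates
camera_position = -R.T @ tvec    # camera origin expressed in site coordinates

# Azimuth (yaw) of the camera's optical axis projected onto the site XY plane.
# In a system like the one claimed, the turning-body pose would then follow from
# the fixed camera mounting offset, and an inclination (pitch/roll) reading could
# constrain or cross-check the same rotation; neither step is modeled here.
optical_axis_in_site = R.T @ np.array([0.0, 0.0, 1.0])
azimuth_deg = np.degrees(np.arctan2(optical_axis_in_site[1], optical_axis_in_site[0]))

print("camera position (site coords):", camera_position.ravel())
print("azimuth angle (deg):", azimuth_deg)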

Prosecution Timeline

Dec 20, 2023: Application Filed
Aug 15, 2025: Non-Final Rejection (§103)
Nov 12, 2025: Response Filed
Dec 29, 2025: Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601599
SYSTEM AND METHOD FOR IMPROVING THE LINEAR FEATURE AT INTERSECTION LOCATION
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12566066
HYBRID INERTIAL/STELLAR NAVIGATION METHOD WITH HARMONIZATION PERFORMANCE INDICATOR
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12559914
EXCAVATOR MANAGEMENT SYSTEM, MOBILE TERMINAL FOR EXCAVATOR, AND RECORDING MEDIUM
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12523018
Management Apparatus and Management System for Work Machine
Granted Jan 13, 2026 (2y 5m to grant)
Patent 12510897
RETURN NODE MAP
Granted Dec 30, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 76%
With Interview: 99% (+33.7%)
Median Time to Grant: 2y 11m
PTA Risk: Moderate
Based on 46 resolved cases by this examiner. Grant probability derived from career allow rate.
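
One plausible reading of how these projections combine, assuming the "With Interview" figure is simply the base grant probability plus the interview lift capped at 99% (the dashboard's actual model is not documented here):

# Hypothetical projection arithmetic; an assumption, not the tool's documented formula.
base_grant_probability = 0.76    # career allow rate
interview_lift = 0.337           # +33.7% among resolved cases with interview
with_interview = min(base_grant_probability + interview_lift, 0.99)

print(f"With interview: {with_interview:.0%}")   # 99%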
