Prosecution Insights
Last updated: April 19, 2026
Application No. 18/468,008

CONSTRUCTION MACHINE AND ASSISTING DEVICE FOR CONSTRUCTION MACHINE

Status: Non-Final OA (§103)
Filed: Sep 15, 2023
Examiner: LEE, TYLER J
Art Unit: 3663
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Sumitomo Construction Machinery Co. Ltd.
OA Round: 3 (Non-Final)
Grant Probability: 92% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 1m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 92% (above average; 863 granted / 938 resolved; +40.0% vs TC avg)
Interview Lift: +6.8% (moderate), based on resolved cases with interview
Avg Prosecution: 2y 1m (fast prosecutor); 25 currently pending
Total Applications: 963 across all art units (career history)
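The headline figures above follow directly from the raw counts in this panel. A minimal sketch of the arithmetic (the counts come from the dashboard; the display rounding convention is an assumption):

```python
# Reproduce the "Career Allow Rate" figure from the examiner's resolved-case
# counts shown above. Rounding to a whole percent is assumed to match the UI.
granted = 863
resolved = 938

allow_rate_pct = 100 * granted / resolved  # ~92.004
print(f"{allow_rate_pct:.0f}%")  # prints 92%
```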

Statute-Specific Performance

§101: 10.2% (-29.8% vs TC avg)
§103: 38.6% (-1.4% vs TC avg)
§102: 30.0% (-10.0% vs TC avg)
§112: 16.4% (-23.6% vs TC avg)
Deltas shown against the Tech Center average estimate • Based on career data from 938 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1, 4, 11, 13, 16, 18, 19, 21 and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Agarwal et al. (Pub. No.: US 2019/0196466 A1) in view of Onuma et al. (Pub. No.: US 2013/0222573 A1).
Regarding claim 1 (Currently Amended), Agarwal teaches an assisting device (12, FIG. 1), the assisting device comprising: a processor (28, FIG. 1) configured to determine whether a predetermined feature exists at a construction site, using an output from a sensor and information previously registered in the processor as one or more predetermined features (“image processor 36 configured to perform object-association by comparing an image 40 from the detector 22 to one or more instances of stored images 38. As used herein, the image 40 may be comparable to a photograph provided by the camera, a radar-return-map provided by the radar, a point-cloud provided by the lidar, or a hybrid/combination of any two or more of the photograph, radar-return-map, and point-cloud. As will be recognized by those in the object classification arts, the stored-images 38 may include thousands of images, each of which having been previously classified as being associated with an instance of an object that may be found at or near the construction-zone 20.” ¶ 12); acquire feature data which is data concerning a position of the predetermined feature (image processor 36 configured to perform object association by comparing an image from the detector to one or more instances of stored images 38, ¶ 12) in response to determining that the predetermined feature exists at the construction site (a construction object in the construction zone is compared to stored images, ¶ 12; the object location is confirmed to be the map location and the object is confirmed as associated with the construction zone, steps 130, 135; FIG. 3); and associate position information with respect to the construction site with the feature data to generate combined data (“Object is localization object?” 120, “Object location = Map location” 130, “Object associated with construction zones?” 135; FIG. 3).
Agarwal is silent as to a construction machine wherein the assisting device controls the construction machine to avoid the construction machine coming into contact with the predetermined feature based on the combined data. However, in a similar field of endeavor, Onuma teaches an obstacle detection unit that detects obstacles present around a working machine similar to machine 24 (FIG. 2) in Agarwal. More specifically, features of an extracted region that represents the whole body of an obstacle are identified (S505-S508, FIG. 7) and a risk level is set for the obstacles in a hazard zone (S2070, FIG. 14). Based on this information, the construction machine may be operated accordingly to avoid contact with any obstacles (e.g., rightward swing, FIG. 15 and ¶ 2). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention to modify the vehicle that determines the combined data based on a construction vehicle on a construction site with surrounding predetermined features taught by Agarwal to be a construction machine that avoids coming into contact with the predetermined feature based on the combined data as taught by Onuma to enhance working efficiency and safety (¶¶ 4-5).

Regarding claims 4 and 13, Agarwal discloses the assisting device and construction machine wherein the processor is configured to: acquire object data which is data concerning a position of an object existing near the construction machine when a distance between the object and the construction machine becomes less than or equal to a predetermined distance, the object being not identified as one of the one or more predetermined features (host vehicle is closer than a threshold distance to the construction zone, ¶ 10); and combine the object data with design data or topography data to generate the combined data (40, 34; FIG. 1).

Regarding claim 11 (Currently Amended), Agarwal teaches a construction machine, comprising: a lower traveling body (24, FIG. 2); an upper swinging body swingably mounted on the lower traveling body (24, FIG. 2); and a processor (28, FIG. 1) configured to determine whether a predetermined feature exists at a construction site, using an output from a sensor and information previously registered in the processor as one or more predetermined features (“image processor 36 configured to perform object-association by comparing an image 40 from the detector 22 to one or more instances of stored images 38. As used herein, the image 40 may be comparable to a photograph provided by the camera, a radar-return-map provided by the radar, a point-cloud provided by the lidar, or a hybrid/combination of any two or more of the photograph, radar-return-map, and point-cloud. As will be recognized by those in the object classification arts, the stored-images 38 may include thousands of images, each of which having been previously classified as being associated with an instance of an object that may be found at or near the construction-zone 20.” ¶ 12); acquire feature data which is data concerning a position of a predetermined feature (image processor 36 configured to perform object association by comparing an image from the detector to one or more instances of stored images 38, ¶ 12) in response to determining that the predetermined feature exists at the construction site (a construction object in the construction zone is compared to stored images, ¶ 12; the object location is confirmed to be the map location and the object is confirmed as associated with the construction zone, steps 130, 135; FIG. 3); and associate position information with respect to the construction site with the feature data to generate combined data (“Object is localization object?” 120, “Object location = Map location” 130, “Object associated with construction zones?” 135; FIG. 3).
Agarwal is silent as to a construction machine wherein the assisting device controls the construction machine to avoid the construction machine coming into contact with the predetermined feature based on the combined data. However, in a similar field of endeavor, Onuma teaches an obstacle detection unit that detects obstacles present around a working machine similar to machine 24 (FIG. 2) in Agarwal. More specifically, features of an extracted region that represents the whole body of an obstacle are identified (S505-S508, FIG. 7) and a risk level is set for the obstacles in a hazard zone (S2070, FIG. 14). Based on this information, the construction machine may be operated accordingly to avoid contact with any obstacles (e.g., rightward swing, FIG. 15 and ¶ 2). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention to modify the vehicle that determines the combined data based on a construction vehicle on a construction site with surrounding predetermined features taught by Agarwal to be a construction machine that avoids coming into contact with the predetermined feature based on the combined data as taught by Onuma to enhance working efficiency and safety (¶¶ 4-5).

Regarding claim 16, Onuma teaches the assisting device, wherein the processor is configured to suppress or stop a movement of a hydraulic actuator of the construction machine to avoid the construction machine coming into contact with the predetermined feature (hydraulic excavator is limited and suppressed, ¶ 68). It would have been obvious to modify Agarwal to suppress or stop a movement of a hydraulic actuator of the construction machine to avoid the construction machine coming into contact with the predetermined feature as taught by Onuma to enhance working efficiency and safety (¶¶ 4-5).
Regarding claims 18 and 21, Agarwal discloses the assisting device and construction machine, wherein the processor is configured to determine whether the predetermined feature exists by applying an image recognition process to image data acquired by the sensor (¶ 12).

Regarding claims 19 and 22, Agarwal discloses the assisting device and construction machine, wherein the processor is configured to determine whether the predetermined feature exists based on information on past work of the construction machine when the predetermined feature is a completed part of construction (FIG. 2).

Claim(s) 3 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Agarwal et al. (Pub. No.: US 2019/0196466 A1) in view of Onuma et al. (Pub. No.: US 2013/0222573 A1) as applied to claim 1 above, and further in view of Izumikawa (Pub. No.: US 2018/0340316 A1).

Regarding claim 3, Agarwal and Onuma are silent as to the assisting device, wherein the predetermined feature is: a completed part of construction that is a part for which construction has been completed; a handrail installed at a top of a slope that is a target of slope shaping work; or makeshift stairs, a wall, an electric wire, a road cone, or a building installed on the slope that is the target of the slope shaping work. However, in the same field of endeavor, Izumikawa teaches an excavator with a terrain image display section for displaying a terrain image of a work site. An excavator may directly face a slope of a work target and a direction deviation display image may be displayed. A slope of a work target may be distinguishable from other surfaces and may be displayed (¶ 101). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Agarwal and Onuma to have predetermined features including a target of slope shaping work as taught by Izumikawa to enhance efficient and safe operation of the operator (¶ 5).
Regarding claim 8, Izumikawa teaches the assisting device wherein the processor is configured to display the combined data on a display different from a display on which image data acquired by the sensor is displayed (first and second displays D3, FIG. 2). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Agarwal and Onuma to have the processor configured to display the combined data on a display different from a display on which image data acquired by the sensor is displayed as taught by Izumikawa to enhance efficient and safe operation of the operator (¶ 5).

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Agarwal et al. (Pub. No.: US 2019/0196466 A1) in view of Onuma et al. (Pub. No.: US 2013/0222573 A1) as applied to claim 1 above, and further in view of Ishimoto (Pub. No.: US 2013/0088593 A1).

Regarding claim 5, Ishimoto teaches the assisting device wherein the feature data is updated at a predetermined time (position information is updated, ¶ 74). It would have been obvious to modify Agarwal and Onuma to have the feature data updated at a predetermined time as taught by Ishimoto to enhance obstacle detection for improved safety (¶ 2).

Claims 6, 7, 10, 14 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Agarwal et al. (Pub. No.: US 2019/0196466 A1) in view of Onuma et al. (Pub. No.: US 2013/0222573 A1) as applied to claim 1 above, and further in view of Yamamoto et al. (Pub. No.: US 2022/0154425 A1).

Regarding claims 6 and 14, Yamamoto teaches the assisting device and construction machine wherein the processor is configured to display the combined data on a display, and display a pre-registered icon at a position of the predetermined feature in a virtual space represented by design data or topographic data (FIG. 3 and 74, 75, FIG. 8).
It would have been obvious to modify Agarwal and Onuma to have the processor configured to display the combined data on a display, and display a pre-registered icon at a position of the predetermined feature in a virtual space represented by design data or topographic data as taught by Yamamoto to enhance obstacle detection for improved safety.

Regarding claims 7 and 15, Yamamoto teaches the assisting device and construction machine wherein the processor is configured to display the combined data on a display device, and display a model of the predetermined feature at a position of the predetermined feature in a virtual space represented by design data or topographic data (FIG. 3 and 74, 75, FIG. 8). It would have been obvious to modify Agarwal and Onuma to display the combined data on a display device, and display a model of the predetermined feature at a position of the predetermined feature in a virtual space represented by design data or topographic data as taught by Yamamoto to enhance obstacle detection for improved safety.

Regarding claim 10, Yamamoto teaches the assisting device wherein the construction machine is an autonomous construction machine (automatically stopped, decelerated, ¶¶ 16, 18). It would have been obvious to modify Agarwal and Onuma so that the construction machine is an autonomous construction machine as taught by Yamamoto to enhance obstacle detection for improved safety.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Agarwal et al. (Pub. No.: US 2019/0196466 A1) in view of Onuma et al. (Pub. No.: US 2013/0222573 A1) as applied to claim 1 above, and further in view of Saiki (Pat. No.: US 11,873,620 B2).

Regarding claim 9, Saiki teaches the assisting device, wherein the construction machine is a remote-controlled construction machine (remote operation controller, FIG. 3).
It would have been obvious to modify Agarwal and Onuma so that the construction machine is remote-controlled as taught by Saiki to enhance flexibility in controlling the construction machine.

Claims 17 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Agarwal et al. (Pub. No.: US 2019/0196466 A1) in view of Onuma et al. (Pub. No.: US 2013/0222573 A1) as applied to claims 1 and 11 respectively above, and further in view of Izumikawa et al. (Pub. No.: US 2020/0291606 A1).

Regarding claims 17 and 20, Izumikawa teaches the assisting device and construction machine, wherein the position information with respect to the construction site is determined in a World Geodetic System (World Geodetic System and for a construction site, ¶ 54). It would have been obvious to modify Agarwal and Onuma so that the position information with respect to the construction site is determined in a World Geodetic System as taught by Izumikawa to enhance location accuracy for mapping.

Response to Arguments

Applicant's arguments filed 11/28/2025 have been fully considered but they are not persuasive. Applicants generally argue that prior art Agarwal fails to teach and render obvious “acquiring feature data which is data concerning a position of the predetermined feature in response to determining that the predetermined feature exists at the construction site.” Examiner respectfully disagrees and would like to reemphasize the portion of paragraph 12 where it recites: “image processor 36 configured to perform object-association by comparing an image 40 from the detector 22 to one or more instances of stored images 38. As used herein, the image 40 may be comparable to a photograph provided by the camera, a radar-return-map provided by the radar, a point-cloud provided by the lidar, or a hybrid/combination of any two or more of the photograph, radar-return-map, and point-cloud.
As will be recognized by those in the object classification arts, the stored-images 38 may include thousands of images, each of which having been previously classified as being associated with an instance of an object that may be found at or near the construction-zone 20.” Examiner interprets the prior art citation to still read on the claimed limitations.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TYLER J LEE whose telephone number is (571) 272-9727. The examiner can normally be reached M-F 7:30-5:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Abby Flynn, can be reached at 571-272-9855. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TYLER J LEE/
Primary Examiner, Art Unit 3663

Prosecution Timeline

Sep 15, 2023
Application Filed
Apr 30, 2025
Non-Final Rejection — §103
Aug 01, 2025
Response Filed
Aug 28, 2025
Final Rejection — §103
Nov 28, 2025
Response after Non-Final Action
Dec 24, 2025
Request for Continued Examination
Feb 02, 2026
Response after Non-Final Action
Feb 12, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601156
WORK MACHINE WITH OPERATOR DISPLAY
2y 5m to grant · Granted Apr 14, 2026
Patent 12594958
MOTION PLANNING WITH IMPLICIT OCCUPANCY FOR AUTONOMOUS SYSTEMS
2y 5m to grant · Granted Apr 07, 2026
Patent 12589730
VEHICLE MOTION MANAGEMENT BASED ON TORQUE REQUEST WITH SPEED LIMIT
2y 5m to grant · Granted Mar 31, 2026
Patent 12590440
SYSTEMS AND METHODS FOR CONTROL OF EXCAVATORS AND OTHER POWER MACHINES
2y 5m to grant · Granted Mar 31, 2026
Patent 12583473
NOTIFICATION CONTROL DEVICE
2y 5m to grant · Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 92%
With Interview: 99% (+6.8%)
Median Time to Grant: 2y 1m
PTA Risk: High
Based on 938 resolved cases by this examiner. Grant probability derived from career allow rate.
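A simple sketch of how the interview-adjusted projection may be derived from the figures above. An additive lift capped at 100% is assumed here; the dashboard's actual model is not disclosed:

```python
# Hypothetical derivation of the "With Interview" projection: the career
# allow rate plus the examiner's observed interview lift, capped at 100%.
# The additive model is an assumption, not the dashboard's documented method.
base_probability = 92.0   # career allow rate, in percent
interview_lift = 6.8      # observed lift with interview, in percent

with_interview = min(base_probability + interview_lift, 100.0)
print(f"{with_interview:.1f}%")  # prints 98.8%, displayed as 99%
```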
