Prosecution Insights
Last updated: April 19, 2026
Application No. 18/288,431

DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND WORK MACHINE

Final Rejection §102
Filed: Oct 26, 2023
Examiner: CHOW, JEFFREY J
Art Unit: 2618
Tech Center: 2600 (Communications)
Assignee: Komatsu Ltd.
OA Round: 2 (Final)

Grant Probability: 77% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 1m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 77%, above average (502 granted / 655 resolved; +14.6% vs Tech Center average)
Interview Lift: +15.8%, a strong lift, among resolved cases with an interview
Avg Prosecution: 3y 1m
Currently Pending: 27
Total Applications: 682 (across all art units)
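The headline figures above follow from simple arithmetic on the examiner's career counts. A minimal sketch of that arithmetic follows; the additive treatment of the interview lift and Tech Center delta is an assumption for illustration, not the vendor's documented methodology:

```python
# Career counts shown on this page
granted, resolved = 502, 655

# Career allow rate: 502 / 655 = 76.6%, displayed rounded as 77%
allow_rate = granted / resolved

# Reported deltas, assumed here to be simple additive offsets
tc_avg_delta = 0.146      # +14.6% vs Tech Center average
interview_lift = 0.158    # +15.8% lift for cases with an interview

tc_avg = allow_rate - tc_avg_delta            # implied TC average, about 62%
with_interview = allow_rate + interview_lift  # about 92%, matching "With Interview"

print(f"allow rate:     {allow_rate:.1%}")
print(f"implied TC avg: {tc_avg:.1%}")
print(f"with interview: {with_interview:.1%}")
```

Under these assumptions the rounded results reproduce the 77%, 62%-ish baseline, and 92% figures shown in the tiles.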

Statute-Specific Performance

§101: 11.2% (-28.8% vs TC avg)
§103: 40.2% (+0.2% vs TC avg)
§102: 27.1% (-12.9% vs TC avg)
§112: 10.6% (-29.4% vs TC avg)
Tech Center averages are estimates. Based on career data from 655 resolved cases.

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments regarding claims 1 and 4-6, filed 26 August 2025, have been fully considered but they are not persuasive.

Applicant argues that Matsushita et al. (US 2021/0304708) does not disclose anything about indicating a direction corresponding to a predetermined instruction operation of forward or backward movement of an operating device for operating the undercarriage (page 6). Matsushita discloses that pattern J1 can indicate forward and rearward movement directions (paragraph 35). Thus Matsushita teaches wherein the image indicating the traveling attention area includes a direction corresponding to a predetermined instruction operation of forward or backward movement for an operating device for operating the undercarriage.

Applicant argues that Matsushita does not disclose an image (i.e., crossline image L1) indicating a traveling attention area accompanying forward and backward movement of the undercarriage (pages 6-7). The claim recites, "wherein the image indicating the traveling attention area includes an arrow image indicating a length corresponding to the traveling attention area." Matsushita discloses two arrow patterns J1 at a distance apart corresponding to the distance between the center of the left track and the center of the right track (Figures 4A, 4B). Since the distance between the centers of the left track and the right track corresponds to the length/width of the undercarriage, Matsushita discloses an arrow image indicating a length corresponding to the traveling attention area.

Applicant's arguments regarding claim 3, filed 26 August 2025, have been fully considered and are persuasive. The prior art rejection of claim 3 has been withdrawn.
Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 2, and 4-8 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Matsushita et al. (US 2021/0304708).

Regarding independent claim 1, Matsushita teaches a display control device (Figure 1), comprising: a processor (paragraph 19: CPU) configured to:

generate an overhead image around a work machine (paragraph 39: the control unit 210 is preferably adapted to receive sensor data relating a surrounding of the working machine 100, such as from a camera arrangement 216; Figure 1: camera arrangement 216 is located on the top portion of the cab) including an undercarriage and an upper swiveling body swivelably supported by the undercarriage (paragraph 18 and Figure 1: upper turning body 2 that turns with respect to the lower traveling body 3) based on one or a plurality of captured images imaged by one or a plurality of imaging devices (paragraph 21: The detection unit 20 is, for example, a scanning type distance measurement device such as light detection and ranging (LiDAR), or an imaging camera, and acquires an image (captured image, distance image, or the like) around the work machine 1. The detection units 20 may be provided at a plurality of locations on the work machine 1. The detection unit 20 acquires data for generating a peripheral image of at least a region behind and regions on the right and left of the work machine 1 from the inside to the outside of the turning radius of the work machine 1) provided in the upper swiveling body (paragraph 19: The periphery display device 40 includes an interface 43 that receives information from the detection unit 20 and a control unit for the work machine 1, a bird's-eye view image creation unit 42 that creates a bird's-eye view image around the work machine 1 based on the information received by the interface 43, and a display unit 41 such as a liquid crystal monitor that outputs the created bird's-eye view image);

generate a display image in which an image indicating a swiveling attention area accompanying swiveling of the upper swiveling body (paragraph 23 and Figure 3: In the bird's-eye view image data, the work machine 1, a first peripheral line C1 indicating the turning radius of the work machine 1, a second peripheral line C2 at 1 m outward from the turning radius, and a third peripheral line C3 at 2 m outward from the turning radius are illustrated by imaginary lines. At the worksite, a region between the first peripheral line C1 and the second peripheral line C2 corresponds to a warning region requiring high caution, and a region between the second peripheral line C2 and the third peripheral line C3 corresponds to a caution region requiring medium caution. The bird's-eye view image includes displays of a plurality of objects H1 to H4 detected by the detection unit 20. The objects detected by the detection unit 20 include a person such as a worker or a pedestrian, an obstacle, and the like) and an image indicating a traveling attention area accompanying forward and backward movement of the undercarriage are superimposed on the overhead image (paragraph 35 and Figures 4A, 4B: Further, the bird's-eye view image creation unit 42 causes the bird's-eye view image E1 to include a pattern J1 indicating a forward movement direction of the lower traveling body 3. For example, the bird's-eye view image creation unit 42 adds the pattern J1 to the picture B of the work machine 1 in an overlapping manner. Therefore, when the upper turning body 2 turns and the lower traveling body 3 in the bird's-eye view image E1 turns, the forward movement direction of the lower traveling body 3 is indicated by the pattern J1. Accordingly, when the operator causes the lower traveling body 3 to travel, the operator can see the bird's-eye view image E1 to identify a traveling direction without confusion. The pattern J1 may not indicate the forward movement direction, but indicate a rearward movement direction); and

output the display image (Figures 3-8: image of C1 and images of J1 are combined together),

wherein the image indicating the traveling attention area includes an arrow image indicating a length corresponding to the traveling attention area (Figures 4A, 4B: two arrows J1 at a distance apart corresponding to the distance between the center of the left track and the center of the right track) and a direction corresponding to a predetermined instruction operation of forward or backward movement for an operating device for operating the undercarriage (paragraph 35 and Figures 4A, 4B: the bird's-eye view image creation unit 42 causes the bird's-eye view image E1 to include a pattern J1 indicating a forward movement direction of the lower traveling body 3. The pattern J1 may not indicate the forward movement direction, but indicate a rearward movement direction).

Regarding dependent claim 4, Matsushita teaches wherein the processor is configured to: generate a traveling direction image corresponding to a traveling direction of the undercarriage based on the one or the plurality of captured images, and output the traveling direction image together with the display image (paragraph 35: when the operator causes the lower traveling body 3 to travel, the operator can see the bird's-eye view image E1 to identify a traveling direction without confusion; Figure 3: traveling direction icon J1 aligned to the track of the lower traveling body 3).

Regarding claims 5 and 6: claims 5 and 6 are similar in scope to claim 1, thus the rejections for claim 1 hereinabove are applicable to claims 5 and 6. Matsushita teaches a work machine (Figure 1: work machine 1), comprising: an undercarriage (Figure 1: lower traveling body 3); an upper swiveling body (Figure 1: upper turning body 2) swivelably supported by the undercarriage (paragraph 18 and Figure 1: upper turning body 2 that turns with respect to the lower traveling body 3); one or a plurality of imaging devices (Figure 1: detection units 20) provided in the upper swiveling body (paragraph 21: The detection unit 20 is, for example, a scanning type distance measurement device such as light detection and ranging (LiDAR), or an imaging camera, and acquires an image (captured image, distance image, or the like) around the work machine 1. The detection units 20 may be provided at a plurality of locations on the work machine 1); and a display control device (Figure 1: periphery display device 40 comprising display unit 41).

Allowable Subject Matter

Claim 3 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEFFREY J CHOW, whose telephone number is (571) 272-8078. The examiner can normally be reached 11AM-7PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Devona Faulk, can be reached at 571-272-7515. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JEFFREY J CHOW/
Primary Examiner, Art Unit 2615

Prosecution Timeline

Oct 26, 2023: Application Filed
May 23, 2025: Non-Final Rejection (§102)
Aug 26, 2025: Response Filed
Sep 30, 2025: Final Rejection (§102, current)

Precedent Cases

Applications granted by this examiner involving similar technology

Patent 12602845: UNIVERSAL STATE REPRESENTATIONS OF VISUALIZATIONS FOR DIFFERENT TYPES OF DATA MODELS
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12591949: IMAGE GENERATION USING ONE OR MORE NEURAL NETWORKS
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12586305: 3D REFERENCE POINT DETECTION FOR SURVEY FOR VENUE MODEL CONSTRUCTION
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12586267: INTERACTION METHOD AND APPARATUS IN LIVE STREAMING ROOM, DEVICE, AND STORAGE MEDIUM
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12579735: A VISUALIZATION SYSTEM FOR CREATING A MIXED REALITY GAMING ENVIRONMENT
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 77%
With Interview: 92% (+15.8%)
Median Time to Grant: 3y 1m
PTA Risk: Moderate

Based on 655 resolved cases by this examiner. Grant probability derived from career allow rate.
