Prosecution Insights
Last updated: April 19, 2026
Application No. 18/422,417

OPTICAL ALIGNMENT METHOD AND SYSTEM FOR TRANSPARENT DISPLAYS TO OVERCOME PARALLAX

Final Rejection — §102, §103

Filed: Jan 25, 2024
Examiner: MILLER, PRESTON JAY
Art Unit: 3661
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Eagle Technology LLC
OA Round: 2 (Final)

Grant Probability: 56% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 1m
Grant Probability with Interview: 75%

Examiner Intelligence

Career Allow Rate: 56% (28 granted / 50 resolved; +4.0% vs TC avg)
Interview Lift: +18.8% (strong; allowance rate with vs. without an interview, among resolved cases with interview)
Avg Prosecution: 3y 1m (typical timeline; 39 applications currently pending)
Total Applications: 89 (career history, across all art units)

Statute-Specific Performance

§101: 17.7% (-22.3% vs TC avg)
§103: 48.0% (+8.0% vs TC avg)
§102: 15.3% (-24.7% vs TC avg)
§112: 17.0% (-23.0% vs TC avg)
Tech Center averages are estimates. Based on career data from 50 resolved cases.
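
Each delta is stated relative to the Tech Center baseline, so the baseline itself can be recovered by simple subtraction. The snippet below is a minimal sketch of that arithmetic; the rates and deltas are copied from the card above, and reading them as this examiner's per-statute rates versus the estimated TC average is the only assumption.

```python
# Minimal sketch: recover the implied Tech Center baseline for each statute
# from the examiner's per-statute rate and the displayed "vs TC avg" delta.
examiner_rate = {"§101": 17.7, "§103": 48.0, "§102": 15.3, "§112": 17.0}    # percent
delta_vs_tc   = {"§101": -22.3, "§103": 8.0, "§102": -24.7, "§112": -23.0}  # percentage points

for statute, rate in examiner_rate.items():
    tc_avg = rate - delta_vs_tc[statute]  # baseline = examiner rate minus delta
    print(f"{statute}: examiner {rate:.1f}%, implied TC average {tc_avg:.1f}%")
```

Under that reading, all four figures are consistent with a Tech Center baseline of roughly 40%.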

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

2. This office action is in response to Amendments and Remarks filed on 11/17/2025 for application number 18/422,417 filed on 01/25/2024, in which claims 1-20 were previously presented for examination.

3. Claim(s) 19 has/have been canceled, and claim(s) 1, 6, 7, 12, 16-17, and 20 has/have been amended. Accordingly, claim(s) 1-18 and 20 is/are currently pending.

Prior Art of Record

4. The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested of the applicant, in preparing responses, to fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. The prompt development of a clear issue requires that the replies of the Applicant meet the objections to and rejections of the claims. Applicant should also specifically point out the support for any amendments made to the disclosure (see MPEP §2163.06). Applicant is reminded that the Examiner is entitled to give the Broadest Reasonable Interpretation (BRI) of the language of the claims. Furthermore, the Examiner is not limited to Applicant's definition which is not specifically set forth in the claims. SEE MPEP 2141.02 [R-07.2015] VI. PRIOR ART MUST BE CONSIDERED IN ITS ENTIRETY, INCLUDING DISCLOSURES THAT TEACH AWAY FROM THE CLAIMS: A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP §2123.

Response to Arguments

5. Applicant's arguments filed 11/17/2025 have been fully considered but they are not persuasive.

6. Applicant is advised that if they keep amending the independent claims differently from each other, then the application might require restriction.

7. Applicant argues the amended claim(s) 1 is/are allowable over Lim (US-20240144612-A1) in view of Kashter et al. (WO-2023031933-A1) and the other cited references. Applicant continues, independent claim 1 has been amended to incorporate features to distinguish the claim from the prior art references. The cited references fail to teach the newly amended features of computing an equation of a straight line that represents a line-of-sight from the eye position to the external object position through the transparent display, based on the eye position, the vehicle position, and the external object position; computing a parallax-free position on the transparent display where the line-of-sight intersects the transparent display based on the position of the transparent display; and controlling the transparent display to position a graphic object on the transparent display at the parallax-free position to align the eye position, the graphic object, and the external object to avoid parallax when the eyes view the external object through the transparent display.

8. Indeed, these references do not teach the newly amended feature(s) mentioned above. As such, this amendment has necessitated a new reference Berkovich et al. (US-20110052009-A1) which upon review was found to also teach a number of other features of the independent and dependent claims and therefore has fully replaced Lim (US-20240144612-A1) and Kashter et al. (WO-2023031933-A1), and these references are no longer relied upon for rejection of the claims. Applicant is referred to the Claim Rejections - 35 USC § 102 and Claim Rejections - 35 USC § 103 sections below.

9. As such, this argument is moot.

10. Applicant argues independent claim(s) 12 has/have been amended similar to independent claim 1 and it/they is/are allowable for reasons similar to those presented in favor of patentability of claim 1.

11. This argument is unpersuasive as each independent claim has been fully rejected and for the reasons given above.

12. Applicant argues dependent claim(s) 2-11, and 13-15 is/are patentable by the virtue of their dependency on independent claims 1 or 12 and the additional features recited in the dependent claims.

13. This argument is unpersuasive as each independent claim and dependent claim has been fully rejected and for the reasons given above.

14. In regard to claim 16 and its dependent claims, Applicant is referred to the Allowable Subject Matter section.

15. This argument is unpersuasive as each independent claim and dependent claim has been fully rejected and for the reasons given above.

Claim Rejections - 35 USC § 102

16. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

17. Claim(s) 1-6 and 10-15 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Berkovich et al. (US-20110052009-A1).

In regard to claim 1, Berkovich teaches a method performed by a transparent display alignment system in a vehicle and including a transparent display through which eyes of a user behind the transparent display view an external scene in front of the transparent display, comprising: ([0016] A method for providing a spatially aligned head-up display to a user viewing a scene through a display, the method including the steps of sampling images of the user's face and processing the images to determine a position and attitude of the user's face and determining a viewing direction from at least one eye of the user to a point of interest within the scene and displaying on a display a visible indication aligned with the viewing direction that provides a spatially aligned head-up display to the user.)

receiving an eye position of the eyes, a vehicle position for the vehicle, a position of the transparent display, and an external object position of an external object in the external scene; (Fig. 2, [0020] Determining a viewing direction includes the steps of tracking a geographical position of the vehicle. [0078] The location detection system is a global positioning system (GPS) based system. [0090-0092] At block 208, the position and attitude of the user's face is determined. Using the position and attitude of the user's face a viewing direction is determined, in block 210, from at least one eye of the user to a point of interest within the scene. After determining the viewing direction, in block 210, the location on the display where the view of the user intercepts the display needs to be determined, shown in block 212, and referred to as the visible indication location. Examiner notes, the point of interest is the external object. Examiner notes, the position of the transparent display is stationary and known.)

computing an equation of a straight line that represents a line-of-sight from the eye position to the external object position through the transparent display, based on the eye position, the vehicle position, and the external object position; (Fig. 3A, [0073] Using the position and attitude of the user's face the control system determines a line of sight from at least one eye of the user 105, or in other words, where are the person's eyes and where are the eyes looking. Position refers to the three-dimensional location of the user's face relative to a given reference. Examiner notes, determining a line of sight is computing an equation of a straight line that represents a line-of-sight from the eye position to the external object position through the transparent display, based on the eye position, the vehicle position, and the external object position. The line of sight for an object depends on the field of view, which depends on the location of the vehicle.)

computing a parallax-free position on the transparent display where the line-of-sight intersects the transparent display based on the position of the transparent display; and (Fig. 3A (reproduced and annotated below for Applicant's convenience), [0094] Using both a right eye 300A and a left eye 300B a user looks through a display 302 at a point of interest 304. The viewing direction 306A from the right eye 300A intercepts the display in location 308A. Similarly, the viewing direction 306B from the left eye 300B intercepts the display in location 308B.)

[Annotated Fig. 3A of Berkovich]

controlling the transparent display to position a graphic object on the transparent display at the parallax-free position to align the eye position, the graphic object, and the external object to avoid parallax when the eyes view the external object through the transparent display. (Fig. 1, [0064] The control system processes the data from the image sensing system 102 to determine a position and attitude of the user's face 104. Next, the control system determines a viewing direction 105 from at least one eye of the user to a point of interest within the scene 116. The control system uses the viewing direction to actuate the display 108 to display a visible indication 112, 114 aligned with the viewing direction. The display displays visible indications to the user superimposed on the scene. Examiner notes, as mentioned above, the point of interest is the external object.)

In regard to claim 2, Berkovich teaches the method of claim 1, wherein positioning includes positioning the graphic object at the parallax-free position to overlap the external object when viewed through the transparent display. (Fig. 1, [0064] The control system determines a viewing direction 105 from at least one eye of the user to a point of interest within the scene 116. The control system uses the viewing direction to actuate the display 108 to display a visible indication 112, 114 aligned with the viewing direction. The display displays visible indications to the user superimposed on the scene. Examiner notes, the visible indication is the graphic object at the parallax-free position to overlap the external object when viewed through the transparent display.)

In regard to claim 3, Berkovich teaches the method of claim 1, wherein the transparent display is fixed in position to the vehicle and physically detached from the user to allow relative movement between the user and the transparent display. (Fig. 3A, [0024] Using both a right eye 300A and a left eye 300B a user looks through a display 302 at a point of interest 304. Examiner notes, as mentioned above and illustrated by Fig. 3A, the transparent display is fixed in position to the vehicle and physically detached from the user to allow relative movement between the user and the transparent display.)

In regard to claim 4, Berkovich teaches the method of claim 1, wherein: receiving the eye position includes receiving the eye position as a tracked eye position from an eye tracker that tracks the eye position. ([0073] Using the position and attitude of the user's face the control system determines a line of sight from at least one eye of the user 105, or in other words, where are the person's eyes and where are the eyes looking.)

In regard to claim 5, Berkovich teaches the method of claim 4, further comprising: receiving a new tracked eye position from the eye tracker responsive to movement of the user relative to the transparent display; (Fig. 2, [0020] Determining a viewing direction includes the steps of tracking a geographical position of the vehicle. [0078] The location detection system is a global positioning system (GPS) based system. [0090-0092] At block 208, the position and attitude of the user's face is determined. Using the position and attitude of the user's face a viewing direction is determined, in block 210, from at least one eye of the user to a point of interest within the scene. After determining the viewing direction, in block 210, the location on the display where the view of the user intercepts the display needs to be determined, shown in block 212, and referred to as the visible indication location. Examiner notes, the point of interest is the external object. Examiner notes, the position of the transparent display is stationary and known.)

computing a new parallax-free position on the transparent display that intersects with a new line-of-sight from the new tracked eye position to the external object position through the transparent display; and (Fig. 3A, [0073] Using the position and attitude of the user's face the control system determines a line of sight from at least one eye of the user 105, or in other words, where are the person's eyes and where are the eyes looking. Position refers to the three-dimensional location of the user's face relative to a given reference. Examiner notes, determining a line of sight is computing an equation of a straight line that represents a line-of-sight from the eye position to the external object position through the transparent display, based on the eye position, the vehicle position, and the external object position. The line of sight for an object depends on the field of view, which depends on the location of the vehicle.)

repositioning the graphic object on the transparent display at the new parallax-free position to align the new tracked eye position, the graphic object, and the external object to avoid the parallax. (Fig. 1, [0064] The control system processes the data from the image sensing system 102 to determine a position and attitude of the user's face 104. Next, the control system determines a viewing direction 105 from at least one eye of the user to a point of interest within the scene 116. The control system uses the viewing direction to actuate the display 108 to display a visible indication 112, 114 aligned with the viewing direction. The display displays visible indications to the user superimposed on the scene. Examiner notes, as mentioned above, the point of interest is the external object.)

In regard to claim 6, Lim, as modified by Kashter, teaches the method of claim 1, wherein: computing the parallax-free position on the transparent display includes translating the parallax-free position to pixel coordinates of the transparent display, (Fig. 3A, [0094] Using both a right eye 300A and a left eye 300B a user looks through a display 302 at a point of interest 304. The viewing direction 306A from the right eye 300A intercepts the display in location 308A. Similarly, the viewing direction 306B from the left eye 300B intercepts the display in location 308B. Examiner notes, the location of 308A and 308B is the parallax-free position on the transparent display. As illustrated by Fig. 3A, the 308A and 308B are on the display. That is translating the parallax-free position to pixel coordinates of the transparent display.) wherein controlling the transparent display includes positioning the graphic object at the pixel coordinates on the transparent display. (Fig. 3A, [0094] Using both a right eye 300A and a left eye 300B a user looks through a display 302 at a point of interest 304. The viewing direction 306A from the right eye 300A intercepts the display in location 308A. Similarly, the viewing direction 306B from the left eye 300B intercepts the display in location 308B. Examiner notes, the point of interest 304 is the viewed graphic object that is displayed on the screen at 308A and 308B.)

In regard to claim 10, Berkovich teaches the method of claim 1, wherein: receiving includes receiving the external object position from one of a sensor mounted on the vehicle and an object and terrain database. (Fig. 1, [0078] The location detection system includes an image sensor and processing to correlate a sampled image of a scene with known images and positions of scenes. In another implementation, the texture of the surrounding scene can be captured and correlated with images from a geographical database in conjunction with a digital terrain map (DTM) to determine the location of the platform. Examiner notes, an image sensor is a sensor mounted on the vehicle.)

In regard to claim 11, Berkovich teaches the method of claim 1, wherein: receiving includes receiving the vehicle position from a global positioning system receiver. (Fig. 1, [0078] A location detection system 118, or geo-positioning device, provides information on the geographical position of the platform, and hence the location of the user relative to the scene of the world around the user. The location detection system is a global positioning system (GPS) based system.)

In regard to claim 12, Berkovich teaches a transparent display alignment system for in a vehicle, comprising: ([0009] & [0072] A system for providing a spatially aligned head-up display to a user viewing a scene through a display the system. The display is a transparent device that allows the user to view a scene on the opposite side of the display.) a transparent display through which eyes of a user, when positioned behind the transparent display, view an external scene in front of the transparent display; and ([0016] A method for providing a spatially aligned head-up display to a user viewing a scene through a display, the method including the steps of sampling images of the user's face and processing the images to determine a position and attitude of the user's face and determining a viewing direction from at least one eye of the user to a point of interest within the scene and displaying on a display a visible indication aligned with the viewing direction that provides a spatially aligned head-up display to the user.) a controller configured to perform: (Fig. 1, and [0064] The control system 106 contains at least one processor, and is configured to provide processing for the system.)

receiving an eye position of the eyes, a vehicle position for the vehicle, a position of the transparent display, and an external object position of an external object in the external scene; (Fig. 2, [0020] Determining a viewing direction includes the steps of tracking a geographical position of the vehicle. [0078] The location detection system is a global positioning system (GPS) based system. [0090-0092] At block 208, the position and attitude of the user's face is determined. Using the position and attitude of the user's face a viewing direction is determined, in block 210, from at least one eye of the user to a point of interest within the scene. After determining the viewing direction, in block 210, the location on the display where the view of the user intercepts the display needs to be determined, shown in block 212, and referred to as the visible indication location. Examiner notes, the point of interest is the external object. Examiner notes, the position of the transparent display is stationary and known.)

computing an equation of a straight line that represents a line-of-sight from the eye position to the external object position through the transparent display, based on the eye position, the vehicle position, and the external object position; (Fig. 3A, [0073] Using the position and attitude of the user's face the control system determines a line of sight from at least one eye of the user 105, or in other words, where are the person's eyes and where are the eyes looking. Position refers to the three-dimensional location of the user's face relative to a given reference. Examiner notes, determining a line of sight is computing an equation of a straight line that represents a line-of-sight from the eye position to the external object position through the transparent display, based on the eye position, the vehicle position, and the external object position. The line of sight for an object depends on the field of view, which depends on the location of the vehicle.)

computing a parallax-free position on the transparent display where the line-of-sight intersects the transparent display based on the position of the transparent display; and (Fig. 3A (reproduced and annotated above for Applicant's convenience), [0094] Using both a right eye 300A and a left eye 300B a user looks through a display 302 at a point of interest 304. The viewing direction 306A from the right eye 300A intercepts the display in location 308A. Similarly, the viewing direction 306B from the left eye 300B intercepts the display in location 308B.)

controlling the transparent display to position a graphic object on the transparent display at the parallax-free position to align the eye position, the graphic object, and the external object to avoid parallax when the eyes view the external object through the transparent display. (Fig. 1, [0064] The control system processes the data from the image sensing system 102 to determine a position and attitude of the user's face 104. Next, the control system determines a viewing direction 105 from at least one eye of the user to a point of interest within the scene 116. The control system uses the viewing direction to actuate the display 108 to display a visible indication 112, 114 aligned with the viewing direction. The display displays visible indications to the user superimposed on the scene. Examiner notes, as mentioned above, the point of interest is the external object.)

In regard to claim 13, Berkovich teaches the transparent display alignment system of claim 12. Claim 13 recites a system having substantially the same features of claim 2 above, therefore claim 13 is rejected for the same reasons as claim 2.

In regard to claim 14, Berkovich teaches the transparent display alignment system of claim 12. Claim 14 recites a system having substantially the same features of claim 3 above, therefore claim 14 is rejected for the same reasons as claim 3.

In regard to claim 15, Berkovich teaches the transparent display alignment system of claim 12. Claim 15 recites a system having substantially the same features of claim 4 above, therefore claim 15 is rejected for the same reasons as claim 4.

Claim Rejections - 35 USC § 103

18. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

19. Claim(s) 7-9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Berkovich et al. (US-20110052009-A1) in view of Jiang et al. (US-20240087491-A1).

In regard to claim 7, Berkovich teaches the method of claim 6. Berkovich does not teach further comprising: normalizing the eye position, the vehicle position, the external object position, and the position of the transparent display to corresponding positions in a common three-dimensional reference coordinate system having a common origin, wherein computing includes computing the parallax-free position on the transparent display based on the corresponding positions in the common three-dimensional reference coordinate system. However, Jiang teaches an AR-HUD in the vehicle or another point with a fixed position in the vehicle is used as the origin to construct a real coordinate system and a virtual coordinate system, and a correspondence between the virtual coordinate system and the real coordinate system is determined. The real coordinate system is a coordinate system of real three-dimensional space, and is used to determine a real position of a human eye, a virtual image plane of the AR-HUD, a calibration object, or the like in the real world. The virtual coordinate system is a coordinate system of virtual three-dimensional space, and is used to determine a virtual position of the human eye, the virtual image plane of the AR-HUD, the calibration object, or the like in the real world, to render a three-dimensional AR effect ([0088]). The position of the human eye in the virtual coordinate system is used as the origin ([0091]). Converting the real coordinate system and a virtual coordinate system is normalizing the positions in a common three-dimensional reference coordinate system having a common origin. The virtual image plane of the AR-HUD is the known position of the transparent display. It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the spatially aligned head-up display of Berkovich by incorporating the teachings of Jiang, such that the positions, such as eye, vehicle, and object positions, are converted into a virtual coordinate system. The motivation to modify is, as acknowledged by Jiang, to ensure that the display picture of the HUD is always fused with a real world under a premise that the driver has different sitting postures or the vehicle is driven by different drivers ([0005]), which one of ordinary skill would have recognized allows the HUD to become more useful.

In regard to claim 8, Berkovich, as modified by Jiang, teaches the method of claim 7. Further, Jiang teaches information such as the detected human eye, the calibration object, and an installation position and a projection angle of the AR-HUD is introduced into the real coordinate system based on the constructed real coordinate system, so that a position of the human eye, a position of the virtual image plane of the AR-HUD, and a position of the calibration object in the real coordinate system is separately acquired, where the position is specifically three-dimensional coordinates in the real coordinate system. A position of the human eye in the virtual coordinate system is obtained based on the position of the human eye in the real coordinate system and the correspondence between the virtual coordinate system and the real coordinate system. The position of the human eye in the virtual coordinate system is used as the origin ([0090]-[0091]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the spatially aligned head-up display of Berkovich, as already modified by Jiang, by further incorporating the teachings of Jiang, such that the position of the human eye in the virtual coordinate system is used as the common origin for the virtual coordinate system. The motivation to do so is the same as acknowledged by Jiang in regard to claim 7.

In regard to claim 9, Berkovich, as modified by Jiang, teaches the method of claim 7, wherein: the vehicle position and the external object position include real-world coordinates in one or more of (i) latitude and longitude, or (ii) range, azimuth, and bearing. ([0078] The location detection system is a global positioning system (GPS) based system. Examiner notes, a GPS outputs the latitude and the longitude of the position.)

Allowable Subject Matter

20. Claim(s) 16-18, and 20 are allowed.

REASONS FOR ALLOWANCE

21. The following is an examiner's statement of reasons for allowance:

22. In regard to claim 16, when reading the claim in light of the specification, as per MPEP §2111.01, none of the references of record, either individually or in combination, reasonably disclose the specific arrangement of elements in the same combination specified in independent claim 16. Independent claim 16 recites in particular "a reference display spaced-apart from the transparent display; aim sights including a first visible marker on the transparent display and a second visible marker on the reference display and spaced-apart from the first visible marker along an aim site-line extending from a fixed eye position behind the transparent display, through the aim sights, and to the external scene, such the aim sights serve as an eye-alignment guide for moving the eyes to the fixed eye position for parallax-free viewing of the external scene."

23. No single prior art reference has been found to anticipate the claim as a whole, including the above particular limitations, nor has any combination of prior art references been found that would render these limitations obvious, when viewed in the context of the remaining limitations of the claim.

24. Dependent claim(s) 17-18, and 20 also contain(s) allowable subject matter by virtue of its/their dependency on independent claim 16.

Conclusion

25. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Lim (US-20240144612-A1) teaches a vehicle AR display device and method. Kashter et al. (WO-2023031933-A1) teaches a Dynamic Parallel Monocular Projection (DPMP) system for the provision of dynamically adjusted images to a Head Motion Box (HMB). Takeuchi (WO-2023228752-A1) teaches an image display unit configured to display an image that causes the observer to perceive a parallax-free image.

26. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

27. A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

28. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Preston J Miller whose telephone number is (703)756-1582. The examiner can normally be reached Monday through Friday 7:30 AM - 4:30 PM EST.

29. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

30. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ramya P Burgess, can be reached at (571) 272-6011. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

31. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/P.J.M./Examiner, Art Unit 3661
/RAMYA P BURGESS/Supervisory Patent Examiner, Art Unit 3661
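
The claim language at issue in independent claims 1 and 12, together with dependent claims 6-8, describes a concrete geometric pipeline: normalize the eye, vehicle, display, and object positions into one reference frame, intersect the eye-to-object line of sight with the display plane, and map that intersection to pixel coordinates. The sketch below is a minimal, hypothetical illustration of that pipeline, not the applicant's or the cited references' actual implementation; the frame convention (a vehicle-fixed frame obtained by pure translation), the plane parameters, and the pixel mapping are all assumptions chosen only for illustration.

```python
import numpy as np

# Hypothetical sketch of the claimed steps: express all positions in one
# vehicle-fixed frame, write the eye-to-object line of sight parametrically,
# solve for its intersection with the display plane (the "parallax-free
# position"), and convert that intersection to display pixel coordinates.

def to_vehicle_frame(p_world, vehicle_pos_world):
    """Normalize a world-frame point into a common vehicle-fixed frame.
    Assumes the vehicle frame is a pure translation of the world frame."""
    return np.asarray(p_world, float) - np.asarray(vehicle_pos_world, float)

def parallax_free_point(eye, obj, plane_point, plane_normal):
    """Intersect the eye->object line with the display plane.
    Line:  P(t) = eye + t * (obj - eye)
    Plane: dot(n, P - plane_point) = 0
    """
    eye, obj = np.asarray(eye, float), np.asarray(obj, float)
    d = obj - eye                              # line-of-sight direction
    denom = np.dot(plane_normal, d)
    if abs(denom) < 1e-9:
        raise ValueError("line of sight is parallel to the display plane")
    t = np.dot(plane_normal, plane_point - eye) / denom
    return eye + t * d                         # intersection on the display

def to_pixels(point, plane_origin, u_axis, v_axis, px_per_meter):
    """Project the intersection onto the display's in-plane axes and scale
    to pixel coordinates (the translation step recited in claim 6,
    as illustrated here)."""
    rel = point - plane_origin
    return (np.dot(rel, u_axis) * px_per_meter,
            np.dot(rel, v_axis) * px_per_meter)

if __name__ == "__main__":
    vehicle_world = [100.0, 50.0, 0.0]                         # e.g. GPS-derived
    eye = to_vehicle_frame([100.0, 50.0, 1.2], vehicle_world)  # driver's eye
    obj = to_vehicle_frame([130.0, 55.0, 1.0], vehicle_world)  # external object
    plane_origin = np.array([2.0, 0.0, 1.0])   # display reference corner
    normal = np.array([1.0, 0.0, 0.0])         # display faces the driver
    u_axis = np.array([0.0, 1.0, 0.0])         # display "right" direction
    v_axis = np.array([0.0, 0.0, 1.0])         # display "up" direction

    hit = parallax_free_point(eye, obj, plane_origin, normal)
    print("parallax-free position (vehicle frame):", hit)
    print("pixel coordinates:", to_pixels(hit, plane_origin, u_axis, v_axis, 4000))
```

A single eye point is used for simplicity; as the rejection notes with respect to Berkovich's Fig. 3A, the same intersection can be computed separately for the left and right eye (or for a point between them).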

Prosecution Timeline

Jan 25, 2024
Application Filed
Sep 10, 2025
Non-Final Rejection — §102, §103
Oct 30, 2025
Interview Requested
Nov 12, 2025
Examiner Interview Summary
Nov 17, 2025
Response Filed
Jan 23, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12559091
CONTROL DEVICE FOR CONTROLLING SAFETY DEVICE IN VEHICLE
2y 5m to grant; granted Feb 24, 2026
Patent 12490678
VEHICLE LOCATION WITH DYNAMIC MODEL AND UNLOADING CONTROL SYSTEM
2y 5m to grant; granted Dec 09, 2025
Patent 12466388
Method for Operating a Motor Vehicle Drive Train and Electronic Control Unit for Carrying Out Said Method
2y 5m to grant; granted Nov 11, 2025
Patent 12454806
WORK MACHINE
2y 5m to grant; granted Oct 28, 2025
Patent 12447827
Electric Vehicle Control Device, Electric Vehicle Control Method, And Electric Vehicle Control System
2y 5m to grant; granted Oct 21, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 56%
With Interview (+18.8%): 75%
Median Time to Grant: 3y 1m
PTA Risk: Moderate
Based on 50 resolved cases by this examiner. Grant probability derived from career allow rate.
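
The "with interview" figure appears to follow by adding the interview lift to the career allow rate; the snippet below is a minimal sketch of that reading. The additive combination is an assumption about how the dashboard derives the number, not something it states.

```python
# Minimal sketch, assuming the dashboard combines the figures additively.
resolved, granted = 50, 28
baseline = granted / resolved               # 0.56 -> the 56% career allow rate
interview_lift = 0.188                      # the +18.8% interview lift shown above
with_interview = baseline + interview_lift  # 0.748, displayed as ~75%
print(f"baseline {baseline:.0%}, with interview ≈ {with_interview:.0%}")
```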
