Prosecution Insights
Last updated: April 19, 2026
Application No. 18/764,082

NAVIGATION DEVICE AND STORAGE MEDIUM

Status: Final Rejection (§103)
Filed: Jul 03, 2024
Examiner: WANG, KAI NMN
Art Unit: 3664
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Toyota Jidosha Kabushiki Kaisha
OA Round: 2 (Final)
Grant Probability: 54% (Moderate)
Expected OA Rounds: 3-4
Expected Time to Grant: 3y 4m
Grant Probability with Interview: 65%

Examiner Intelligence

Career Allow Rate: 54% (41 granted / 76 resolved; +1.9% vs TC avg)
Interview Lift: +10.8% in resolved cases with interview (Moderate, ~+11% lift)
Typical Timeline: 3y 4m average prosecution; 44 applications currently pending
Career History: 120 total applications across all art units
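The headline figures above can be cross-checked with a few lines of arithmetic. A minimal sketch, using only the numbers printed on this page (the underlying per-case records are an assumption, not part of this document):

```python
# Sanity check of the examiner statistics shown above.
granted, resolved = 41, 76
allow_rate_pct = round(granted / resolved * 100, 1)
print(allow_rate_pct)  # 53.9, displayed rounded as 54%

# Interview lift: grant probability rises from 54% to 65% with an
# interview, an ~11-point lift (the panel reports +10.8%).
print(65 - 54)  # 11
```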

Statute-Specific Performance

Allowance rate by statute, vs Tech Center average estimate (based on career data from 76 resolved cases):

§101: 17.4% (-22.6% vs TC avg)
§102: 9.8% (-30.2% vs TC avg)
§103: 47.9% (+7.9% vs TC avg)
§112: 23.4% (-16.6% vs TC avg)
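The per-statute deltas above are internally consistent: each implies the same Tech Center average estimate. A minimal cross-check, using only the figures printed in the panel (the per-statute TC baselines themselves are not given in this document):

```python
# Each "vs TC avg" delta implies a TC baseline of rate - delta.
panel = {
    "§101": (17.4, -22.6),
    "§102": (9.8, -30.2),
    "§103": (47.9, +7.9),
    "§112": (23.4, -16.6),
}
for statute, (rate, delta) in panel.items():
    implied_tc_avg = round(rate - delta, 1)
    print(statute, implied_tc_avg)  # every statute implies 40.0
```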

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

• This action is in reply to Application No. 18/764,082 filed on 07/03/2024.
• Claims 1-13 are currently pending and have been examined.
• This action is made FINAL in response to the “Amendment” and “Remarks” filed on 12/23/2025.
• The examiner notes that this application is now being handled by examiner Kai Wang.

Priority

Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in Application No. 18/764,082 filed on 07/03/2024.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 07/03/2024 and 02/13/2026 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

CLAIM INTERPRETATION

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “input device” in claims 1-2 and 11.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 5, 7, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over JUNG (US20190217863A1) in view of Beard (US9021384B1).
Regarding Claims 1 and 5:

JUNG teaches: A navigation device configured to be mounted on a subject vehicle, the navigation device comprising: (JUNG, para [40], “the vehicle 100 can include… a navigation system”) an information processor that stores map data… in advance; (JUNG, para [112], “The navigation information may include at least one of map information”) a display configured to show an image that corresponds to information output by the information processor; (JUNG, para [50], “The output unit 250 may include at least one of a display module”) and an input device configured to input information to the information processor from outside the information processor, (JUNG, para [51], “The display module 251 may be inter-layered or integrated with a touch input module”) the vehicles including the subject vehicle on which the navigation device is mounted; (JUNG, para [40], “the vehicle 100 can include… a navigation system”) show, on the display, a navigation image of three navigation images in each of which an icon indicating the present position of a corresponding one of the vehicles is superimposed on a map image of a subject range; (JUNG, Fig. 18 and para [188], “the location at which the icon 1800 corresponding to the vehicle 100 is output”; Fig. 18 depicts a map image with a range of 0-500 meters, and Fig. 19 depicts three images similar to Fig. 18.)

[Image: media_image1.png]

generate, as the three navigation images, a normal image in which the subject range is set to a range including the present position of the subject vehicle, (JUNG, Fig. 18 and para [188], “the location at which the icon 1800 corresponding to the vehicle 100 is output”; Fig. 18 depicts a map image with a range of 0-500 meters.)
an overall image in which the subject range is set to a range including the present positions of all of the vehicles, (JUNG, Fig. 19 depicts an overall image including the present positions of all of the vehicles (both in front of and rearward of the vehicle 100) within a 0-500 meter range, and para [194], “a first icon 1910 corresponding to the vehicle 100 and a second icon 1920 corresponding to the rearward vehicle which is overtaking can be output on the road image”)

[Image: media_image2.png]

and a confirmation image in which the subject range is set to a range including the present position of a specified vehicle designated by an input from the input device, …the specified vehicle being one of the vehicles; (JUNG, Fig. 19 depicts a confirmation image including a specified vehicle (a rearward vehicle 1920) designated by an input from the user within a 0-500 meter range, and para [193], “when a rearward vehicle that attempts to pass the vehicle 100 is detected, a message 1900 asking whether to monitor the rearward vehicle can be output.”; para [194], “In response to this, when the “OK” button is pressed, a first icon 1910 corresponding to the vehicle 100 and a second icon 1920 corresponding to the rearward vehicle which is overtaking can be output on the road image”) and switch a manner in which the three navigation images are displayed in accordance with an input from the input device.
(JUNG, para [193], “when a rearward vehicle that attempts to pass the vehicle 100 is detected, a message 1900 asking whether to monitor the rearward vehicle can be output.”; para [194], “In response to this, when the “OK” button is pressed, a first icon 1910 corresponding to the vehicle 100 and a second icon 1920 corresponding to the rearward vehicle which is overtaking can be output on the road image”)

JUNG does not explicitly teach, but Beard teaches: an ID and a registered name of each of vehicles registered in advance, (Beard, Col. 13, lines 53-56, “access DMV data to determine information related to the particular license plate (for example, a registered owner name and address, and a vehicle description and vehicle identification number)”) wherein the information processor is configured to: obtain information on present positions of the vehicles registered in advance, (Beard, claim 14, “indications on the interactive map of geographical locations associated with each of the identified vehicle”) the overall image indicating the registered names of all of the vehicles, (Beard, Fig. 2A depicts an overall image indicating the registered owner names of all the vehicles.) the confirmation image indicating the registered name of the specified vehicle, (Beard, Fig. 5A depicts an image indicating the registered name, such as John Doe, of the specified vehicle.)
[Image: media_image3.png]

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Beard, i.e., an ID and a registered name of each of the vehicles registered in advance, wherein the information processor is configured to: obtain information on present positions of vehicles registered in advance, the overall image indicating the registered names of all of the vehicles, and the confirmation image indicating the registered name of the specified vehicle. One of ordinary skill in the art would have been motivated to make this modification in order to “provide sufficient information to the system” (Beard, Description).

Regarding Claim 2:

JUNG in view of Beard, as shown in the rejection above, discloses the limitations of claim 1. JUNG teaches: The navigation device according to claim 1, wherein the display is a touch panel that is also configured as the input device, (JUNG, para [51], “The display module 251 may be inter-layered or integrated with a touch input module”) and the information processor is further configured to: show on the display a first button, a second button, and only one of the three navigation images; (JUNG, Fig. 13 shows on the display a first button (left button), a second button (right button), and only one of the three navigation images (the normal image in the back layer), and para [165], “As another example, when it is possible to enter the overtaking available area in the left or right direction, icons 1320 for selecting a direction can be output.”) switch, in response to the first button being operated, the navigation image shown on the display to one of two navigation images not shown on the display; and switch, in response to the second button being operated, the navigation image shown on the display to
another one of the two navigation images not shown on the display. (JUNG, Fig. 13 and para [169], “if it is set that the driver prefers a left or right lane for overtaking, a highlighting effect can be output to the left or right arrow icon 1340 or 1350”; Fig. 13 depicts that if the driver selects the left button, a left arrow icon with a highlighting effect is added to the normal image as a new image, and likewise the right arrow icon is output as another new image when the driver selects the right button.)

[Image: media_image4.png]

Regarding Claims 7 and 9:

JUNG in view of Beard, as shown in the rejection above, discloses the limitations of claims 1 and 5. JUNG does not explicitly teach, but Beard teaches: The navigation device according to claim 1, wherein the confirmation image indicates the position and the registered name of the specified vehicle only, without indicating the position of the subject vehicle. (Beard, Fig. 5B depicts an image indicating the position and the registered name of the specified vehicle only, without indicating the position of the subject vehicle.)

[Image: media_image5.png]

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Beard, i.e., wherein the confirmation image indicates the position and the registered name of the specified vehicle only, without indicating the position of the subject vehicle. One of ordinary skill in the art would have been motivated to make this modification in order to “provide sufficient information to the system” (Beard, Description).

Claims 3-4, 6, and 8 are rejected under 35 U.S.C. 103 as being unpatentable over JUNG (US20190217863A1) in view of Beard (US9021384B1), further in view of Watanabe (US20030083812A1) and Lee (US20160097651A1).
Regarding Claim 3:

JUNG in view of Beard, as shown in the rejection above, discloses the limitations of claim 1. JUNG does not explicitly teach, but Watanabe teaches: The navigation device according to claim 1, wherein the information processor is further configured to: obtain, from each of the vehicles, information on a destination and a scheduled travel route from the present position to the destination; (Watanabe, abstract, “receiving data via a communication line and a plurality of vehicle terminals”, and para [122], “the individual database 203 manages the present position information and destination information for each vehicle and the route information from the present position to the destination.”)

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Watanabe, i.e., obtaining, from each of the vehicles, information on a destination and a scheduled travel route from the present position to the destination. One of ordinary skill in the art would have been motivated to make this modification because it is “efficient to know the jamming status on the day” (Watanabe, Description).

JUNG does not explicitly teach, but Lee teaches: superimpose, in each of the three navigation images, the scheduled travel route on the map image within the subject range of the map image; (Lee, Fig. 7C and para [192], “referring to FIG. 7C, …, the controller 212 may generate road guide information indicating the determined geographical location 720c as “Destination” and output a vehicle travel route (driving path) corresponding to the generated road guide information on the display unit 201”, and para [281], “The controller 180 then changes an output range of the first map screen output on the display unit 151 in real time”) and set, when generating the overall image, the subject range of the overall image to a range including the destinations of all of the vehicles in addition to the present positions of all of the vehicles. (Lee, para [31], “current locations and moving paths of a plurality of vehicles can be simultaneously recognized on one screen, and the moving paths or destinations of the plurality of vehicles can simultaneously be recommended based on a touch input applied to the screen”, and para [281], “The controller 180 then changes an output range of the first map screen output on the display unit 151 in real time”)

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Lee, i.e., generating, as the navigation image, an image in which the scheduled travel route is superimposed on the map image within the subject range of the map image, and setting, when generating the overall image, the subject range of the overall image to a range including the destinations of all of the vehicles in addition to the present positions of all of the vehicles. One of ordinary skill in the art would have been motivated to make this modification in order to provide “intuitive interfaces to a user” (Lee, Description).

Regarding Claim 4:

JUNG in view of Beard, as shown in the rejection above, discloses the limitations of claim 1.
JUNG does not explicitly teach, but Lee teaches: The navigation device according to claim 3, wherein the information processor is further configured to: obtain, from each of the vehicles, time-related information including at least one of an estimated arrival time at the destination and a required time to the destination; (Lee, para [180], “The controller 212 may also control the display unit 201 to output additional data information related to the determined geographical location on the first map screen. Here, additional data information, for example, … an expected arrival time”) and superimpose, in at least one of the three navigation images, the time-related information of any of the vehicles located within the subject range of the map image on the map image. (Lee, para [180], “The controller 212 may also control the display unit 201 to output additional data information related to the determined geographical location on the first map screen. Here, additional data information, for example, … an expected arrival time”, and para [281], “The controller 180 then changes an output range of the first map screen output on the display unit 151 in real time”)

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Lee, i.e., obtaining, from each of the vehicles, time-related information including at least one of an estimated arrival time at the destination and a required time to the destination, and generating, as the navigation image, an image in which the time-related information of any of the vehicles located within the subject range of the map image is superimposed on the map image. One of ordinary skill in the art would have been motivated to make this modification in order to provide “intuitive interfaces to a user” (Lee, Description).
Regarding Claims 6 and 8:

JUNG in view of Beard, as shown in the rejection above, discloses the limitations of claims 1 and 5. JUNG does not explicitly teach, but Watanabe teaches: The navigation device according to claim 1, wherein the vehicles registered in advance share travel information with one another, and only the vehicles registered in advance and sharing the travel information with one another are shown on the navigation images. (Watanabe, para [202], “the registered vehicles”; abstract, “receiving data via a communication line and a plurality of vehicle terminals”; para [122], “the individual database 203 manages the present position information and destination information for each vehicle and the route information from the present position to the destination.”; and para [182], “FIG. 11 shows the number of vehicles which will run into the point A at about a prescribed time (AM 10:00) when the measuring range having a radius of 1 km around the point A as center is specified as a range data from the vehicle terminal 10.”)

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Watanabe, i.e., wherein the vehicles registered in advance share travel information with one another, and only the vehicles registered in advance and sharing the travel information with one another are shown on the navigation images. One of ordinary skill in the art would have been motivated to make this modification because it is “efficient to know the jamming status on the day” (Watanabe, Description).
Regarding Claim 10:

JUNG in view of Beard, Watanabe, and Lee, as shown in the rejection above, discloses the limitations of claim 6. JUNG does not explicitly teach, but Beard teaches: The navigation device according to claim 6, wherein the confirmation image indicates the position and the registered name of the specified vehicle only, without indicating the position of the subject vehicle. (Beard, Fig. 5B depicts an image indicating the position and the registered name of the specified vehicle only, without indicating the position of the subject vehicle.)

[Image: media_image5.png]

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Beard, i.e., wherein the confirmation image indicates the position and the registered name of the specified vehicle only, without indicating the position of the subject vehicle.
One of ordinary skill in the art would have been motivated to make this modification in order to “provide sufficient information to the system” (Beard, Description).

Regarding Claim 11:

JUNG in view of Beard, Watanabe, and Lee, as shown in the rejection above, discloses the limitations of claim 10. JUNG teaches: The navigation device according to claim 10, wherein the display is a touch panel that is also configured as the input device, (JUNG, para [51], “The display module 251 may be inter-layered or integrated with a touch input module”) and the information processor is further configured to: show on the display a first button, a second button, and only one of the three navigation images; (JUNG, Fig. 13 shows on the display a first button (left button), a second button (right button), and only one of the three navigation images (the normal image in the back layer), and para [165], “As another example, when it is possible to enter the overtaking available area in the left or right direction, icons 1320 for selecting a direction can be output.”) switch, in response to the first button being operated, the navigation image shown on the display to one of the two navigation images not shown on the display; and switch, in response to the second button being operated, the navigation image shown on the display to another one of the two navigation images not shown on the display. (JUNG, Fig. 13 and para [169], “if it is set that the driver prefers a left or right lane for overtaking, a highlighting effect can be output to the left or right arrow icon 1340 or 1350”; Fig. 13 depicts that if the driver selects the left button, a left arrow icon with a highlighting effect is added to the normal image as a new image, and likewise the right arrow icon is output as another new image when the driver selects the right button.)

Regarding Claim 12:

JUNG in view of Beard, Watanabe, and Lee, as shown in the rejection above, discloses the limitations of claim 11.
JUNG does not explicitly teach, but Watanabe teaches: The navigation device according to claim 11, wherein the information processor is further configured to: obtain, from each of the vehicles, information on a destination and a scheduled travel route from the present position to the destination; (Watanabe, abstract, “receiving data via a communication line and a plurality of vehicle terminals”, and para [122], “the individual database 203 manages the present position information and destination information for each vehicle and the route information from the present position to the destination.”)

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Watanabe, i.e., obtaining, from each of the vehicles, information on a destination and a scheduled travel route from the present position to the destination. One of ordinary skill in the art would have been motivated to make this modification because it is “efficient to know the jamming status on the day” (Watanabe, Description).

JUNG does not explicitly teach, but Lee teaches: superimpose, in each of the three navigation images, the scheduled travel route on the map image within the subject range of the map image; (Lee, Fig. 7C and para [192], “referring to FIG. 7C, …, the controller 212 may generate road guide information indicating the determined geographical location 720c as “Destination” and output a vehicle travel route (driving path) corresponding to the generated road guide information on the display unit 201”, and para [281], “The controller 180 then changes an output range of the first map screen output on the display unit 151 in real time”) and set, when generating the overall image, the subject range of the overall image to a range including the destinations of all of the vehicles in addition to the present positions of all of the vehicles. (Lee, para [31], “current locations and moving paths of a plurality of vehicles can be simultaneously recognized on one screen, and the moving paths or destinations of the plurality of vehicles can simultaneously be recommended based on a touch input applied to the screen”, and para [281], “The controller 180 then changes an output range of the first map screen output on the display unit 151 in real time”)

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Lee, i.e., generating, as the navigation image, an image in which the scheduled travel route is superimposed on the map image within the subject range of the map image, and setting, when generating the overall image, the subject range of the overall image to a range including the destinations of all of the vehicles in addition to the present positions of all of the vehicles. One of ordinary skill in the art would have been motivated to make this modification in order to provide “intuitive interfaces to a user” (Lee, Description).

Regarding Claim 13:

JUNG in view of Beard, Watanabe, and Lee, as shown in the rejection above, discloses the limitations of claim 12.
JUNG does not explicitly teach, but Watanabe teaches: The navigation device according to claim 12, wherein the information processor is further configured to: obtain, from each of the vehicles, time-related information including at least one of an estimated arrival time at the destination and a required time to the destination; and superimpose, in at least one of the three navigation images, the time-related information of any of the vehicles located within the subject range of the map image on the map image. (Watanabe, para [190], “FIG. 13 shows the number of vehicles running into the range data having a radius of 500 m around a prescribed point (point A in FIG. 12) at about a prescribed time (AM 10:00)”)

[Image: media_image6.png]

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the vehicle control device mounted on a vehicle of JUNG to include the above teachings from Watanabe, i.e., wherein the information processor is further configured to: obtain, from each of the vehicles, time-related information including at least one of an estimated arrival time at the destination and a required time to the destination; and superimpose, in at least one of the three navigation images, the time-related information of any of the vehicles located within the subject range of the map image on the map image. One of ordinary skill in the art would have been motivated to make this modification because it is “efficient to know the jamming status on the day” (Watanabe, Description).

RESPONSE TO ARGUMENTS

Claim Rejections - 35 USC § 103. Applicant’s arguments filed on 12/23/2025 with respect to claims 1-5 (see applicant’s response, page 7, “Rejections under 35 U.S.C.
103”) have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension-of-time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Kai Wang, whose telephone number is (571) 270-5633. The examiner can normally be reached Mon-Fri, 8:30-5:30 Eastern. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Rachid Bendidi, can be reached at (571) 272-4896. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KAI NMN WANG/
Examiner, Art Unit 3664

/REDHWAN K MAWARI/
Primary Examiner, Art Unit 3664
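The reply-deadline rules quoted in the Conclusion can be sketched as a small date calculation. This is a simplified, illustrative model only (not legal advice): it treats “months” as calendar months with day clamping and ignores the weekend/holiday rollover of 37 CFR 1.7; `reply_deadline` and `add_months` are hypothetical helpers, not any USPTO tool.

```python
from datetime import date
from typing import Optional
import calendar

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day (e.g. Jan 31 + 1 month -> Feb 28)."""
    m = d.month - 1 + months
    year, month = d.year + m // 12, m % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def reply_deadline(mailed: date,
                   first_reply: Optional[date] = None,
                   advisory_mailed: Optional[date] = None) -> date:
    """Shortened statutory period (SSP) for reply to a final action, simplified.

    Base SSP: three months from the mailing date.  If a first reply was filed
    within two months of mailing and the advisory action issued after the
    three-month date, the SSP moves to the advisory mailing date.  The period
    never extends past six months from mailing.
    """
    ssp = add_months(mailed, 3)
    if (first_reply is not None and advisory_mailed is not None
            and first_reply <= add_months(mailed, 2)
            and advisory_mailed > ssp):
        ssp = advisory_mailed
    return min(ssp, add_months(mailed, 6))

# Final action mailed Mar 18, 2026 (per the timeline below):
print(reply_deadline(date(2026, 3, 18)))  # 2026-06-18
```

Under this model, a first reply filed within two months keeps the clock open until the advisory action mails, but the six-month statutory cap always wins.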

Prosecution Timeline

Jul 03, 2024
Application Filed
Sep 23, 2025
Non-Final Rejection — §103
Dec 23, 2025
Response Filed
Mar 18, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603004
WARNING DEVICE
2y 5m to grant • Granted Apr 14, 2026
Patent 12573298
OBJECT RECOGNITION DEVICE, MOVABLE BODY COLLISION PREVENTION DEVICE, AND OBJECT RECOGNITION METHOD
2y 5m to grant • Granted Mar 10, 2026
Patent 12552357
METHOD AND CONTROL DEVICE FOR CONTROLLING A PARKING BRAKE FOR A VEHICLE, AND PARKING BRAKE SYSTEM FOR A VEHICLE
2y 5m to grant • Granted Feb 17, 2026
Patent 12523497
MAP UPDATE DEVICE, METHOD, AND COMPUTER PROGRAM FOR UPDATING MAP
2y 5m to grant • Granted Jan 13, 2026
Patent 12510364
METHOD FOR PLANNING A TRAJECTORY IN PRESENCE OF WATER CURRENT
2y 5m to grant • Granted Dec 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
54%
Grant Probability
65%
With Interview (+10.8%)
3y 4m
Median Time to Grant
Moderate
PTA Risk
Based on 76 resolved cases by this examiner. Grant probability derived from career allow rate.
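The 65% “With Interview” figure above is consistent with applying the +10.8-point interview lift additively to the 54% base grant probability. A minimal sketch under that assumption (the dashboard does not state its formula, and `adjusted_grant_probability` is an illustrative helper, not part of any published API):

```python
def adjusted_grant_probability(base_rate: float, interview_lift: float) -> float:
    """Apply an additive percentage-point interview lift to a base grant
    probability, clamped to the valid range [0, 100]."""
    return max(0.0, min(100.0, base_rate + interview_lift))

base = 54.0   # examiner career allow rate, percent
lift = 10.8   # observed interview lift, percentage points
print(round(adjusted_grant_probability(base, lift)))  # 65
```

The clamp matters for examiners with very high base rates, where a naive additive lift would otherwise exceed 100%.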
