Prosecution Insights
Last updated: April 19, 2026
Application No. 17/486,508

OCCUPANT MOBILITY VALIDATION

Status: Final Rejection (§103)
Filed: Sep 27, 2021
Examiner: MILLER, LEAH NICOLE
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Toyota Motor North America, Inc.
OA Round: 6 (Final)

Grant Probability: 56% (Moderate)
Expected OA Rounds: 7-8
Time to Grant: 3y 4m
Grant Probability with Interview: 48%

Examiner Intelligence

Career Allow Rate: 56% (18 granted / 32 resolved; +4.3% vs TC avg)
Interview Lift: -8.3% among resolved cases with an interview (minimal lift; 48% with vs 56% without)
Typical Timeline: 3y 4m average prosecution; 32 applications currently pending
Career History: 64 total applications across all art units
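The headline figures above follow from simple ratios over the examiner's resolved cases. A minimal sketch (variable names are illustrative, not from any real API; the with-interview rate is taken from the dashboard as shown):

```python
# Sketch: deriving the dashboard's headline examiner stats.
# All names are illustrative; the input figures come from the page above.

granted = 18    # granted cases
resolved = 32   # resolved cases (granted + abandoned)

# Career allow rate = granted / resolved
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.0%}")   # 56%

# Interview lift = allow rate among interviewed cases minus the baseline.
rate_with_interview = 0.48   # rounded figure shown on the dashboard
lift = rate_with_interview - allow_rate
print(f"Interview lift: {lift:+.1%}")
```

With these rounded inputs the lift lands near the dashboard's stated -8.3%; the exact figure implies the underlying with/without rates carry more precision than the rounded 48% and 56% displayed.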

Statute-Specific Performance

§101: 9.3% (-30.7% vs TC avg)
§103: 38.3% (-1.7% vs TC avg)
§102: 23.6% (-16.4% vs TC avg)
§112: 27.3% (-12.7% vs TC avg)

Comparisons are against a Tech Center average estimate • Based on career data from 32 resolved cases
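The per-statute deltas above can be checked for internal consistency: subtracting each "vs TC avg" delta from the examiner's rate should recover the Tech Center average estimate. A quick sketch (dictionary layout is illustrative):

```python
# Consistency check on the statute-specific numbers above:
# delta = examiner_rate - tc_avg, so tc_avg = examiner_rate - delta.

stats = {
    "§101": (9.3, -30.7),
    "§103": (38.3, -1.7),
    "§102": (23.6, -16.4),
    "§112": (27.3, -12.7),
}

for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta
    print(f"{statute}: implied TC average = {tc_avg:.1f}%")   # 40.0% for each
```

All four statutes imply the same 40.0% Tech Center average estimate, which suggests the dashboard benchmarks every statute against a single TC-wide figure rather than per-statute averages.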

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This Office Action is in response to the amendments filed on 11 June 2025. Claims 1-20 are presently pending and are presented for examination.

Response to Amendment

In response to Applicant's amendments dated 11 June 2025, Examiner withdraws the previous 35 U.S.C. 112 rejections; and withdraws the previous prior art rejections.

Response to Arguments

Applicant's arguments with respect to claim(s) 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. The remaining arguments are essentially the same as those addressed above and/or below and are unpersuasive for at least the same reasons. Therefore, Examiner is unpersuaded and maintains the corresponding rejections.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 1-6, 8-13, and 15-19 is/are rejected under 35 U.S.C. 103 as being unpatentable over US-20190156679-A1, hereinafter "Bartel" (previously of record), in view of DE-102019207986-B3, hereinafter "Rose" (newly of record).

Regarding claim 8, and analogous claims 1 and 15, Bartel discloses A system (Bartel, para. 0012: "Systems [i.e., A system] and processes are disclosed herein that accommodate riders of a transportation arrangement service that have certain physical impairments, such as visual, auditory, or movement impairments."), comprising:

Regarding claims 1 and 15, Bartel discloses A method (Bartel, para. 0012: "Systems and processes [i.e., A method] are disclosed herein that accommodate riders of a transportation arrangement service that have certain physical impairments, such as visual, auditory, or movement impairments.") and A non-transitory computer readable medium comprising instructions (Bartel, para. 0018: "One or more examples described herein provide that methods, techniques, and actions performed by a computing device are performed programmatically, or as a computer-implemented method. Programmatically, as used herein, means through the use of code or computer-executable instructions. These instructions can be stored in one or more memory resources [i.e., computer readable medium comprising instructions] of the computing device."; Claim 8: "A non-transitory computer readable medium storing instructions [i.e., A non-transitory computer readable medium comprising instructions] that, when executed by one or more processors, cause the one or more processors to…").

a processor of a transport (Bartel, para. 0021: "Furthermore, one or more examples described herein may be implemented through the use of instructions that are executable by one or more processors.");

a memory on which are stored machine-readable instructions that when executed by the processor (Bartel, para. 0021: "Furthermore, one or more examples described herein may be implemented through the use of instructions that are executable by one or more processors [i.e., machine-readable instructions that when executed by the processor]. These instructions may be carried on a computer-readable medium [i.e., memory on which are stored machine-readable instructions]."), cause the processor to:

receive a pick-up location of a prospective occupant via the transport (Bartel, para. 0082: "Furthermore, the pick-up request can include a pick-up location [i.e., receive a pick-up location of a prospective occupant via the transport], or the transport facilitation system 100 can receive location information from the user device 195 of the rider. Based on the location, the transport facilitation system 100 can receive location information from available transport vehicles proximate to the pick-up location (515)."),

control the transport to autonomously maneuver to the pick-up location (Bartel, para. 0084: "The configuration commands 144 can cause the designated application 185 on the user's device 195 to track the SDV 109 as it travels to the pick-up location (540) [i.e., transport to autonomously maneuver to the pick-up location]."; para. 0002: "Autonomous vehicle (AV) or self-driving vehicle (SDV) [i.e., autonomously maneuver] design and technology offers the potential to bridge many gaps currently prevalent in such transportation arrangement services."),

capture an image, via an external sensor of the transport, of the prospective occupant (Bartel, para. 0044: "The SDV 200 can be equipped with multiple types of sensors 201, 203 which can combine to provide a computerized perception of the space and the physical environment surrounding the vehicle 200 [i.e., an external sensor of the transport]."; para. 0045: "By way of example, the sensors 201, 203 can include multiple sets of cameras sensors 201 [i.e., capture an image] (video cameras, stereoscopic pairs of cameras or depth perception cameras, long range cameras), remote detection sensors 203 such as provided by radar or LIDAR, proximity or touch sensors, and/or sonar sensors (not shown)."; para. 0059: "According to examples, the external audio system can be executed in conjunction with an external monitoring system as the SDV 200 approaches a visually impaired requesting user [i.e., of the prospective occupant]. For example, the monitoring system can utilize one or more sensors 201 [i.e., capture an image, via an external sensor of the transport], 203 of the AV's sensory array to identify the user [i.e., of the prospective occupant] as the SDV 200 approaches the pick-up location.")…

…autonomously maneuver, by the transport to an alternate location to pick-up the prospective occupant based on the determined ability of the prospective occupant to enter the transport safely (Bartel, para. 0059: "For example, the monitoring system can utilize one or more sensors 201, 203 of the AV's sensory array to identify the user as the SDV 200 approaches the pick-up location. In one example, if the requesting user is not precisely at the pick-up location, the control system 220 can override the inputted pick-up location (e.g., the pick-up location indicated in the transport invitation 213), and operate the acceleration 252, braking 256, and steering systems 254 of the SDV 200 to maneuver the SDV 200 [i.e., autonomously maneuver, by the transport] to the curbside next to the visually impaired user [i.e., to an alternate location to pick-up the prospective occupant based on the determined ability of the prospective occupant to enter the transport safely]. Once the SDV 200 approaches the user, the accommodation logic 221 can implement the accommodation commands 233 to, for example, automatically open a door proximate to the user [i.e., to enter the transport safely] (e.g., a rear curbside door), and provide audio assistance using an audio system and the monitoring system of the SDV 200 if needed. Additionally or alternatively, the accommodation logic 221 can further utilize the audio system to greet the user [i.e., to enter the transport safely] with an audio greeting by name.").

Bartel does not appear to explicitly disclose the following: …execute object recognition functionality on the image of the prospective occupant to identify an object with the prospective occupant; determine, that the prospective occupant has an impairment based on the identified object; determine an ability of the prospective occupant to enter the transport safely based on the impairment and a threshold level of impairment…

However, in the same field of endeavor, Rose teaches the following:

execute object recognition functionality on the image of the prospective occupant to identify an object with the prospective occupant (Rose, para. 0049: "In program step 412, the data from the environment monitoring sensors 150, 151, 182, 186 are evaluated. Person recognition is carried out using an appropriate image recognition algorithm [i.e., execute object recognition functionality on the image of the prospective occupant]. The image recognition algorithm also evaluates whether the person is using a cane 14 [i.e., identify an object with the prospective occupant]. This is intended to determine whether the person is blind or severely visually impaired. In program step 412, an image evaluation is also carried out to determine whether the person is traveling with luggage.");

determine, that the prospective occupant has an impairment based on the identified object (Rose, para. 0049: "In program step 412, the data from the environment monitoring sensors 150, 151, 182, 186 are evaluated. Person recognition is carried out using an appropriate image recognition algorithm. The image recognition algorithm also evaluates whether the person is using a cane 14 [i.e., based on the identified object]. This is intended to determine whether the person is blind or severely visually impaired [i.e., determine, that the prospective occupant has an impairment based on the identified object]. In program step 412, an image evaluation is also carried out to determine whether the person is traveling with luggage.");

determine an ability of the prospective occupant to enter the transport safely based on the impairment and a threshold level of impairment (Rose, para. 0049: "In program step 412, the data from the environment monitoring sensors 150, 151, 182, 186 are evaluated. Person recognition is carried out using an appropriate image recognition algorithm. The image recognition algorithm also evaluates whether the person is using a cane 14. This is intended to determine whether the person is blind or severely visually impaired. In program step 412, an image evaluation is also carried out to determine whether the person is traveling with luggage."; para. 0050: "Query 414 checks whether a visually impaired person has been detected [i.e., a threshold level of impairment]. If not [i.e., a threshold level of impairment], this program is terminated in program step 430. If a visually impaired person has been detected [i.e., a threshold level of impairment], a connection to the electronic device 15 of the target person 12 is established in program step 416."; para. 0051: "Well-known image recognition algorithms are used to detect temporary obstacles. These image recognition algorithms can also be used to detect static obstacles. Next, in program step 420, a check is carried out to determine whether an obstacle has been detected in the path of the target person 12 [i.e., determine an ability of the prospective occupant to enter the transport safely based on the impairment].")…

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success, to modify the invention disclosed by Bartel with the concept of an autonomous vehicle (used for a transportation service) that uses object recognition to determine an object with a prospective occupant and determines whether or not the prospective occupant is physically impaired based on the object, taught by Rose, in order to accurately and effectively ensure that all prospective occupants can be accommodated in the autonomous vehicle (i.e., receive the transportation service), regardless of their individual mobility needs/devices (Rose, para. 0001: "The proposal concerns the technical field of vehicles, in particular vehicles with automatic driving functions. The vehicle is equipped in such a way that it can allow people, especially visually impaired people, safe access to the vehicle.").

Regarding claim 9, and analogous claims 2 and 16, Bartel and Rose teach the system of claim 8, and Bartel further discloses the following:

wherein the processor is further configured to: connect a device of the transport and a device associated with the prospective occupant via a software application (Bartel, FIG. 1: User Devices 195, Available Transport Vehicles 190, Network(s) 180, Designated App 185 (i.e., "a software application"), Transport Facilitation System 100; para. 0024: "A designated application 185 corresponding to the transportation arrangement service can be executed on the user devices 195."); and

reroute the transport to a different location for pick-up via the software application (Bartel, para. 0027: "The requesting user can configure such locations via the designated application 185 prior to submitting the pick-up request 197. For example, a user can be provided with an application configuration screen that enables the user to set specified pick-up locations and/or notes for a driver when submitting any pick-up request 197 indicating a pick-up location within a predetermined area. In one example, the pick-up location can be inputted by the user setting a virtual location pin on a mapping resource of the designated application 185.").

Regarding claim 10, and analogous claims 3 and 17, Bartel and Rose teach the system of claim 8, and Bartel further discloses the following:

wherein the processor is further configured to transmit a message to a different transport with an additional notification about the prospective occupant (Bartel, FIG. 1: Transport Facilitation System 100, connected to Available Transport Vehicles 190 via the Network(s) 180; para. 0037: "For example, upon selection of a human driven transport vehicle 191 [i.e., a different transport], the configuration engine 140 can transmit a request 144 [i.e., the processor is further configured to transmit a message] to the driver device 194 to notify the driver of the requesting user's physical impairment [i.e., an additional notification about the prospective occupant].").

Regarding claim 11, and analogous claims 4 and 18, Bartel and Rose teach the system of claim 10, and Bartel further discloses the following:

wherein the processor is configured to send the message to a user device of the different transport (Bartel, FIG. 1: Transport Facilitation System 100, connected to Available Transport Vehicles 190 (like TV 191 and the associated Driver Device 194) via the Network(s) 180; para. 0037: "For example, upon selection of a human driven transport vehicle 191 [i.e., the different transport], the configuration engine 140 can transmit a request 144 [i.e., processor is configured to send the message] to the driver device 194 to notify the driver [i.e., to a user device of the different transport] of the requesting user's physical impairment.").

Regarding claim 12, and analogous claims 5 and 19, Bartel and Rose teach the system of claim 8, and Bartel further discloses the following:

wherein the processor is further configured to modify settings of at least one system in the transport to accommodate the prospective occupant based on the impairment (Bartel, FIG. 2: accommodation logic 221; FIG. 4A: step 404; FIG. 4B: steps 414-450; para. 0057: "The SDV 200 can receive the SDV configuration set 218 while the SDV 200 is en route to the pick-up location such that the accommodation commands 233 can be executed on the accommodation features 266 [i.e., the processor is further configured to modify settings of at least one system in the transport to accommodate the prospective occupant] when the SDV 200 arrives at the pick-up location."; para. 0029: "As used herein, such 'accommodation features' can include specialized vehicle features that can assist a rider that has one or more physical impairments, such as a visual, auditory, or movement impairment [i.e., based on the impairment]. Such accommodation features can include automatic doors, a trunk or side lift gate, an exterior audio system, an external monitoring system (e.g., a sensor array to monitor rider ingress and egress), speech recognition features, an exterior notification system, and the like. Accordingly, such accommodation features can comprise vehicle features that can directly assist a user to mitigate the effects of the user's physical impairments.").

Regarding claim 13, and analogous claim 6, Bartel and Rose teach the system of claim 8, and Rose further teaches the following:

wherein the processor is further configured to perform facial recognition on the prospective occupant within the image (Rose, para. 0018: "To expand the procedure, an algorithm is also processed which is used to identify the target person (e.g., by facial recognition) [i.e., perform facial recognition on the prospective occupant within the image] and in particular to detect whether the target person is visually impaired and is designed to detect a cane used by a visually impaired person and/or to detect any other sign identifying a visually impaired person."), and

determine the prospective occupant is impaired based on the facial recognition (Rose, para. 0018: "To expand the procedure, an algorithm is also processed which is used to identify the target person (e.g., by facial recognition) and in particular to detect whether the target person is visually impaired [i.e., determine the prospective occupant is impaired based on the facial recognition] and is designed to detect a cane used by a visually impaired person and/or to detect any other sign identifying a visually impaired person.").

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success, to modify the invention disclosed by Bartel, as modified by Rose, with the concept of an autonomous vehicle (used for a transportation service) that uses facial recognition to determine whether or not the prospective occupant is impaired, taught by Rose, in order to accurately and effectively ensure that all prospective occupants can be accommodated in the autonomous vehicle (i.e., receive the transportation service), regardless of their individual physical abilities (Rose, para. 0001: "The proposal concerns the technical field of vehicles, in particular vehicles with automatic driving functions. The vehicle is equipped in such a way that it can allow people, especially visually impaired people, safe access to the vehicle.").

Claim(s) 7, 14, and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bartel, in view of Rose, and further in view of US-10780888-B2, hereinafter "Hunt" (previously of record).

Regarding claim 14, and analogous claims 7 and 20, Bartel and Rose teach the system of claim 8, but do not appear to explicitly teach the following: wherein the processor is further configured to send an alert to a device associated with another person at a destination of the transport, wherein the alert contains details of the prospective occupant.

However, in the same field of endeavor, Hunt teaches:

wherein the processor is further configured to send an alert to a device associated with another person at a destination of the transport (Hunt, col. 7, lines 11-16: "In some embodiments, vehicle 202 may then provide status information to server 208. Server 208, may then provide status information regarding vehicle 202 to user device 206 (or other user devices.)"),

wherein the alert contains details of the prospective occupant (Hunt, col. 9, lines 37-43: "In some embodiments, capabilities 302 may include a camera feed capability. In some embodiments, processor 312 may be able to gather video feed data from cameras inside or outside of the vehicle (e.g., camera 152 of FIG. 1B). In some embodiments, processor 312 may be able to send the camera feed to an external device (e.g., user device 206 of FIG. 2, server 208 of FIG. 2 or any other device)."; col. 12, lines 19-22: "For example, a parent may wish to be apprised of a location of his or her child. In this example, the processing circuitry may send periodic or real-time location updates to the user device of the parent.").

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, and with a reasonable likelihood of success, to modify the invention disclosed by Bartel, as modified by Rose, with the capability of alerting third parties, taught by Hunt, because when a transportation arrangement service provides transportation services to impaired requestors, communication capability with other relevant parties would help ensure the requestor received the service they requested (Hunt, col. 12, lines 19-22: "For example, a parent may wish to be apprised of a location of his or her child. In this example, the processing circuitry may send periodic or real-time location updates to the user device of the parent.").

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Leah N Miller whose telephone number is (703) 756-1933. The examiner can normally be reached M-Th 8:30am - 5:30pm ET.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Helal Algahaim, can be reached at (571) 270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/L.N.M./
Examiner, Art Unit 3666

/HELAL A ALGAHAIM/
SPE, Art Unit 3666
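The claim 8 limitations at issue describe a decision flow: run object recognition on an image of the prospective occupant, infer an impairment from a detected aid (Rose's cane detection), compare against a threshold level of impairment, and reroute the transport to an alternate pick-up location. A toy sketch of that flow, purely illustrative (the labels, scores, threshold, and function names are all invented; nothing here comes from Bartel, Rose, or the application itself):

```python
# Toy sketch of the claim 8 decision flow discussed above.
# All labels, scores, and the threshold are invented for illustration.

MOBILITY_AIDS = {"white_cane": 0.9, "wheelchair": 1.0, "crutches": 0.7}
IMPAIRMENT_THRESHOLD = 0.5

def detect_objects(image):
    """Stand-in for an object-recognition model run over the
    external-sensor image; returns detected object labels."""
    return image.get("labels", [])

def impairment_score(labels):
    """Infer an impairment level from any recognized mobility aid."""
    return max((MOBILITY_AIDS.get(l, 0.0) for l in labels), default=0.0)

def choose_pickup(image, requested_location, curbside_location):
    """Reroute to an accessible alternate location when the inferred
    impairment meets the threshold (cf. the rejected limitation)."""
    score = impairment_score(detect_objects(image))
    if score >= IMPAIRMENT_THRESHOLD:
        return curbside_location, score
    return requested_location, score

# A rider detected with a white cane is rerouted to curbside pick-up;
# a suitcase alone (Rose's luggage case) does not trigger rerouting.
loc, score = choose_pickup({"labels": ["white_cane", "suitcase"]},
                           "corner", "curbside")
print(loc, score)   # curbside 0.9
```

Note how this mirrors Rose's query 414: a binary "visually impaired detected?" check is just this threshold comparison with a 0/1 score.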

Prosecution Timeline

Sep 27, 2021: Application Filed
Oct 25, 2023: Non-Final Rejection — §103
Dec 11, 2023: Applicant Interview (Telephonic)
Dec 11, 2023: Examiner Interview Summary
Dec 13, 2023: Response Filed
Feb 05, 2024: Final Rejection — §103
Mar 13, 2024: Applicant Interview (Telephonic)
Mar 14, 2024: Examiner Interview Summary
Apr 09, 2024: Response after Non-Final Action
Apr 18, 2024: Response after Non-Final Action
Apr 18, 2024: Examiner Interview (Telephonic)
Apr 30, 2024: Request for Continued Examination
May 01, 2024: Response after Non-Final Action
May 20, 2024: Non-Final Rejection — §103
Jul 16, 2024: Applicant Interview (Telephonic)
Jul 16, 2024: Examiner Interview Summary
Jul 29, 2024: Response Filed
Oct 03, 2024: Final Rejection — §103
Dec 13, 2024: Response after Non-Final Action
Jan 16, 2025: Request for Continued Examination
Jan 18, 2025: Response after Non-Final Action
Apr 10, 2025: Non-Final Rejection — §103
Jun 11, 2025: Response Filed
Aug 15, 2025: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585279: "Navigating a robotic mower along a guide wire" (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579894: "Multi-lane traffic management system for platoons of autonomous vehicles" (granted Mar 17, 2026; 2y 5m to grant)
Patent 12565229: "System for controlling vehicle based on state of controller and system for controlling vehicle based on communication state" (granted Mar 03, 2026; 2y 5m to grant)
Patent 12560930: "Identifying transport structures" (granted Feb 24, 2026; 2y 5m to grant)
Patent 12552361: "Hybrid vehicle" (granted Feb 17, 2026; 2y 5m to grant)
These cases show what changed to get past this examiner; based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 7-8
Grant Probability: 56% (48% with interview, a -8.3% lift)
Median Time to Grant: 3y 4m
PTA Risk: High

Based on 32 resolved cases by this examiner. Grant probability derived from career allow rate.
