Prosecution Insights
Last updated: April 19, 2026
Application No. 18/436,750

DRIVING ASSISTANT DEVICE, DRIVING ASSISTANT SYSTEM, AND DRIVING ASSISTANT METHOD

Final Rejection §103
Filed: Feb 08, 2024
Examiner: GENTILE, ALEXANDER VINCENT
Art Unit: 3664
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Panasonic Intellectual Property Management Co., Ltd.
OA Round: 2 (Final)

Grant Probability: 75% (Favorable)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 2y 7m
Grant Probability with Interview: 88%

Examiner Intelligence

Career Allow Rate: 75% (18 granted / 24 resolved; above average, +23.0% vs TC avg)
Interview Lift: +12.6% (a moderate, roughly +13% allowance lift in resolved cases with an interview)
Typical Timeline: 2y 7m average prosecution; 26 applications currently pending
Career History: 50 total applications across all art units

Statute-Specific Performance

§101: 7.9% (-32.1% vs TC avg)
§102: 27.4% (-12.6% vs TC avg)
§103: 51.4% (+11.4% vs TC avg)
§112: 12.7% (-27.3% vs TC avg)

Tech Center averages are estimates; figures are based on career data from 24 resolved cases.
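The headline figures above follow from the raw counts; a minimal sketch of the arithmetic (the Tech Center baseline is back-derived from the stated delta, which is an inference rather than a reported number):

```python
# Career allowance rate from the counts shown above.
granted, resolved = 18, 24
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.0%}")  # 75%

# The "+23.0% vs TC avg" delta implies a Tech Center baseline of
# roughly allow_rate - 0.230 (an inference, not a stated figure).
implied_tc_avg = allow_rate - 0.230
print(f"Implied TC average: {implied_tc_avg:.1%}")  # 52.0%

# An interview lift of +12.6 points over the 75% base is consistent
# with the 88% "with interview" prediction in the summary (87.6% rounded).
with_interview = allow_rate + 0.126
print(f"With interview: {with_interview:.1%}")  # 87.6%
```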

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

DETAILED ACTION

Status of Claims

The following is a final office action in response to the communication filed on 09/09/2025. Claims 1-10 and 12-20 are pending and have been examined. Claim 11 has been canceled. Claims 1-10 and 12-20 are either amended directly or via a claim from which they depend. Claims 1-10 and 12-20 are rejected.

Response to Arguments

Regarding the Objections to the Specification: Applicant’s arguments and corresponding amendments, see page 8, filed on 09/09/2025, have been fully considered and are persuasive in view of the amendments. The corresponding objections have been withdrawn.

Regarding the claim rejections under 35 U.S.C. § 112: Applicant’s arguments and corresponding amendments, see page 9, filed on 09/09/2025, have been fully considered and are persuasive in view of the amendments. The corresponding rejections have been withdrawn.

Regarding the claim rejections under 35 U.S.C. § 102: Applicant’s arguments and corresponding amendments, see pages 9-11, filed on 09/09/2025, have been fully considered and are moot in view of the amendments. Newly cited prior art has been applied in an obviousness-type rejection to address the amendments to independent claims 1 and 16 in the Claim Rejections - 35 U.S.C. § 103 section.

Claim Rejections - 35 U.S.C. § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-8, 10, and 12-20 are rejected under 35 U.S.C. 103 as being unpatentable over Volos et al. (US 2019/0280954 A1, hereinafter Volos) in view of Panchal
(US 2021/0136870 A1, hereinafter Panchal).

Claim 1 Discloses: (Currently Amended)

“An assistant device comprising: a memory; and a hardware processor coupled to the memory, the hardware processor being configured to:”

Volos teaches, (Abstract, Lines 1-3) “A vehicle application enabling system is provided and includes a memory and initialization, latency evaluation, and application enable modules.” Volos additionally teaches, (Paragraph [0115]) “In this application … The term “module” may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.”

“add latency information indicating required latency to data”

Volos teaches, (Paragraph [0010], Lines 1-3) “FIG. 1 is a functional block diagram of an example of a latency based vehicle operating system including nodes each having application specific latency characterization.” Volos additionally teaches, (Paragraph [0039], Lines 7-9) “The latency characterization and prediction includes determining a current latency status, determining latency trends, and predicting latency characteristics.”

“that is transmitted to a server of a plurality of servers via a communication line and used for an assistance process performed by the server; transmit data for external processing to which the latency information is added,”

Volos teaches, (Paragraph [0036], Lines 16-24) “The cloud-based network 14 may include some of the nodes 12, routers, 18, servers 20 and/or other network devices. Although shown separately, the nodes 12 of the cloud-based network 14 may be implemented as or in one or more of the servers 20. Each of the nodes 12 may include application specific latency characterization and prediction modules (ASLCPMs) 22 and/or a HLDM 24. The ASLCPMs 22 perform latency characterization and provide latency models and projections for signals transmitted between the nodes.”

“and receive assistance information including a processing result of the server based on the data for external processing; and control a vehicle based on the assistance information,”

Volos teaches, (Paragraph [0039], Lines 1-7) “By having ASLCPMs at each of the nodes, a network of ASLCPMs is provided that is able to share latency information. The latency information may be used to (i) determine whether to enable and/or execute in-vehicle applications (referred to as “applications”), (ii) best network routes for signal transmission, and/or (iii) best vehicle routes for vehicles to follow.” Volos additionally teaches, (Paragraph [0065], Lines 1-7) “The vehicle control module 302 may execute the applications 321 and may control operation of an engine 331, a converter/generator 333, a transmission 334, a brake system 336, electric motors 338 and/or a steering system 340 according to parameters, data, values, commands, etc. determined, calculated, estimated, predicted, and/or generated as a result of executing the applications 321.”

“wherein the latency information is a latency type indicating a class of the required latency.”

Volos does not explicitly teach its latency based vehicle operating system comprising latency information which is a latency type indicating a class of the required latency. Panchal does teach utilizing latency information which is a latency type indicating a class of the required latency.
Panchal teaches a wireless network system in an autonomous vehicle environment wherein, (Paragraph [0018], Lines 20-23) “low latency services may include edge computing services that control autonomous vehicles or robots, and/or other devices or systems operating in real-time,” and that, (Paragraph [0032], Lines 1-12) “Controller 105 may receive (at 1) the service request, and may determine that the first service identifier is associated with a first class of service. The determination may be based on controller 105 storing a mapping between different service identifiers, profile identifiers, and/or other identifiers and different classes of services (e.g., latency sensitive services, latency insensitive services, etc.). In this example, the first class of service may correspond to latency insensitive services (e.g., enhanced Mobile Broadband (“eMBB”) services and/or other services that may be considered “latency insensitive”) that are accessed from external data network 150.”

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the latency based vehicle operating system of Volos with the explicit determination of latency type indicating a class of the required latency as taught by Panchal, in order to yield predictable results. Combining the references would yield the benefit of allowing wireless resources to be properly applied based on the latency class requirements of a particular autonomous vehicle function.

As Panchal describes, (Paragraph [0018], Lines 18-27) “Low latency services” may include services that require less than 20 milliseconds of latency. For instance, low latency services may include edge computing services that control autonomous vehicles or robots, and/or other devices or systems operating in real-time. In contrast, “latency insensitive services” may include services that are not affected by latencies greater than 20 milliseconds.
For instance, latency insensitive services may include website access, email access, video/streaming access, or the like.”

Claim 2 Discloses: (Original)

“The assistant device according to claim 1, wherein the hardware processor is configured to add the latency information depending on a processing content of the assistance process.”

Volos teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.” Volos additionally teaches, (Paragraph [0061], Lines 1-15) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals. The context data referred to herein may include, for example, vehicle identifiers (IDs), vehicle location information (e.g., latitudes and longitudes), vehicle speeds, vehicle headings and/or bearings, vehicle types, number of vehicles in a geographical area or linked to a network, weather metrics/conditions, time of day, dates, information regarding local incidents, information regarding major events, whether it is a holiday, number of vehicles and/or network devices connected (or linked) to a node, number of vehicles and/or network devices connected to a RSD, amount of data being transferred during a certain period of time across one or more links, etc.”

Claim 3 Discloses: (Original)

“The assistant device according to claim 1, wherein the hardware processor is configured to add the latency information depending on a state of the vehicle.”

Volos teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.” Volos additionally teaches, (Paragraph [0061], Lines 1-8) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals. The context data referred to herein may include, for example, vehicle identifiers (IDs), vehicle location information (e.g., latitudes and longitudes), vehicle speeds, vehicle headings and/or bearings.”

Claim 4 Discloses: (Original)

“The assistant device according to claim 1, wherein the hardware processor is configured to add the latency information depending on a controlled object of the assistance process.”

Volos teaches, (Paragraph [0065], Lines 1-3) “The vehicle control module 302 may execute the applications 321 and may control operation.” Volos additionally teaches, (Paragraph [0061], Lines 1-8) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals. The context data referred to herein may include, for example, vehicle identifiers (IDs), vehicle location information (e.g., latitudes and longitudes), vehicle speeds, vehicle headings and/or bearings.”

Claim 5 Discloses: (Original)

“The assistant device according to claim 1, wherein the hardware processor is configured to add the latency information depending on a sensor type of an in-vehicle sensor of the vehicle, the in-vehicle sensor outputting sensor data included in the data for external processing.”

Volos teaches, (Paragraph [0063], Lines 7-8) “The vehicle 300 may further include a vehicle control module 302, sensors 308.” Volos additionally teaches, (Paragraph [0064], Lines 1-6) “The sensors 308 may include cameras, objection detection sensors, temperature sensors, and/or other sensors that provide parameters and/or data associated with the state of the vehicle 300 and/or an environment in which the vehicle 300 is located. The parameters and data may include contextual data.” Volos additionally teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.” Volos additionally teaches, (Paragraph [0061], Lines 1-8) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals. The context data referred to herein may include, for example, vehicle identifiers (IDs), vehicle location information (e.g., latitudes and longitudes), vehicle speeds, vehicle headings and/or bearings.”

Claim 6 Discloses: (Original)

“The assistant device according to claim 1, wherein the hardware processor is configured to add the latency information depending on a surrounding environment of the vehicle.”

Volos teaches, (Paragraph [0064], Lines 1-6) “The sensors 308 may include cameras, objection detection sensors, temperature sensors, and/or other sensors that provide parameters and/or data associated with the state of the vehicle 300 and/or an environment in which the vehicle 300 is located. The parameters and data may include contextual data.” Volos additionally teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.” Volos additionally teaches, (Paragraph [0061], Lines 1-13) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals. The context data referred to herein may include, for example … weather metrics/conditions, … information regarding local incidents, information regarding major events.”

Claim 7 Discloses: (Original)

“The assistant device according to claim 1, wherein the hardware processor is configured to limit a scope of control of the vehicle when a latency requirement depending on the latency information is not satisfied.”

Volos teaches, (Paragraph [0030], Lines 8-11) “Depending on the durations of these latencies, some of the data transferred may be outdated and as a result unusable. This can limit and/or prevent use of an application.” Volos additionally teaches, (Paragraph [0099], Lines 3-5) “the latency estimate module 251 may determine whether to enable the application based on the confidence interval parameters.” Volos additionally teaches, (Paragraph [0065], Lines 1-3) “The vehicle control module 302 may execute the applications 321 and may control operation.” Under broadest reasonable interpretation, deciding whether or not to execute vehicle applications based upon latency parameters is an example of limiting the scope of control of the vehicle.

Claim 8 Discloses: (Original)

“The assistant device according to claim 1, wherein the data for external processing includes sensor data output from an in-vehicle sensor of the vehicle.”

Volos teaches, (Paragraph [0063], Lines 7-8) “The vehicle 300 may further include a vehicle control module 302, sensors 308.” Volos additionally teaches, (Paragraph [0064], Lines 1-6) “The sensors 308 may include cameras, objection detection sensors, temperature sensors, and/or other sensors that provide parameters and/or data associated with the state of the vehicle 300 and/or an environment in which the vehicle 300 is located. The parameters and data may include contextual data.” Volos additionally teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.” Volos additionally teaches, (Paragraph [0061], Lines 1-8) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals. The context data referred to herein may include, for example, vehicle identifiers (IDs), vehicle location information (e.g., latitudes and longitudes), vehicle speeds, vehicle headings and/or bearings.”

Claim 10 Discloses: (Original)

“The assistant device according to claim 1, wherein the hardware processor is configured to add the latency information for each transmission unit or each processing unit of data with respect to the data for external processing.”

Volos teaches, (Paragraph [0036], Lines 8-18) “The latency based vehicle operating system 10 includes nodes 12 and a cloud-based network (or “the cloud”) 14. One or more of the nodes may be located in the cloud-based network. As an example, the nodes 12 may communicate with each other directly or via a distributed network 16. The distributed network may include the Internet, switches, routers, base stations, gateways, satellites, intermediary communication devices. The cloud-based network 14 may include some of the nodes 12, routers, 18, servers 20 and/or other network devices.” Volos additionally teaches, (Paragraph [0038]) “Each of the nodes outside of the cloud-based network 14 may be a vehicle, a road side device (RSD), or other network device. A RSD may be located at a signal light, on a building, on a pole near a road, or on some other structure. A RSD may (i) monitor an environment in a local area of the RSD, (ii) monitor latencies of signals transmitted to and/or from the RSD, and/or (iii) share with other nodes information indicative of a state of the environment, the latencies, and/or other related information, as further described below.” Volos additionally teaches, (Paragraph [0036], Lines 20-27) “Each of the nodes 12 may include application specific latency characterization and prediction modules (ASLCPMs) 22 and/or a HLDM 24. The ASLCPMs 22 perform latency characterization and provide latency models and projections for signals transmitted between the nodes. The one or more HLDMs 24 characterize historical latency data and provide latency models and projections including distance based latency estimates.”

Claim 12 Discloses: (Original)

Volos teaches, (Paragraph [0036], Lines 16-20) “The cloud-based network 14 may include some of the nodes 12, routers, 18, servers 20 and/or other network devices. Although shown separately, the nodes 12 of the cloud-based network 14 may be implemented as or in one or more of the servers 20.” Volos additionally teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.”

Claim 13 Discloses: (Currently Amended)

“The assistant device according to claim 12, wherein the latency information is an allowable delay time indicating allowable latency, and wherein the hardware processor is configured to allocate the data for external processing to the server of the plurality of servers according to the allowable delay time.”

Volos teaches, (Paragraph [0056], Lines 3-9) “The confidence interval module 266 may calculate latency confidence intervals and/or levels and latency projection data, which may include calculating transmission delays (time needed to transmit and/or receive a predetermined amount of data). This may include transmission times and response times and/or a sum thereof.” Volos additionally teaches, (Paragraph [0056], Lines 9-15) “The transmission delays depend on the corresponding data transmission rates involved and the amounts of data being transferred. The predicted decision module 268 provides predicted decisions of whether latencies for current operating conditions satisfy latency requirements, such as a maximum latency L.sub.max for a particular application.” Volos additionally teaches, (Paragraph [0054]) “The confusion matrix module 260 determines and/or updates confusion matrix statistics including a number of true positives (TPs), true negatives (TNs), false positives (FPs), and false negatives (FNs). The TPs refer to times when an application was executed and the corresponding latency requirement for the application was satisfied. The TNs refer to times when an application was not executed and the corresponding latency requirement for the application would not have been satisfied. The FPs refer to times when an application was executed, but the corresponding latency requirement for the application was not satisfied. The FNs refer to times when an application was not executed, but the corresponding latency requirement for the application would have been satisfied.”

Claim 14 Discloses: (Original)

“An assistant system comprising: the assistant device according to claim 1; and a router configured to allocate the data for external processing to the server of the plurality of servers according to the latency information added to the data for external processing; and the plurality of servers.”

Volos teaches, (Paragraph [0036], Lines 1-16) “FIG. 1 shows an example of a latency based vehicle operating system 10 that operates as a distributed edge computing system … The distributed network may include the Internet, switches, routers, base stations, gateways, satellites, intermediary communication devices.”

Claim 15 Discloses: (Original)

“An assistant system comprising: the assistant device according to claim 1; and the plurality of servers, wherein a first server of the plurality of servers is configured to allocate the data for external processing to the server of the plurality of servers according to the latency information added to the data for external processing.”

Volos teaches, (Paragraph [0036], Lines 16-20) “The cloud-based network 14 may include some of the nodes 12, routers, 18, servers 20 and/or other network devices. Although shown separately, the nodes 12 of the cloud-based network 14 may be implemented as or in one or more of the servers 20.” Volos additionally teaches, (Paragraph [0033], Lines 1-5) “Edge or fog computing applications may be used instead of cloud based applications. The terms “fog” and “edge” are interchangeable. The term “edge” may refer to a first or closest wireless communication node that is connected to a UE executing an application.” Volos additionally teaches, (Paragraph [0070], Lines 4-9) “FIG. 7 is shown as an approximate edge computing scenario, where a first visible or closest node to the UE 352 (e.g., a cell phone), connected to a long-term evolution (LTE) network, is the P-GW 360. Edge computing including processing of the data may be performed at the P-GW 360.” Volos additionally teaches, (Paragraph [0042], Lines 13-20) “the ASLCPMs may share latency information, such that when a first node is not connected to a second node, an ASLCPM of the first node can estimate latency for communicating with the second node by receiving latency information associated with the second node from ASLCPMs of one or more other nodes.
The stated latency information may be stored in memories of the nodes and/or in cache of the ASLCPMs.” Volos additionally teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.”

Claim 16 Discloses: (Currently Amended)

“An assistant method comprising:”

Volos teaches, (Paragraph [0007], Lines 1-2) “A method of operating a vehicle application enabling system.”

“adding latency information indicating required latency to data”

Volos teaches, (Paragraph [0010], Lines 1-3) “FIG. 1 is a functional block diagram of an example of a latency based vehicle operating system including nodes each having application specific latency characterization.” Volos additionally teaches, (Paragraph [0039], Lines 7-9) “The latency characterization and prediction includes determining a current latency status, determining latency trends, and predicting latency characteristics.”

“that is transmitted to a server of a plurality of servers via a communication line and used for an assistance process performed by the server; transmitting data for external processing to which the latency information is added;”

Volos teaches, (Paragraph [0036], Lines 16-24) “The cloud-based network 14 may include some of the nodes 12, routers, 18, servers 20 and/or other network devices. Although shown separately, the nodes 12 of the cloud-based network 14 may be implemented as or in one or more of the servers 20. Each of the nodes 12 may include application specific latency characterization and prediction modules (ASLCPMs) 22 and/or a HLDM 24. The ASLCPMs 22 perform latency characterization and provide latency models and projections for signals transmitted between the nodes.”

“receiving assistance information including a processing result of the server based on the data for external processing; and controlling a vehicle based on the assistance information,”

Volos teaches, (Paragraph [0039], Lines 1-7) “By having ASLCPMs at each of the nodes, a network of ASLCPMs is provided that is able to share latency information. The latency information may be used to (i) determine whether to enable and/or execute in-vehicle applications (referred to as “applications”), (ii) best network routes for signal transmission, and/or (iii) best vehicle routes for vehicles to follow.” Volos teaches, (Paragraph [0065], Lines 1-7) “The vehicle control module 302 may execute the applications 321 and may control operation of an engine 331, a converter/generator 333, a transmission 334, a brake system 336, electric motors 338 and/or a steering system 340 according to parameters, data, values, commands, etc. determined, calculated, estimated, predicted, and/or generated as a result of executing the applications 321.”

“wherein the latency information is a latency type indicating a class of the required latency”

Volos does not explicitly teach its latency based vehicle operating system comprising latency information which is a latency type indicating a class of the required latency. Panchal does teach utilizing latency information which is a latency type indicating a class of the required latency.
Panchal teaches a wireless network system in an autonomous vehicle environment wherein, (Paragraph [0018], Lines 20-23) “low latency services may include edge computing services that control autonomous vehicles or robots, and/or other devices or systems operating in real-time,” and that, (Paragraph [0032], Lines 1-12) “Controller 105 may receive (at 1) the service request, and may determine that the first service identifier is associated with a first class of service. The determination may be based on controller 105 storing a mapping between different service identifiers, profile identifiers, and/or other identifiers and different classes of services (e.g., latency sensitive services, latency insensitive services, etc.). In this example, the first class of service may correspond to latency insensitive services (e.g., enhanced Mobile Broadband (“eMBB”) services and/or other services that may be considered “latency insensitive”) that are accessed from external data network 150.”

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the latency based vehicle operating system of Volos with the explicit determination of latency type indicating a class of the required latency as taught by Panchal, in order to yield predictable results. Combining the references would yield the benefit of allowing wireless resources to be properly applied based on the latency class requirements of a particular autonomous vehicle function.

As Panchal describes, (Paragraph [0018], Lines 18-27) “Low latency services” may include services that require less than 20 milliseconds of latency. For instance, low latency services may include edge computing services that control autonomous vehicles or robots, and/or other devices or systems operating in real-time. In contrast, “latency insensitive services” may include services that are not affected by latencies greater than 20 milliseconds.
For instance, latency insensitive services may include website access, email access, video/streaming access, or the like.” Claim 17 Discloses: (Original) “The assistant method according to claim 16, further comprising: adding the latency information depending on a processing content of the assistance process.” Volos teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.” Volos additionally teaches, (Paragraph [0061], Lines 1-15) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals. The context data referred to herein may include, for example, vehicle identifiers (IDs), vehicle location information (e.g., latitudes and longitudes), vehicle speeds, vehicle headings and/or bearings, vehicle types, number of vehicles in a geographical area or linked to a network, weather metrics/conditions, time of day, dates, information regarding local incidents, information regarding major events, whether it is a holiday, number of vehicles and/or network devices connected (or linked) to a node, number of vehicles and/or network devices connected to a RSD, amount of data being transferred during a certain period of time across one or more links, etc.” Claim 18 Discloses: (Currently Amended) “The assistant method Volos teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.” Volos additionally teaches, (Paragraph [0061], Lines 1-8) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals. 
The context data referred to herein may include, for example, vehicle identifiers (IDs), vehicle location information (e.g., latitudes and longitudes), vehicle speeds, vehicle headings and/or bearings.”

Claim 19 Discloses: (Currently Amended) “The assistant method …”

Volos teaches, (Paragraph [0065], Lines 1-3) “The vehicle control module 302 may execute the applications 321 and may control operation.” Volos additionally teaches, (Paragraph [0061], Lines 1-8) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals. The context data referred to herein may include, for example, vehicle identifiers (IDs), vehicle location information (e.g., latitudes and longitudes), vehicle speeds, vehicle headings and/or bearings.”

Claim 20 Discloses: (Currently Amended) “The assistant method …”

Volos teaches, (Paragraph [0063], Lines 7-8) “The vehicle 300 may further include a vehicle control module 302, sensors 308.” Volos additionally teaches, (Paragraph [0064], Lines 1-6) “The sensors 308 may include cameras, objection detection sensors, temperature sensors, and/or other sensors that provide parameters and/or data associated with the state of the vehicle 300 and/or an environment in which the vehicle 300 is located. The parameters and data may include contextual data.” Volos additionally teaches, (Paragraph [0043], Lines 9-12) “The latency data and the results of the characterization may be stored along with corresponding contextual information in memory of the cloud-based datacenter 56.” Volos additionally teaches, (Paragraph [0061], Lines 1-8) “The latency data referred to herein may include, for example, propagation latencies of signals, processing latencies of signals and data, and queueing of data latencies associated with the signals.
The context data referred to herein may include, for example, vehicle identifiers (IDs), vehicle location information (e.g., latitudes and longitudes), vehicle speeds, vehicle headings and/or bearings.”

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Volos in view of Panchal, further in view of Jornod et al. (US 2020/0393850 A1, hereinafter Jornod).

Claim 9 Discloses: (Original) “The assistant device according to claim 1, wherein the data for external processing includes data output from an input device in response to a user operation.”

Volos does not teach the limitations of claim 9. However, Volos does teach the following. Volos teaches, (Abstract, Lines 1-3) “A vehicle application enabling system is provided and includes a memory and initialization, latency evaluation, and application enable modules.” Volos teaches, (Paragraph [0069], Lines 6-9) “The network includes user equipment (UE) (or Node₁) 352, a cluster of towers (or evolved node Bs (eNBs)) 354, a mobility management entity (MME) 356, a serving gateway (S-GW) 358, packet gateway (P-GW) (or Node₂) 360.” Volos additionally teaches, (Paragraph [0120], Lines 4-10) “The computer programs may encompass a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services, background applications, etc.” Therefore, a person of ordinary skill in the art would understand the system of Volos can be applied towards an application present on user equipment.

Panchal does not teach the limitations of claim 9. However, Panchal does teach the following. Panchal teaches, (Paragraph [0082], Lines 1-3) “Input component 840 may include a mechanism that permits an operator to input information to device 800, such as a keyboard, a keypad, a button, a switch, etc.”

Jornod does teach the limitations of claim 9.
Jornod teaches, (Paragraph [0021]) “The present proposal relates to the field of performing HDPL driving maneuvers … Since the HDPL relies on communications between the platoon members to drive with very short inter-vehicle distances, a drop in the communication performance, in particular, in an end-to-end latency, would affect its safe running. Therefore, it is needed to adapt the inter-vehicle distance based the prediction of the link-based QoS.” Jornod additionally teaches, (Paragraph [0054], Lines 1-7) “FIG. 1 shows the system architecture for the proposal. Reference number 10 denotes a user device. The depicted user device is exemplified as a transportation vehicle and, more in particular, it is a car. In other examples it may be differently exemplified, e.g., a smart phone, a smart watch, a tablet computer, notebook or laptop computer or the like.” Jornod additionally teaches, (Paragraph [0062]) “FIG. 3 shows schematically a block diagram of the transportation vehicle's 10 board electronics system. Part of the board electronics system is an infotainment system which comprises: the touch-sensitive display unit 20, a computing device 40, an input unit 50, and a memory 60. The display unit 20 includes both a display area for displaying variable graphical information and an operator interface (touch-sensitive layer arranged above the display area) for inputting commands by a user.”

Therefore, Jornod provides evidence that an infotainment screen capable of receiving user inputs is a well-known example of user equipment (UE). Therefore, it would have been obvious to a person of ordinary skill in the art to combine the vehicle application system capable of enabling or disabling vehicle applications based upon latency requirements as taught by Volos, with a specific node comprising a user device within a transportation vehicle capable of receiving user inputs as taught by Jornod, in order to yield predictable results.
The rationale for combining the references would be to measure latency data in relation to data corresponding to the well-known inputs of a vehicle infotainment system, such as navigation requests. As Jornod describes, (Paragraph [0064], Lines 1-4) “The other parts of the infotainment system such as camera 150, radio 140, navigation device 130, telephone 120 and instrument cluster 110 are connected via the data bus 100 with the computing device 40.”

RELEVANT, BUT NOT CITED ART

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Kim (US 2020/0029191 A1) teaches, “a method of setting a server bridge of a vehicle in an autonomous driving system includes establishing a communication connection with a first server through a first base station; transmitting set information of the vehicle and a request message for generating the server bridge to the first server; and receiving address information of servers constituting the server bridge from the first server, wherein the server bridge is composed of one or more servers for distributed processing first data generated by the vehicle, and connects a server performing an uplink for the vehicle with a server performing a downlink, and the set information comprises driving path information of the vehicle and data information about a type and a size of the first data.”

Tong et al. (US 2024/0059313 A1) teaches, “A method of providing an onboard assistant for a vehicle, comprising: receiving sensor data from sensors of the vehicle; transforming the sensor data using a trained machine learning model to provide transformed sensor data; inferring from the transformed sensor data that a special condition for the vehicle requires intervention; and contacting an operator of the vehicle to inform the operator of the special condition and receive instructions for handling the special condition.”

Moustafa et al.
(US 2022/0126878 A1) teaches, (Paragraph [0081], Lines 23-27) “The virtual driver may then provide inputs at the remote valet terminal to cause corresponding low latency, high priority data to be communicated (over network 155) to the vehicle 105 to control the steering, acceleration, and braking of the vehicle 105.”

Ogawa (JP2020191605A) teaches, (Abstract, Lines 1-3) “To provide a connection destination server selection device, on-vehicle device including the same, connection destination server selection method, and server for taking a network state into consideration.”

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALEXANDER V. GENTILE, whose telephone number is (703) 756-1501. The examiner can normally be reached Monday - Friday, 9-5. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kito R. Robinson, can be reached at (571) 270-3921. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ALEXANDER V GENTILE/
Examiner, Art Unit 3664

/KITO R ROBINSON/
Supervisory Patent Examiner, Art Unit 3664
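The 20-millisecond latency-class distinction that the action quotes from Panchal can be illustrated with a minimal sketch. Every name below (the `ServiceProfile` type, the service identifiers, the threshold constant) is a hypothetical stand-in for illustration; neither Volos nor Panchal discloses this code:

```python
# Illustrative sketch only: models the class-of-service determination the
# action quotes from Panchal (controller 105 storing a mapping between
# service identifiers and latency classes, with 20 ms separating them).
from dataclasses import dataclass

LOW_LATENCY_THRESHOLD_MS = 20.0  # per Panchal's quoted 20 ms definition

@dataclass(frozen=True)
class ServiceProfile:
    service_id: str
    max_tolerable_latency_ms: float

def classify(profile: ServiceProfile) -> str:
    """Return the class of service for a given service profile."""
    if profile.max_tolerable_latency_ms < LOW_LATENCY_THRESHOLD_MS:
        return "low-latency"          # e.g., edge-computed autonomous driving
    return "latency-insensitive"      # e.g., eMBB, web, email, streaming

print(classify(ServiceProfile("av-control", 10.0)))    # → low-latency
print(classify(ServiceProfile("video-stream", 80.0)))  # → latency-insensitive
```

The point of the sketch is the examiner's combination rationale: once each service carries a latency class, wireless resources can be allocated per class rather than per individual vehicle function.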

Prosecution Timeline

Feb 08, 2024
Application Filed
Jul 09, 2025
Non-Final Rejection — §103
Sep 08, 2025
Response Filed
Nov 19, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596381
TRAVEL MAP CREATING APPARATUS, TRAVEL MAP CREATING METHOD, AND RECORDING MEDIUM
2y 5m to grant — Granted Apr 07, 2026
Patent 12584747
STATE ESTIMATION DEVICE AND STATE ESTIMATION METHOD
2y 5m to grant — Granted Mar 24, 2026
Patent 12560939
DRIVING ROBOT GENERATING DRIVING MAP AND CONTROLLING METHOD THEREOF
2y 5m to grant — Granted Feb 24, 2026
Patent 12545186
DISPLAY CONTROL DEVICE
2y 5m to grant — Granted Feb 10, 2026
Patent 12517512
CONTROL METHOD FOR CONTROLLING DELIVERY SYSTEM
2y 5m to grant — Granted Jan 06, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
75%
Grant Probability
88%
With Interview (+12.6%)
2y 7m
Median Time to Grant
Moderate
PTA Risk
Based on 24 resolved cases by this examiner. Grant probability derived from career allow rate.
