Prosecution Insights
Last updated: April 19, 2026
Application No. 18/977,024

REMOTE MONITORING DEVICE, REMOTE MONITORING METHOD, NON-TRANSITORY COMPUTER READABLE RECORDING MEDIUM STORING REMOTE MONITORING PROGRAM, REMOTE MONITORING SYSTEM, AND DEVICE

Non-Final OA: §101, §103, §112
Filed
Dec 11, 2024
Examiner
SHAFI, MUHAMMAD
Art Unit
3666
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Panasonic Intellectual Property Corporation of America
OA Round
1 (Non-Final)
Grant Probability: 89% (Favorable)
OA Rounds: 1-2
To Grant: 2y 6m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 89% (978 granted / 1100 resolved; +36.9% vs TC avg, above average)
Interview Lift: +16.7% among resolved cases with an interview (a strong lift)
Typical Timeline: 2y 6m avg prosecution; 35 currently pending
Career History: 1135 total applications across all art units
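The headline figures above are internally consistent, which a quick back-of-the-envelope check confirms. The sketch below is illustrative only: the counts (978 granted / 1100 resolved, +36.9 points vs TC average) come from this report, but how the tool itself rounds and derives its deltas is an assumption.

```python
# Sanity check of the examiner statistics shown above.
granted = 978          # career grants ("978 granted / 1100 resolved")
resolved = 1100        # career resolved applications
delta_vs_tc = 36.9     # reported delta vs. Tech Center average, in points

allow_rate = 100 * granted / resolved       # career allow rate, percent
implied_tc_avg = allow_rate - delta_vs_tc   # back-solved TC average

print(f"allow rate: {allow_rate:.1f}%")              # 88.9%, displayed as 89%
print(f"implied TC average: {implied_tc_avg:.1f}%")  # 52.0%
```

The back-solved Tech Center average of roughly 52% is consistent with the "above average" label on the allow rate.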

Statute-Specific Performance

§101: 18.8% (-21.2% vs TC avg)
§103: 48.3% (+8.3% vs TC avg)
§102: 7.2% (-32.8% vs TC avg)
§112: 20.7% (-19.3% vs TC avg)

Tech Center averages are estimates, based on career data from 1100 resolved cases.
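Back-solving each statute row's delta reveals that every row is measured against the same baseline. This is a sketch using the figures from the table above; what each percentage measures (presumably the allowance rate following that rejection type) is the tool's definition, not verified here.

```python
# Back-solve the Tech Center average implied by each statute row.
# Every row implies the same 40.0% baseline, i.e. the tool compares
# against a single TC-average estimate rather than per-statute averages.
rows = {
    "101": (18.8, -21.2),
    "103": (48.3, +8.3),
    "102": (7.2, -32.8),
    "112": (20.7, -19.3),
}

for statute, (rate, delta) in rows.items():
    implied_avg = rate - delta
    print(f"§{statute}: {rate:.1f}% ({delta:+.1f} pts, TC avg {implied_avg:.1f}%)")
```

Only the §103 rate (48.3%) sits above that 40% baseline, which matches the chart's one positive delta.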

Office Action

§101 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

2. This communication is a first office action, non-final rejection on the merits. Claims 1-14, as originally filed, are currently pending and have been considered below.

Claim Rejections - 35 USC § 101

3. 35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

4. Claims 11 and 1 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Claims 11 and 1 are directed to a method, which is one of the statutory categories of invention. (Step 1: YES). Claims 11 and 1 recite: acquiring a plurality of pieces of positional information indicative of respective positions of the vehicles and a plurality of pieces of sound data indicative of sounds in respective surroundings of the vehicles; performing detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data; estimating, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data; identifying a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle; and indicating the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles.
The limitations of claim 11 map onto a mental process as follows: acquiring a plurality of pieces of positional information indicative of respective positions of the vehicles and a plurality of pieces of sound data indicative of sounds in respective surroundings of the vehicles (a person is driving a vehicle, looking ahead at vehicles coming from the opposite direction, knowing their position and also listening to any sound coming from the opposite direction); performing detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data (the person is hearing (detecting) the sound of an emergency vehicle siren on the road); estimating, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data (depending on the intensity of the sound of the siren, the person is estimating the proximity of the emergency vehicle (i.e., a high-intensity siren means the vehicle is very close)); identifying a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle (the person is seeing an emergency vehicle in the opposite, incoming direction approaching the vehicles); and indicating the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles (the person is telling his fellow passenger about a particular vehicle, in the opposite incoming direction, being approached by the emergency vehicle). This is a process that, under its broadest reasonable interpretation, covers performance of the limitations as a mental process, more specifically, a concept performed in the human mind: acquiring vehicle positions, detecting a siren sound, estimating the emergency vehicle's position/proximity, identifying a vehicle being approached by the emergency vehicle, and indicating the vehicle being approached by the emergency vehicle.

If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation as a concept performed in the human mind, then it falls within the "mental process" grouping of abstract ideas. Accordingly, the claim recites an abstract idea. The claims recite a remote monitoring device, remote control, and an autonomous vehicle, which are electro-mechanical or generic devices; nothing in the claims precludes the steps from being practically performed in the human mind. Thus claims 11 and 1 recite a mental process and are abstract for similar reasons. (Step 2A-Prong 1: YES. The claim is abstract.)

This judicial exception is not integrated into a practical application. Limitations that are not indicative of integration into a practical application include: (1) adding the words "apply it" (or an equivalent) to the judicial exception, mere instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea (MPEP 2106.05(f)); (2) adding insignificant extra-solution activity to the judicial exception (MPEP 2106.05(g)); (3) generally linking the use of the judicial exception to a particular technological environment or field of use (MPEP 2106.05(h)). In particular, the claims only recite the steps of acquiring vehicle positions, detecting a siren sound, estimating the emergency vehicle's position/proximity, identifying a vehicle being approached by the emergency vehicle, and indicating the vehicle being approached by the emergency vehicle. The step of "acquiring information" amounts to mere data gathering, which is a form of insignificant extra-solution activity. Accordingly, these additional elements, when considered separately and as an ordered combination, do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Therefore claims 11 and 1 are directed to an abstract idea without a practical application. (Step 2A-Prong 2: NO. The additional claimed elements are not integrated into a practical application.)

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because, when considered separately and as an ordered combination, they do not add significantly more (also known as an "inventive concept") to the exception. As discussed above with respect to integration of the abstract idea into a practical application, there are no additional elements recited in the claims beyond the judicial exception. At least the "indicating" step is considered to be post-solution activity, and it does not appear to be more than what is considered well-understood, routine, conventional (WURC) activity in the field. The MPEP provides support that the additional limitations in the claims are directed to well-understood, routine, and conventional steps. MPEP 2106.05(d)(II) recites:

II. ELEMENTS THAT THE COURTS HAVE RECOGNIZED AS WELL-UNDERSTOOD, ROUTINE, CONVENTIONAL ACTIVITY IN PARTICULAR FIELDS

Because examiners should rely on what the courts have recognized, or those of ordinary skill in the art would recognize, as elements that describe well-understood, routine activities, the following section provides examples of elements that have been recognized by the courts as well-understood, routine, conventional activity in particular fields. It should be noted, however, that many of these examples failed to satisfy other Step 2B considerations (e.g., because they were recited at a high level of generality and thus were mere instructions to apply an exception, or were insignificant extra-solution activity). Thus, examiners should carefully analyze additional elements in a claim with respect to all relevant Step 2B considerations, including this consideration, before making a conclusion as to whether they amount to an inventive concept.
The courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity:

i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); but see DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245, 1258, 113 USPQ2d 1097, 1106 (Fed. Cir. 2014) ("Unlike the claims in Ultramercial, the claims at issue here specify how interactions with the Internet are manipulated to yield a desired result-a result that overrides the routine and conventional sequence of events ordinarily triggered by the click of a hyperlink." (emphasis added));

iv. Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93.

The MPEP further recites, with respect to claims directed to insignificant extra-solution activity, in 2106.05(g) Insignificant Extra-Solution Activity, under "Selecting a particular data source or type of data to be manipulated": iii. Selecting information, based on types of information and availability of information in a power-grid environment, for collection, analysis and display, Electric Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1354-55, 119 USPQ2d 1739, 1742 (Fed. Cir. 2016).

Mere instructions to implement an abstract idea, on or with the use of generic computer components, or even without any computer components, cannot provide an inventive concept, rendering the claims patent ineligible. Thus claims 11 and 1 are not patent eligible. (Step 2B: NO. The claim does not provide significantly more.) Claims 2-10 inherit the deficiencies of base claim 1 and therefore are non-statutory by virtue of their dependency.

5. Claim 14 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. Claim 14 recites elements which are computer software programs per se. Claim 14 recites "an acquisition part that acquires", "a detection part that performs detection", "an estimation part that estimates", "an identification part that identifies", and "an indication part that indicates", each of which are only "processors" (see Specification [0131]). Under the BRI of the specification these are software programs. A software program per se represents a data structure; without being connected to a processor, computer, or server it does not fit into any of the four statutory classes (process, machine, article of manufacture, and composition of matter) and therefore is not statutory subject matter under 35 USC 101.

For a computer-implemented 35 U.S.C. 112(f) claim limitation, the specification must disclose an algorithm for performing the claimed specific computer function, or else the claim is indefinite under 35 U.S.C. 112(b). See Net MoneyIN, Inc. v. Verisign, Inc., 545 F.3d 1359, 1367, 88 USPQ2d 1751, 1757 (Fed. Cir. 2008). See also In re Aoyama, 656 F.3d 1293, 1297, 99 USPQ2d 1936, 1939 (Fed. Cir. 2011) ("[W]hen the disclosed structure is a computer programmed to carry out an algorithm, 'the disclosed structure is not the general purpose computer, but rather that special purpose computer programmed to perform the disclosed algorithm.'") (quoting WMS Gaming, Inc. v. Int'l Game Tech., 184 F.3d 1339, 1349, 51 USPQ2d 1385, 1391 (Fed. Cir. 1999)).

In cases involving a special purpose computer-implemented means-plus-function limitation, the Federal Circuit has consistently required that the structure be more than simply a general purpose computer or microprocessor and that the specification must disclose an algorithm for performing the claimed function. See, e.g., Noah Systems Inc. v. Intuit Inc., 675 F.3d 1302, 1312, 102 USPQ2d 1410, 1417 (Fed. Cir. 2012); Aristocrat, 521 F.3d at 1333, 86 USPQ2d at 1239. For a computer-implemented means-plus-function claim limitation invoking 35 U.S.C. 112(f), the Federal Circuit has stated that "a microprocessor can serve as structure for a computer-implemented function only where the claimed function is 'coextensive' with a microprocessor itself." EON Corp. IP Holdings LLC v. AT&T Mobility LLC, 785 F.3d 616, 622, 114 USPQ2d 1711, 1714 (Fed. Cir. 2015), citing In re Katz Interactive Call Processing Patent Litigation, 639 F.3d 1303, 1316, 97 USPQ2d 1737, 1747 (Fed. Cir. 2011). "'It is only in the rare circumstances where any general-purpose computer without any special programming can perform the function that an algorithm need not be disclosed.'" EON Corp., 785 F.3d at 621, 114 USPQ2d at 1714, quoting Ergo Licensing, LLC v. CareFusion 303, Inc., 673 F.3d 1361, 1365, 102 USPQ2d 1122, 1125 (Fed. Cir. 2012). "'[S]pecial programming' includes any functionality that is not 'coextensive' with a microprocessor or general purpose computer." EON Corp., 785 F.3d at 623, 114 USPQ2d at 1715 (citations omitted). "Examples of such coextensive functions are 'receiving' data, 'storing' data, and 'processing' data—the only three functions on which the Katz court vacated the district court's decision and remanded for the district court to determine whether disclosure of a microprocessor was sufficient." 785 F.3d at 622, 114 USPQ2d at 1714.
Thus, "[a] microprocessor or general purpose computer lends sufficient structure only to basic functions of a microprocessor. All other computer-implemented functions require disclosure of an algorithm." Id., 114 USPQ2d at 1714. See MPEP 2181 II.B.

Claim Rejections - 35 USC § 112

6. The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

7. The claim 1 and 14 limitations "an acquisition part that acquires", "a detection part that performs detection", "an estimation part that estimates", "an identification part that identifies", and "an indication part that indicates" invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. The specification discloses only that these are processors ([0131]). Therefore, claims 1 and 14 are indefinite and are rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.

Applicant may: (a) amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph; (b) amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or (c) amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).

If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either: (a) amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or (b) stating on the record what corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function. For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.

Claims 2-10 inherit the deficiencies of base claim 1 and therefore are also rejected under 112(b) or pre-AIA 112, second paragraph.

Claim Rejections - 35 USC § 103

8. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

9. Claims 1, 5-9 and 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Cogna et al. (US 2020/0409358) in view of Tariq et al. (US 2021/0201676).

As per Claim 1, Cogna et al. (Cogna) teaches a remote monitoring device (teleoperation system 114, Fig.5) for remotely monitoring a plurality of vehicles (fleet 502, Fig.5) configured to travel autonomously ([0027]) and travel under remote control (Figs.1, 5), comprising: an acquisition part that acquires a plurality of pieces of positional information indicative of respective positions of the vehicles (114 receiving vehicle sensor data 436 [0083], which contains location and orientation data [0055]; also see [0017]).
However, Cogna does not explicitly teach: acquiring a plurality of pieces of sound data indicative of sounds in respective surroundings of the vehicles; a detection part that performs detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data; an estimation part that estimates, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data; an identification part that identifies a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle; and an indication part that indicates the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles.

In an analogous art, Tariq et al. (Tariq) teaches emergency vehicle detection and response, wherein an acquisition part acquires a plurality of pieces of sound data indicative of sounds in respective surroundings of the vehicles (via remote center 130 being configured with vehicle computing system 106 of autonomous vehicle 102; the emergency vehicle detection and response component 108 of the vehicle computing system captures audio data and classifies a sound as an emergency sound 110 (e.g. a siren), and remote computing device 130 receives raw and/or processed audio and visual data; [0047], [0046], [0048], Fig.1); a detection part that performs detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data (vehicle computing system 106, [0014]); an estimation part that estimates, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data (vehicle computing system 106, [0014]-[0015], [0037]-[0038]); an identification part that identifies a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle; and an indication part that indicates the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles (vehicle computing device 106, [0048], Figs. 2A-2B and [0058]-[0079]; also see Fig.1).

It would have been obvious to one of ordinary skill in the art, having the teachings of Cogna and Tariq before him, before the effective filing date of the claimed invention, to modify the system of Cogna to include the teachings of Tariq (vehicle computing device 106, emergency vehicle detection and response component 108, and sensor data processing component 136) and configure them with the teleoperation system of Cogna, so that the teleoperation system receives raw and/or processed audio and visual data from the vehicle and detects an emergency vehicle based on its siren sound classification and its proximity to the autonomous vehicle, in order to provide navigation guidance to the autonomous vehicle. The motivation to combine the two teachings is to receive information of the vehicles' surrounding situation and provide navigation guidance.

As per Claim 5, Cogna as modified by Tariq teaches the limitations of Claim 1.
Further, Cogna in view of Tariq teaches wherein the identification part includes an emergency vehicle position storage section that stores the estimated position of the emergency vehicle, a moving direction estimation section that estimates a moving direction of the emergency vehicle on the basis of a previously estimated position of the emergency vehicle which is stored in the emergency vehicle position storage section and the estimated position of the emergency vehicle, and a vehicle identification section that identifies the vehicle approached by the emergency vehicle on the basis of the estimated moving direction of the emergency vehicle and the respective pieces of positional information of the vehicles (Tariq: computing device 106, [0058]-[0079], Figs. 2A-2B).

As per Claim 6, Cogna as modified by Tariq teaches the limitations of Claim 1. Further, Cogna in view of Tariq teaches wherein the indication part indicates a vehicle that is within a predetermined range from the estimated position of the emergency vehicle among the vehicles (Tariq: [0012], Figs. 2A, 2B).

As per Claim 7, Cogna as modified by Tariq teaches the limitations of Claim 1. Further, Cogna teaches wherein the acquisition part (114) further acquires video data taken by a camera included in each of the vehicles, and the indication part outputs to a display part, in a case where the vehicle approached by the emergency vehicle is identified among the vehicles, the video data taken by the camera included in the vehicle approached by the emergency vehicle (Cogna: [0018], [0090], [0096], Figs. 6-7). However, Cogna does not explicitly teach the vehicle approached by the emergency vehicle. Tariq teaches the vehicle approached by the emergency vehicle (Figs. 2A, 2B, vehicle 102 being approached by an emergency vehicle). (See claim 1 above for rationale supporting obviousness, motivation, and reason to combine.)
As per Claim 8, Cogna as modified by Tariq teaches the limitations of Claim 1. Further, Cogna teaches wherein the indication part outputs to a display part (Cogna: "The teleoperator interface may include one or more displays configured to provide the teleoperator with data related to operation of the autonomous vehicles 102(1)-(2).", [0037]; also see [0026], [0086], [0018]). Cogna does not explicitly teach the following; however, Tariq teaches, in a case where the vehicle approached by the emergency vehicle is identified among the vehicles, an indication image including a map, a plurality of first icons indicative of the respective positions of the vehicles on the map, and a second icon indicative of the estimated position of the emergency vehicle on the map (Tariq: [0047], [0048], [0058]-[0079], Figs. 2A, 2B). (See claim 1 above for rationale supporting obviousness, motivation, and reason to combine.)

As per Claim 9, Cogna as modified by Tariq teaches the limitations of Claim 8. Further, Cogna teaches wherein the indication part displays (Cogna: "The teleoperator interface may include one or more displays configured to provide the teleoperator with data related to operation of the autonomous vehicles 102(1)-(2)", [0037]). However, Cogna does not explicitly teach a first icon indicative of the position of the vehicle approached by the emergency vehicle on the map in a state different from a state for another first icon indicative of the position of another vehicle not approached by the emergency vehicle on the map. Tariq teaches this limitation (Tariq: Figs. 2A, 2B). (See claim 1 above for rationale supporting obviousness, motivation, and reason to combine.)
As per Claim 11, Cogna et al. (Cogna) discloses a remote monitoring method (teleoperation system 114, Fig.5) by a remote monitoring device for remotely monitoring a plurality of vehicles (fleet 502, Fig.5) configured to travel autonomously and under remote control, comprising: acquiring a plurality of pieces of positional information indicative of respective positions of the vehicles (114 receiving vehicle sensor data 436 [0083], which contains location and orientation data [0055]; also see [0017]).

However, Cogna does not explicitly teach: acquiring a plurality of pieces of sound data indicative of sounds in respective surroundings of the vehicles; performing detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data; estimating, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data; identifying a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle; and indicating the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles.

In an analogous art, Tariq et al. (Tariq) teaches emergency vehicle detection and response, wherein: acquiring a plurality of pieces of sound data indicative of sounds in respective surroundings of the vehicles (via remote center 130 being configured with vehicle computing system 106 of autonomous vehicle 102; the emergency vehicle detection and response component 108 of the vehicle computing system captures audio data and classifies a sound as an emergency sound 110 (e.g. a siren), and remote computing device 130 receives raw and/or processed audio and visual data; [0047], [0046], [0048], Fig.1); performing detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data (vehicle computing system 106, [0014]); estimating, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data (vehicle computing system 106, [0014]-[0015], [0037]-[0038]); identifying a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle; and indicating the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles (vehicle computing device 106, [0048], Figs. 2A-2B and [0058]-[0079]; also see Fig.1).

It would have been obvious to one of ordinary skill in the art, having the teachings of Cogna and Tariq before him, before the effective filing date of the claimed invention, to modify the system of Cogna to include the teachings of Tariq (vehicle computing device 106, emergency vehicle detection and response component 108, and sensor data processing component 136) and configure them with the teleoperation system of Cogna, so that the teleoperation system receives raw and/or processed audio and visual data from the vehicle and detects an emergency vehicle based on its siren sound classification and its proximity to the autonomous vehicle, in order to provide navigation guidance to the autonomous vehicle. The motivation to combine the two teachings is to receive information of the vehicles' surrounding situation and provide navigation guidance.

Claim 12 is rejected using the same rationale as claim 11.

As per Claim 13, Cogna et al. (Cogna) teaches a remote monitoring system (teleoperation system 114, Fig.5), comprising: a plurality of vehicles (fleet 502, Fig.5) configured to travel autonomously ([0027]) and travel under remote control (Figs.1, 5); and a remote monitoring device (teleoperation system 114, Fig.5) for remotely monitoring the vehicles, wherein each of the vehicles includes: a positional information acquisition part that acquires a piece of positional information indicative of a position of the vehicle (sensor system 406, [0055]-[0057], Fig.4); a microphone that acquires a piece of sound data indicative of a sound in surroundings of the vehicle ("teleoperation device may include a microphone", [0027]); a communication part that transmits the piece of positional information and the piece of sound data to the remote monitoring device (emitters 408, [0074], Fig.4); and the remote monitoring device (teleoperation system 114, Fig.5) includes: an acquisition part that acquires the pieces of positional information indicative of the respective positions of the vehicles (114 receiving vehicle sensor data 436 [0083], which contains location and orientation data [0055]; also see [0017]).
However, Cogna does not explicitly teach an acquisition part that acquires the pieces of sound data indicative of the sounds in the respective surroundings of the vehicles; a detection part that performs detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data; an estimation part that estimates, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data; an identification part that identifies a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle; and an indication part that indicates the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles.

In an analogous art, Tariq teaches emergency vehicle detection and response, wherein an acquisition part acquires the pieces of sound data indicative of the sounds in the respective surroundings of the vehicles (via remote center 130 being configured with vehicle computing system 106 of autonomous vehicle 102; emergency vehicle detection and response component 108 of the vehicle computing system capturing audio data and classifying a sound as an emergency sound 110 (e.g.
siren) and remote computing device 130 receiving raw and/or processed audio and visual data, [0047], [0046], [0048], Fig. 1); a detection part that performs detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data (vehicle computing system 106, [0014]); an estimation part that estimates, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data (vehicle computing system 106, [0014]-[0015], [0037]-[0038]); an identification part that identifies a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle; and an indication part that indicates the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles (vehicle computing device 106, [0048], Figs. 2A-2B, [0058]-[0079]; also see Fig. 1).

It would have been obvious to one of ordinary skill in the art, having the teachings of Cogna and Tariq before the effective filing date of the claimed invention, to modify the system of Cogna to include the teachings of Tariq (vehicle computing device 106, emergency vehicle detection and response component 108, and sensor data processing component 136) and configure them with the teleoperation system of Cogna, so that the teleoperation system receives raw and/or processed audio and visual data from the vehicle and detects an emergency vehicle based on its siren sound classification and its proximity to the autonomous vehicle, in order to provide navigation guidance to the autonomous vehicle. The motivation to combine the two teachings is to receive information on the vehicles' surrounding situation and to provide navigation guidance (i.e., a smooth trip and time savings).

As per claim 14, Cogna et al.
(Cogna) teaches a device (teleoperation system 114, Fig. 5) comprising: an acquisition part that acquires a plurality of pieces of positional information indicative of respective positions of a plurality of vehicles (114 receiving vehicle sensor data 436, [0083], which contains location and orientation data, [0055]; also see [0017]).

However, Cogna does not explicitly teach an acquisition part that acquires a plurality of pieces of sound data indicative of sounds in respective surroundings of the vehicles; a detection part that performs detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data; an estimation part that estimates, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data; an identification part that identifies a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle; and an indication part that indicates the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles.

In an analogous art, Tariq et al. (Tariq) teaches emergency vehicle detection and response, wherein an acquisition part acquires a plurality of pieces of sound data indicative of sounds in respective surroundings of the vehicles (via remote center 130 being configured with vehicle computing system 106 of autonomous vehicle 102; emergency vehicle detection and response component 108 of the vehicle computing system capturing audio data and classifying a sound as an emergency sound 110 (e.g.
siren) and remote computing device 130 receiving raw and/or processed audio and visual data, [0047], [0046], [0048], Fig. 1); a detection part that performs detection of a siren sound of an emergency vehicle in each of the acquired pieces of sound data (vehicle computing system 106, [0014]); an estimation part that estimates, in a case where the siren sound is detected in pieces of sound data, a position of the emergency vehicle on the basis of the respective pieces of positional information of vehicles and the pieces of sound data (vehicle computing system 106, [0014]-[0015], [0037]-[0038]); an identification part (vehicle computing device 106) that identifies a vehicle approached by the emergency vehicle among the vehicles on the basis of time variation of the estimated position of the emergency vehicle; and an indication part that indicates the vehicle approached by the emergency vehicle in a case where the vehicle approached by the emergency vehicle is identified among the vehicles (vehicle computing device 106, [0048], Figs. 2A-2B, [0058]-[0079]; also see Fig. 1).

It would have been obvious to one of ordinary skill in the art, having the teachings of Cogna and Tariq before the effective filing date of the claimed invention, to modify the system of Cogna to include the teachings of Tariq (vehicle computing device 106, emergency vehicle detection and response component 108, and sensor data processing component 136) and configure them with the teleoperation system of Cogna, so that the teleoperation system receives raw and/or processed audio and visual data from the vehicle and detects an emergency vehicle based on its siren sound classification and its proximity to the autonomous vehicle, in order to provide navigation guidance to the autonomous vehicle. The motivation to combine the two teachings is to receive information on the vehicles' surrounding situation and to provide navigation guidance (i.e., a smooth trip and time savings).

10.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Cogna et al. (USP 2020/0409358) in view of Tariq et al. (USP 2021/0201676), further in view of Iwamoto et al. (USP 2020/0326702).

As per claim 10, Cogna as modified by Tariq teaches the limitations of claim 1, and further teaches the indication part (user interface) in a case where the vehicle approached by the emergency vehicle is identified among the vehicles (Tariq: Figs. 2A, 2B). However, Cogna in view of Tariq does not explicitly teach wherein the indication part outputs, to a speaker, an indication sound for indicating the vehicle approached by the emergency vehicle.

In an analogous art, Iwamoto et al. (Iwamoto) discloses a vehicle remote instruction system wherein the indication part outputs, to a speaker, an indication sound for indicating the vehicle approached by the emergency vehicle (via information providing unit 12 of the remote instruction server providing sound information to remote operator R through the speaker of output unit 3a of the operator interface, Figs. 1, 4). It would have been obvious to one of ordinary skill in the art, having the teachings of Cogna, Tariq, and Iwamoto before the effective filing date of the claimed invention, to modify the system of Cogna to include the teachings of Iwamoto (information providing unit and interface) and configure them with the system of Cogna, so that the teleoperation operator has a speaker sounding the siren of the emergency vehicle approaching the subject vehicle. The motivation to combine the teachings is to receive information on the vehicles' surrounding situation and provide navigation guidance.

Allowable Subject Matter

11. Claims 2, 3 and 4 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
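For orientation, the independent claims mapped above recite a multi-vehicle pipeline: detect a siren in each vehicle's audio, estimate the emergency vehicle's position from the reporting vehicles' positions and sound data, then identify the approached vehicle from the time variation of that estimate. The sketch below is purely illustrative; all function names are hypothetical, and the loudness-weighted centroid is one simple stand-in for an estimation step the claims leave unspecified (neither the application nor the cited references is quoted here).

```python
import math

def detect_siren(clip, threshold=0.5):
    """Stand-in siren classifier: assumes each clip carries a
    precomputed score (a real system would classify raw audio)."""
    return clip["siren_score"] >= threshold

def estimate_source(readings):
    """Estimate the siren source as the loudness-weighted centroid of
    the reporting vehicles' positions (illustrative heuristic only)."""
    w = sum(r["loudness"] for r in readings)
    x = sum(r["x"] * r["loudness"] for r in readings) / w
    y = sum(r["y"] * r["loudness"] for r in readings) / w
    return (x, y)

def approached_vehicle(track, vehicles):
    """Use the time variation of the estimated position (last two
    estimates) to pick the vehicle the source is closing on fastest."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    best, best_gain = None, 0.0
    for vid, (vx, vy) in vehicles.items():
        d0 = math.hypot(vx - x0, vy - y0)   # distance at earlier estimate
        d1 = math.hypot(vx - x1, vy - y1)   # distance at later estimate
        if d0 - d1 > best_gain:             # largest shrink in distance
            best, best_gain = vid, d0 - d1
    return best
```

For example, with vehicles A at (0, 0) and B at (100, 0) and position estimates moving from (60, 0) to (40, 0), the source is closing on A, so `approached_vehicle` returns `"A"`.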
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MUHAMMAD SHAFI, whose telephone number is (571) 270-5741. The examiner can normally be reached M-F, 8:30 am - 5:00 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott Browne, can be reached at 571-270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MUHAMMAD SHAFI/
Primary Examiner, Art Unit 3666

Prosecution Timeline

Dec 11, 2024
Application Filed
Mar 04, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12587320
DISTANCE-BASED NACK PROCEDURES IN A VEHICULAR PLATOON
2y 5m to grant Granted Mar 24, 2026
Patent 12583440
ACTIVE SAFETY SUSPENSION SYSTEM
2y 5m to grant Granted Mar 24, 2026
Patent 12578721
SYSTEMS AND METHODS FOR REMOTE CONTROL OF VEHICLES
2y 5m to grant Granted Mar 17, 2026
Patent 12573251
INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND CONTROL APPARATUS
2y 5m to grant Granted Mar 10, 2026
Patent 12568871
SYSTEM AND METHOD FOR DETERMINING RESIDUE COVERAGE OF A FIELD
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
89%
Grant Probability
99%
With Interview (+16.7%)
2y 6m
Median Time to Grant
Low
PTA Risk
Based on 1100 resolved cases by this examiner. Grant probability derived from career allow rate.
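The headline Grant Probability follows directly from the career counts shown in the examiner summary (978 granted of 1100 resolved). A quick check, assuming the displayed figure is simply the career allow rate rounded to the nearest percent (the dashboard's exact rounding rule is an assumption here):

```python
# Career counts from the examiner summary above.
granted, resolved = 978, 1100

allow_rate = granted / resolved      # about 0.889
print(round(allow_rate * 100))       # 89, matching the Grant Probability shown
```

The "+16.7% Interview Lift" is reported against resolved cases with an interview, so it is not simply added to the 89% baseline; it reflects the gap between allow rates with and without an interview in this examiner's history.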
