Prosecution Insights
Last updated: April 19, 2026
Application No. 18/028,517

FIBRE OPTIC SENSING METHOD AND SYSTEM FOR GENERATING A DYNAMIC DIGITAL REPRESENTATION OF OBJECTS AND EVENTS IN AN AREA

Status: Non-Final OA (§103)
Filed: Mar 24, 2023
Examiner: CHOU, SHIEN MING
Art Unit: 3666
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Fiber Sense Limited
OA Round: 3 (Non-Final)

Grant Probability: 57% (Moderate)
Expected OA Rounds: 3-4
Estimated Time to Grant: 4y 4m
Grant Probability with Interview: 88%

Examiner Intelligence

Career Allow Rate: 57% (54 granted / 95 resolved; +4.8% vs TC avg)
Interview Lift: +30.8% (strong; allowance rate with vs. without an interview, among resolved cases with an interview)
Typical Timeline: 4y 4m average prosecution; 28 applications currently pending
Career History: 123 total applications across all art units

Statute-Specific Performance

§101: 14.8% (-25.2% vs TC avg)
§103: 49.3% (+9.3% vs TC avg)
§102: 14.8% (-25.2% vs TC avg)
§112: 20.3% (-19.7% vs TC avg)

Based on career data from 95 resolved cases; Tech Center averages are estimates.
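The headline figures above are simple derived statistics and can be recomputed from the reported counts as a sanity check. A minimal sketch (the 52.0% Tech Center average is inferred from the stated +4.8% delta; the helper name is illustrative):

```python
# Recompute the headline examiner statistics from the raw counts
# reported above (54 granted of 95 resolved cases).

def pct(n: int, d: int) -> float:
    """Percentage, rounded to one decimal place."""
    return round(100.0 * n / d, 1)

granted, resolved = 54, 95
career_allow_rate = pct(granted, resolved)           # 56.8, displayed as 57%
tc_avg = 52.0                                        # inferred from the +4.8% delta
delta_vs_tc = round(career_allow_rate - tc_avg, 1)   # +4.8

print(f"Career allow rate: {career_allow_rate}% ({delta_vs_tc:+}% vs TC avg)")
```

The per-statute percentages in the table follow the same pattern against their respective Tech Center baselines.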

Office Action (§103)
DETAILED ACTION

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/17/2025 has been entered.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This action is in response to the amendment filed on 12/17/2025 for application 18/028,517. Claims 1-16 and 18-21 are pending and have been examined. Claims 1, 5-6, 10-11, 15, and 18-19 are amended.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art.
The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term "means" or "step" or a term used as a substitute for "means" that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term "means" or "step" or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word "for" (e.g., "means for") or another linking word or phrase, such as "configured to" or "so that"; and

(C) the term "means" or "step" or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word "means" (or "step") in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Such claim limitations are: "means for generating a zone feature dataset", "means for generating an object tracking dataset", "means for generating an event dataset", and "means for generating a dynamic representation" in Claim 18. Structure for these limitations may be found at least in Fig. 1 and pages 11-12 of the specification of the instant application as "processing unit 114".
Absence of the word "means" (or "step") in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. That presumption is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Such a claim limitation is: "a distributed sensing unit for repeatedly transmitting" in Claim 19. Structure for this limitation may be found at least in Fig. 1 and page 10 of the specification of the instant application as the "DFS system", which includes a C-OTDR, an optical circulator, and a receiver.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f), they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f), applicant may: (1) amend the claim limitations to avoid them being interpreted under 35 U.S.C. 112(f) (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function so as to avoid them being interpreted under 35 U.S.C. 112(f).

Response to Amendment

Applicant's amendment filed on 12/17/2025 has been entered.

Response to Arguments

Applicant's remarks filed on 12/17/2025 regarding the claim rejections under 35 U.S.C. 103 have been fully considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary.
Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 3-4, 6-9, 11-12, and 18-21 are rejected under 35 U.S.C. 103 as being unpatentable over Englund, US20200191613, in view of Ahmad et al. (hereinafter Ahmad), WO2015088313, and Englund (hereinafter Englund 2), US20220018980.

Regarding Claim 1, Englund discloses: A method of generating a dynamic representation of a plurality of objects and associated zones in a geographic area, the method comprising:

generating a zone feature dataset (0067, "digital representation of the nature and movement of the sound targets around the cable grid (zone)"; the data is collected on each grid), including identifying and classifying the associated zones in the geographic area, each zone being classified into a zone type ... having at least two object-sensed conditions (0078-0080, "Alert criteria are stored with the symbol index database at step 212, with each symbol having at least one associated alert criterion (threshold amplitude/frequency) ... For example, in the case of an excavator, the speed and direction of movement of the excavator is factored in ... an excavation or intrusion event detected at a location where there are no known operations or at a time of day where no operations are expected"; the location expecting excavation is one zone type (construction) and the location not expecting excavation is another zone type (non-construction); the presence of an excavator is one condition, and no excavator is another condition);

generating an object tracking dataset (0047, "process the acoustic data and classify it in accordance with the target classes or types to generate a plurality of datasets including classification, temporal and location-related data, and storing the datasets in the storage unit."), including tracking signals of at least some of the plurality of objects in the geographic area using a distributed fiber optic sensing network (0058, "Fig. 7 ... distribution geometry ... optical fibre network"), and processing the tracking signals to obtain object-specific tracking data;

generating an event dataset, including using the tracking data to determine when the conditions of the zones are changing (refer to the mapping above & 0077, "symbols representative of sound objects and/or sound events are generated and stored in the digital symbol index database. Each symbol index includes an event/object identifier with time and location stamp.");

isolating a ... strain signal from the distributed fibre optic sensing network (0102, "filter may be applied to efficiently locate all pedestrians within an area and then a much more specific set of filters could be applied to classify ... from low frequency pressure amplitude", "these filters are generally initially applied to the acoustic data at the time of collection, so as to enable the storage of symbols representative of object and activity type, though for higher resolution raw acoustic or optical data may be retrieved and reprocessed."), the ... strain signal being indicative of gross weight-induced changes in a region above a fibre optic cable (0064-0065, "acoustic data in this disclosure should be read as including any propagating wave or signal that imparts a detectable change in the optical properties of the sensing optical fibre", "localised strain changes in the optical fibre", "The raw optical data in the preferred embodiment is a stream of repeating reflection sets from a series of optical pulses directed down the sensing fibre"; 0090, "the optical fibres may include those installed underground, in which case the coverage of the geographical area includes the street level of a city, which is useful in monitoring vehicle and pedestrian traffic."; i.e., the fibre network senses the weight/impact above it);

using appearance or disappearance of the DC-band strain signal, and tracking data to determine when conditions of the zones are changing as a result of an object entering or exiting a zone (refer to the mapping above; the appearance and disappearance of the strain signal represents the status/existence/movement of the object above; 0105, "the symbol indices associated with pedestrian activity at the relevant time and location ... acoustic disturbances in the signals relating to pedestrian traffic (such as those caused by footsteps going into and leaving the shop)");

and digitizing and storing the conditions of the zones (refer to the mapping above; the event, time and location are stored as a digital symbol index in the database); and rendering a dynamic representation of the conditions of the zones (0093, "This may be incorporated on a GIS overlay, with digital symbols overlaid on the map, as is clear from FIG. 5B, which includes pedestrian and car symbols.").

Englund does not explicitly disclose: a zone type based on static and/or quasi-static zone identification features; a DC-band strain signal.

Ahmad, in the same field of endeavor, explicitly teaches: a zone type based on static and/or quasi-static zone identification features (Ahmad, page 3, ln. 16-26, "optical fiber sensor ... indicates a vehicle presence in the parking space"; Englund teaches a location-based real-time monitoring and querying system that provides information about detections based on the semantics/context of the location (Englund, 0078-0080); Ahmad teaches a monitoring application for parking spaces.
The combination renders obvious the use of a separate (static) zone type for parking spaces, providing different context of the object movement for the semantics engine to interpret the movement).

Englund and Ahmad both teach object sensing and data reporting using fiber optic technology and are analogous art. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable likelihood of success, to further include the parking-space zones of Ahmad's teaching in Englund's fiber optic sensing system to achieve the claimed teaching. One of ordinary skill in the art would have been motivated to make this modification to provide different context for the sensing result for the parking space.

The Englund and Ahmad combination does not explicitly teach: a DC-band strain signal.

Englund 2, in the same field of endeavor, explicitly teaches: a DC-band strain signal (Englund 2, 0045, "These propagating signals detected in the system may include signal types in addition to conventional acoustic signals such as low frequency seismic waves, other low frequency vibrations, and slowly varying and very low frequency (DC-type) signals such as weight-induced compression waves that induce for example localised strain changes in the optical fibre."; 0060-0061, "the DC band has significantly lower signal amplitude for the vehicle there are virtually no other ambient sound sources in this frequency band to introduce noise and hence to degrade the detection performance", "result in a higher signal to noise ratio (SNR) for moving object detection in DC-type band compared to higher frequency").

Englund (in view of Ahmad) and Englund 2 both teach object sensing and data reporting using fiber optic technology and are analogous art. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable likelihood of success, to further include the use of the DC-type band signal taught by Englund 2 in Englund's (in view of Ahmad) fiber optic sensing system to achieve the claimed teaching. One of ordinary skill in the art would have been motivated to make this modification in order to achieve a higher signal-to-noise ratio (Englund 2, 0061).

Regarding Claim 3, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1. The combination further teaches: at least a portion of the object tracking dataset is generated as a layer and rendered or fused on a map platform or GIS overlay (refer to the mapping in Claim 1; the data is used to create a GIS overlay map).

Regarding Claim 4, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1. The combination further teaches: the objects include vehicles and the object-sensed conditions of the zones include an occupied state and a vacant state (refer to the mapping in Claim 1, & Ahmad, page 2, "detect presence and absence of a vehicle").

Regarding Claim 6, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1. The combination further teaches: generating the zone feature dataset includes using the static and/or quasi-static zone identification features including at least one of parking area signs and road markers, drop off and pick up area signs and road markers, off-street car parking spot signs and road markers, loading zone signs and road markers, public transport stop signs and road markers, traffic light zones or areas, and petrol stations (refer to the mapping in Claim 1 & Ahmad, fig. 1 & page 2, "identifying and allocating a parking space for a vehicle."; the identity and location of the parking space on the map can be used to recognize that the zone is for parking).
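For readers unfamiliar with the disputed limitation, the claimed use of a DC-band strain signal (whose appearance or disappearance flags a zone condition change) can be illustrated with a short sketch. This is not the applicant's or Englund 2's implementation; it assumes a simple moving-average low-pass filter and arbitrary cutoff/threshold values, all illustrative:

```python
import numpy as np

def dc_band_occupancy(strain: np.ndarray, fs: float,
                      cutoff_hz: float = 0.5, threshold: float = 1.0) -> np.ndarray:
    """Flag per-sample zone occupancy from a distributed-sensing strain trace.

    A moving-average low-pass filter isolates the quasi-static (DC-band)
    component, which Englund 2 associates with weight-induced compression.
    Its appearance above a threshold is read as an object entering the zone,
    and its disappearance as the object exiting. The cutoff and threshold
    are illustrative values, not taken from the record.
    """
    window = max(1, int(fs / cutoff_hz))   # samples per averaging window
    kernel = np.ones(window) / window
    dc_band = np.convolve(strain, kernel, mode="same")
    return np.abs(dc_band) > threshold     # True while the zone is occupied

# Synthetic trace: quiet, then a parked vehicle's weight-induced offset, then quiet.
fs = 10.0
trace = np.concatenate([np.zeros(100), 5.0 * np.ones(200), np.zeros(100)])
occupied = dc_band_occupancy(trace, fs)
# Transitions of `occupied` mark the zone-entry and zone-exit events.
```

Transitions of the boolean output (False to True, True to False) correspond to the claimed "appearance or disappearance" of the DC-band signal.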
Regarding Claim 7, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1. The combination further teaches: the tracking data is used to determine when the objects change the condition or state of the zones by entering or exiting the zones (Englund, 0105, "the symbol indices associated with pedestrian activity at the relevant time and location ... acoustic disturbances in the signals relating to pedestrian traffic (such as those caused by footsteps going into and leaving the shop)").

Regarding Claim 8, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1. The combination further teaches: the tracking data associated with an object track is used to determine when the object changes the condition or state of a zone by determining when the track is terminating or beginning proximate the zone with the static and/or quasi-static zone identification features (Englund, 0105-0106, "track any footsteps leaving the shops to where the footsteps end (terminating). This could also be achieved by searching the pedestrian symbol index for the time and location from which pedestrian tracking information could be generated. To anticipate the possibility of the thief getting away in a vehicle, the processing unit 114 may be configured to then track any subsequent vehicle movements originating (beginning) from where those footsteps are tracked to (proximate), or by searching the vehicle symbol index and correlating this with the pedestrian index to identify potential crossover locations where pedestrian activity"; the pedestrian leaving the shop changes the condition of the shop (zone) from having a customer to not having a customer; the pedestrian track ends at a vehicle, and the subsequent vehicle movement changes the condition of the street/parking lot from having a pedestrian to not having a pedestrian, and from having a stopped vehicle to having a moving vehicle).
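The Claim 8 logic (a track terminating proximate a zone marks entry, a track beginning proximate a zone marks exit) can be sketched as follows. The data structures, distances, and the `zone_events` helper are illustrative assumptions for exposition, not anything from the record:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Zone:
    name: str
    x: float
    y: float
    radius: float   # zone extent around its centre, in metres (illustrative)

def zone_events(track: list[tuple[float, float]], zones: list[Zone],
                proximity_m: float = 5.0) -> list[str]:
    """Derive zone condition changes from where a track begins and ends.

    Mirrors the claimed logic: a track terminating proximate a zone marks
    the object entering it (e.g. a parking spot becoming occupied); a track
    beginning proximate a zone marks the object exiting it (becoming vacant).
    """
    events = []
    for zone in zones:
        start, end = track[0], track[-1]
        if hypot(end[0] - zone.x, end[1] - zone.y) <= zone.radius + proximity_m:
            events.append(f"{zone.name}: occupied")
        if hypot(start[0] - zone.x, start[1] - zone.y) <= zone.radius + proximity_m:
            events.append(f"{zone.name}: vacant")
    return events

# A vehicle track ending inside a parking zone flips that zone to "occupied".
bay = Zone("parking_bay_3", x=100.0, y=20.0, radius=3.0)
print(zone_events([(0.0, 0.0), (50.0, 10.0), (99.0, 21.0)], [bay]))
```

In a real system the tracks would come from the distributed fibre optic sensing network and the zones from the zone feature dataset; this sketch only shows the begin/end proximity test itself.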
Regarding Claim 9, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1. The combination further teaches: the tracking data is passed through a semantics engine to make the determination (Englund, 0045, "The processing unit may include a semantics engine to assess").

Regarding Claim 11, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1. The combination further teaches: the step of generating the object tracking dataset using the distributed fibre optic sensing network includes: repeatedly transmitting, at multiple instants, interrogating optical signals into each of one or more optical fibres distributed across the geographical area and forming at least part of a fibre-optic communications network (Englund, 0046, "an optical signal transmitter arrangement for repeatedly transmitting, at multiple instants, interrogating optical signals into each of one or more optical fibres distributed across the geographical area and forming at least part of an installed fibre-optic communications network;"); receiving, during an observation period following each of the multiple instants, returning optical signals scattered in a distributed manner over distance along the one or more optical fibres, wherein scattering of the optical signals is influenced by acoustic disturbances caused by multiple objects within the observation period (Englund, 0046, "an optical signal detector arrangement for receiving, during an observation period following each of the multiple instants, returning optical signals scattered in a distributed manner over distance along the one or more of optical fibres, the scattering influenced by acoustic disturbances caused by the multiple targets within the observation period;"); demodulating acoustic data from the optical signals; and processing the acoustic data to identify tracks made by the objects over a period of time across the geographic area (Englund, 0047, "system may include a processing unit configured to demodulate acoustic data from the optical signals; process the acoustic data and classify it in accordance with the target classes or types to generate a plurality of datasets including classification, temporal and location-related data, and storing the datasets in the storage unit.").

Regarding Claim 12, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 11. The combination further teaches: the step of generating the object tracking dataset using the distributed fibre optic sensing network further includes using beamforming techniques (Englund, 0071-0072, "Beamforming can therefore be used to ensure the area that is covered by the sensing range of the fibre grid has minimal gaps or areas where a sound source may not be detected.").

Regarding Claim 18, Claim 18 is the corresponding system claim of Claim 1. The combination of Englund and Ahmad further teaches the system comprising means (Englund, fig. 1, the system 100 and means including processing unit 114 to perform the methods). Claim 18 is rejected for the same reasons.

Regarding Claim 19, Claim 19 is the corresponding system claim of Claim 11. The combination of Englund and Ahmad further teaches a distributed sensing unit (Englund, 0046, "optical signal transmitter ... optical signal detector"). Claim 19 is rejected for the same reasons.

Regarding Claims 20 and 21, these are the system claims corresponding to Claims 14 and 9. Claims 20 and 21 are rejected for the same reasons.

Claims 2 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Englund, US20200191613, in view of Ahmad et al. (hereinafter Ahmad), WO2015088313, and Englund (hereinafter Englund 2), US20220018980, as applied to Claim 1, and further in view of Davidson, "Meteorologist Ryan Davidson Explains Weather Maps".

Regarding Claim 2, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1.
The combination does not explicitly teach: at least portions of the zone feature dataset and the event dataset are generated as layers and rendered or fused on a map platform or GIS overlay.

Davidson, in the same field of endeavor, explicitly teaches: at least portions of the zone feature dataset and the event dataset are generated as layers and rendered or fused on a map platform or GIS overlay (Davidson, page 1, the zones of rain, snow and sleet are overlaid on a GIS map).

Englund (in view of Ahmad and Englund 2) and Davidson both teach GIS overlays with event data and are analogous art. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable likelihood of success, to further include the zone overlay disclosed by Davidson in Englund's (in view of Ahmad and Englund 2) fiber optic sensing system to achieve the claimed teaching. One of ordinary skill in the art would have been motivated to make this modification to provide a visual representation of zone events to the public.

Regarding Claim 10, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1. The combination does not explicitly teach: rendering the dynamic digital representation of the conditions of the zones on a GIS overlay or map platform. Davidson, in the same field of endeavor, explicitly teaches: rendering the dynamic digital representation of the conditions of the zones on a GIS overlay or map platform (Davidson, page 1, the zones of rain, snow and sleet are rendered and overlaid on a GIS map). The reason for the combination is the same as for Claim 2.

Claims 5 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Englund, US20200191613, in view of Ahmad et al. (hereinafter Ahmad), WO2015088313, and Englund (hereinafter Englund 2), US20220018980, as applied to Claim 1, and further in view of Knoeppel et al. (hereinafter Knoeppel), DE102018007567.
Regarding Claim 5, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1. The combination does not explicitly teach: the object-sensed conditions of the zones include zone surface-altering conditions, including the presence or otherwise of ice, snow or rain/water/spills.

Knoeppel, in the same field of endeavor, explicitly teaches: the object-sensed conditions of the zones include zone surface-altering conditions, including the presence or otherwise of ice, snow or rain/water/spills (Knoeppel, translation page 2, "the classification of the acoustic pattern is selected from a group consisting of sleet, hail (11), rain (13), snow and/or wind"; i.e., the classification of each zone can be rain (quasi-static) or no rain (static)).

Englund (in view of Ahmad and Englund 2) and Knoeppel both teach environment sensing and alert applications and are analogous art. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable likelihood of success, to further include the road surface sensing of Knoeppel in Englund's (in view of Ahmad and Englund 2) fiber optic sensing system to achieve the claimed teaching. One of ordinary skill in the art would have been motivated to make this modification because "for safety reasons, information about the current road conditions is necessary for autonomous vehicles" (Knoeppel, translation page 2).

Regarding Claim 15, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 1.
The combination does not explicitly teach: identifying and classifying the associated zones in the geographic area includes training the object-specific tracking data in a neural network.

Knoeppel, in the same field of endeavor, explicitly teaches: identifying and classifying the associated zones in the geographic area includes training the object-specific tracking data in a neural network (Knoeppel, translation page 2, "trained to recognize acoustic patterns, preferably a neural network ... a neural network (9) for the detection and classification of acoustic patterns").

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable likelihood of success, to further include the neural network classifier of Knoeppel's teaching in Englund's fiber optic sensing system to achieve the claimed teaching. One of ordinary skill in the art would have been motivated to make this modification in order to classify the acoustic patterns.

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Englund, US20200191613, in view of Ahmad et al. (hereinafter Ahmad), WO2015088313, and Englund (hereinafter Englund 2), US20220018980, as applied to Claim 12, and further in view of Cho et al. (hereinafter Cho), "Adaptive near-field beamforming techniques for sound source imaging".

Regarding Claim 13, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 12. The combination does not explicitly teach: the beamforming techniques include at least one of a far field beamforming technique and a near field beamforming technique. Cho, in the same field of endeavor, explicitly teaches: the beamforming techniques include at least one of a far field beamforming technique and a near field beamforming technique (Cho, sec. II.A., "Beamforming is an effective technique to image sound sources using near-field measurements of the sound pressure field").
Englund (in view of Ahmad and Englund 2) and Cho both teach acoustic beamforming and are analogous art. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable likelihood of success, to further include the near field beamforming of Cho in Englund's (in view of Ahmad and Englund 2) fiber optic sensing system to achieve the claimed teaching. One of ordinary skill in the art would have been motivated to make this modification "to provide very effective acoustic source imaging capabilities" (Cho, sec. V).

Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Englund, US20200191613, in view of Ahmad et al. (hereinafter Ahmad), WO2015088313, and Englund (hereinafter Englund 2), US20220018980, as applied to Claim 11, and further in view of Smith et al. (hereinafter Smith), US8077539.

Regarding Claim 14, the Englund, Ahmad and Englund 2 combination renders obvious all the limitations of Claim 11. The combination does not explicitly teach: the one or more optical fibres are supplemented with dedicated or purpose-built trenches or pico-trenches to provide additional sensing coverage.

Smith, in the same field of endeavor, explicitly teaches: the one or more optical fibres are supplemented with dedicated or purpose-built trenches or pico-trenches to provide additional sensing coverage (Smith, col. 2, ln. 29-35, "The reflector may be in the shape of ... a cylinder with the circular cross section orthogonal to the generator. In the latter case the reflector would be in the form of a long continuous system"; col. 1, ln. 22-30, "be capable of producing a strong reflected acoustic output response (i.e. high target strength)"; positioning the sensor/receiver (optical fibres) in the middle of a reflective surface (trenches) increases the acoustic response).

Englund (in view of Ahmad and Englund 2) and Smith both teach acoustic sensing applications and are analogous art.
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable likelihood of success, to further include the reflector of Smith's teaching in Englund's (in view of Ahmad and Englund 2) fiber optic sensing system to achieve the claimed teaching. One of ordinary skill in the art would have been motivated to make this modification to increase the acoustic sensing response (Smith, col. 1, ln. 22-30).

Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Englund, US20200191613, in view of Ahmad et al. (hereinafter Ahmad), WO2015088313, Englund (hereinafter Englund 2), US20220018980, and Knoeppel et al. (hereinafter Knoeppel), DE102018007567, as applied to Claim 15, and further in view of Blayvas et al. (hereinafter Blayvas), US20190005387.

Regarding Claim 16, the Englund, Ahmad, Englund 2 and Knoeppel combination renders obvious all the limitations of Claim 15. The combination does not explicitly teach: the object-specific tracking data is trained with non-acoustic sources of data in the neural network.

Blayvas, in the same field of endeavor, explicitly teaches: the object-specific tracking data is trained with non-acoustic sources of data in the neural network (Blayvas, Fig. 1 & 0018-0019, "the method includes training the neural network to generate a multi-regional neural network, by a loss function including a member depending on a classification parameter and a location of a network node in the neural network ... the sensor data comprises at least one of image data, depth data and sound data"; i.e., the neural network is trained with the sound data, image data and depth/radar/lidar data).

Englund (in view of Ahmad, Englund 2 and Knoeppel) and Blayvas both teach environment sensing using multiple sensors and are analogous art.
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable likelihood of success, to further include the multi-regional neural network of Blayvas's teaching in the fiber optic sensing system of Englund (in view of Ahmad, Englund 2 and Knoeppel) to achieve the claimed teaching. One of ordinary skill in the art would have been motivated to make this modification to make the neural network adaptive to the environment (Blayvas, 0060).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHIEN MING CHOU, whose telephone number is (571) 272-9354. The examiner can normally be reached Monday through Friday, 9 am to 5 pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, HELAL ALGAHAIM, can be reached at (571) 270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SHIEN MING CHOU/
Examiner, Art Unit 3666

/HELAL A ALGAHAIM/
SPE, Art Unit 3666

Prosecution Timeline

Mar 24, 2023
Application Filed
Mar 03, 2025
Non-Final Rejection — §103
Jun 05, 2025
Applicant Interview (Telephonic)
Jun 05, 2025
Response Filed
Jun 12, 2025
Examiner Interview Summary
Jun 16, 2025
Final Rejection — §103
Oct 15, 2025
Request for Continued Examination
Oct 29, 2025
Response after Non-Final Action
Jan 21, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602583
METHOD OF TRAINING A NEURAL NETWORK TO CONTROL AN AIRCRAFT SYSTEM
2y 5m to grant · Granted Apr 14, 2026
Patent 12585954
PERFORMANCE OF AUTONOMOUS VEHICLE OPERATION IN VARYING CONDITIONS BY USING IMAGERY GENERATED WITH MACHINE LEARNING FOR SIMULATIONS
2y 5m to grant · Granted Mar 24, 2026
Patent 12576833
DEVICE AND METHOD OF CONTROLLING REMOTE PARKING ASSIST FUNCTION
2y 5m to grant · Granted Mar 17, 2026
Patent 12565237
OFF-BOARD PERCEPTION BASED ON SEQUENCE TO SEQUENCE SENSOR DATA GENERATION
2y 5m to grant · Granted Mar 03, 2026
Patent 12502994
METHODS AND SYSTEMS FOR MANAGING DEMAND FOR ELECTRIC VEHICLE CHARGING IN A TENANT ENVIRONMENT AND RELATED APPLICATIONS AND DEVICES
2y 5m to grant · Granted Dec 23, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
57%
Grant Probability
88%
With Interview (+30.8%)
4y 4m
Median Time to Grant
High
PTA Risk
Based on 95 resolved cases by this examiner. Grant probability derived from career allow rate.
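The headline projections above follow directly from the examiner counts reported on this page (54 granted of 95 resolved, +30.8% interview lift). A minimal sketch of that arithmetic, with illustrative variable names (the dashboard's actual model is not published, so the additive interview adjustment is an assumption):

```python
# Derive the dashboard's headline metrics from the examiner's career data.
# Inputs are the figures shown on this page; names are illustrative.
granted = 54            # cases this examiner allowed
resolved = 95           # total resolved cases (allowed + abandoned)
interview_lift = 0.308  # observed lift among resolved cases with an interview

allow_rate = granted / resolved               # career allow rate
with_interview = allow_rate + interview_lift  # naive additive estimate

print(f"Grant probability: {allow_rate:.0%}")      # prints 57%
print(f"With interview:    {with_interview:.0%}")  # prints 88%
```

Rounded to whole percentages, these reproduce the 57% baseline and 88% with-interview figures shown in the projections panel.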
