Prosecution Insights
Last updated: April 19, 2026
Application No. 18/192,481

PAIRED-CAMERA ROAD SURFACE LEAK DETECTION

Final Rejection — §103
Filed: Mar 29, 2023
Examiner: HOANG, HAN DINH
Art Unit: 2661
Tech Center: 2600 — Communications
Assignee: Torc Robotics, Inc.
OA Round: 2 (Final)
Grant Probability: 74% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 2m
Grant Probability With Interview: 93%

Examiner Intelligence

Career Allow Rate: 74% (above average) — 120 granted / 162 resolved, +12.1% vs TC avg
Interview Lift: +19.3% (strong), measured over resolved cases with interview
Typical Timeline: 3y 2m average prosecution; 25 applications currently pending
Career History: 187 total applications across all art units

Statute-Specific Performance

§101: 6.9% (-33.1% vs TC avg)
§103: 65.7% (+25.7% vs TC avg)
§102: 15.5% (-24.5% vs TC avg)
§112: 7.1% (-32.9% vs TC avg)
Comparisons are against Tech Center average estimates • Based on career data from 162 resolved cases
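The headline probabilities above are simple derivations from the examiner's stated career counts. A quick sanity check (assuming, as the figures above state, 120 grants out of 162 resolved cases and a +19.3 percentage-point interview lift applied additively) reproduces the displayed numbers:

```python
# Sanity-check the dashboard's derived statistics from the stated raw counts.
# Assumptions (taken from the figures above): 120 granted of 162 resolved
# cases; interviews add +19.3 percentage points to grant probability.
granted, resolved = 120, 162

allow_rate = 100 * granted / resolved          # career allow rate, percent
interview_lift = 19.3                          # percentage points
with_interview = allow_rate + interview_lift   # naive additive estimate

print(f"Career allow rate: {allow_rate:.1f}%")      # ~74.1%, shown as 74%
print(f"With interview:    {with_interview:.1f}%")  # ~93.4%, shown as 93%
```

The additive model is the dashboard's apparent convention; the true conditional probability given an interview could differ.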

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's amendment filed 08/07/2025 has been entered and made of record. Claims 1, 2, 3, 6, 11, 12 and 13 are amended. No new claims were added. No claims were cancelled. Claims 1-15 are pending.

Applicant's remarks in view of the newly presented amendments have been considered but are not found to be persuasive for at least the following reasons. The applicant argues on page 7 of the remarks filed that the amendments to the independent claims would overcome the previously cited prior art. The Examiner agrees that the amendments do appear to overcome the previously cited prior art. However, after further search and consideration, the newly discovered prior art of Dudek (US PG-Pub US 20180259418 A1) discloses wherein the differential image features include a frequency of a spot associated with a fluid leak over a plurality of first and second images: in ¶[0004], the position of the fluid leak is determined by comparing the acquired image to a second portion of image data and spectral data, and ¶[0025] discloses that the fluid detector is able to determine the frequency of each fluid associated with a leak at a location in the spectral data. Please see the updated claim rejection under 35 U.S.C. § 103 below.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 5-6, 10-11 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Baek et al. (US PG-Pub US 20160101734 A1) in view of Sarwar et al. (US PG-Pub US 20190392656 A1) in view of Dudek (US PG-Pub US 20180259418 A1).

Regarding Claim 1, Baek teaches a method comprising:

receiving, by a processor, a front image from a front camera of an automated vehicle and a rear image from a rear camera of the automated vehicle ([0040] "Also, the rear view camera 195b and the front view camera 195d may be located respectively near a trunk switch and at or near an emblem."), the front camera is situated at a front portion of the automated vehicle and directed towards a road surface beneath the front portion of the automated vehicle, and the rear camera is situated at a rear portion of the automated vehicle and directed towards the road surface beneath the rear portion of the automated vehicle ([0038] "First, referring to FIG. 2A, the around view cameras 195a, 195b, 195c and 195d may be located respectively at the left side, the rear, the right side, and the front of the vehicle." [0039] "In particular, the left side view camera 195a and the right side view camera 195c may be located respectively in a case enclosing a left side view mirror and a case enclosing a right side view mirror." [0040] "Also, the rear view camera 195b and the front view camera 195d may be located respectively near a trunk switch and at or near an emblem." [0041] "A plurality of images captured by the around view cameras 195a, 195b, 195c and 195d may be transmitted to a processor 170 (see FIG. 3A or FIG. 3B) in the vehicle 200. The processor 170 (see FIG. 3A or FIG. 3B) combines the images to generate an around view image." As disclosed in ¶[0038]-¶[0041], the prior art uses a front camera that faces the road and a rear camera that is positioned at the back of the vehicle in order to capture the road image in front of and behind the vehicle.),

wherein the front images and the rear images are captured while the automated vehicle is traveling (¶[0007] discloses that the front view camera captures images in front of the vehicle, and ¶[0008] discloses that the rear camera captures images behind the vehicle during motion.),

determining, by the processor, one or more front image features from the front image, and one or more rear image features from the rear image ([0077] "The processor 170 may acquire a plurality of images from the top view cameras 195a to 195d and combine the acquired images to generate an around view image." [0078] "Also, the processor 170 may perform signal processing based on computer vision. For example, the processor 170 may calculate disparity for a view around or under the vehicle based on the created under vehicle image or the generated around view image, detect an object in the image based on calculated disparity information, and continuously track motion of the object after detection of the object." [0079] "In particular, during detection of the object, the processor 170 may perform lane detection, adjacent vehicle detection, pedestrian detection, and road surface detection, for example." ¶[0077]-¶[0079] disclose using images acquired from the front camera and rear camera to detect objects in the images received by a processor.),

and identifying, by the processor, a fluid leak of the automated vehicle using the one or more differential image features by applying an object recognition engine on the one or more differential image features (¶[0053] "In particular, the under vehicle image 211 may assist the driver in checking, for example, the alignment of vehicle tires, tire wear, leakage of engine oil from an engine, deterioration of under vehicle components, and road surface conditions." [0060] "The under vehicle image provision apparatus 100 may perform object detection, verification, and tracking with respect to an object located near the vehicle based on a plurality of images received from the bottom view cameras 196a to 196h." As disclosed in ¶[0053], the prior art determines a fluid leak of the vehicle by using the captured under vehicle image, and ¶[0060] discloses using an object detection engine to determine the fluid leak.).

Baek does not explicitly teach determining, by the processor, one or more differential image features based upon comparing the one or more front image features against the one or more rear image features from the rear image.

Sarwar teaches determining, by the processor, one or more differential image features based upon comparing the one or more front image features against the one or more rear image features from the rear image ([0052] "The controller 101 of the apparatus that detects a leak of a vehicle fluid 100 may also be configured to compare the first image and the second image to detect whether the leak is present by comparing colors of pixels of the first image to colors of pixels of the second image. The first and second image may each comprise a plurality of images. The controller 101 may compare the first image and the second image to detect whether the leak is present by determining a location of the leak based on the comparing of the colors, determining if the location of the leak corresponds to a location on the vehicle where a leak may occur". ¶[0052] discloses comparing two separate images based on pixel data to determine if a leak has occurred.).

It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the claimed invention as taught by Baek with Sarwar in order to determine image feature differences to identify a leak. One skilled in the art would have been motivated to modify Baek in this manner in order to detect a leak of vehicle fluid and determine the potential source of the leak.
(Sarwar, ¶[0002])

However, Baek and Sarwar do not explicitly teach wherein the differential image features include a frequency of a spot associated with a fluid leak over a plurality of first and second images.

Dudek teaches wherein the differential image features include a frequency of a spot associated with a fluid leak over a plurality of first and second images (¶[0004] "The output signal can include a location of the fluid leak. The location of the fluid leak can include the geographical location, a distance between the vehicle and the fluid leak determined based on at least a first portion of the at least one image and at least a second portion of the spectral data, and an orientation. The output signal can indicate an amount of the fluid relative to the geographical location. The amount of fluid can be compared to a set amount. The at least one image and the spectral data can be concurrently captured at the geographical location." [0025] "The receiver of the fluid detector 134 is configured to receive at least a portion of the light beam that intersected and/or was reflected by fluids within a target area (e.g., located 1 meter to 200 meters away from the fluid detector 134). The receiver can have a bandwidth of at least 10 nanometers, for example. The receiver is configured to process the portion of the light beam to generate spectral data. The spectral data correspond to the absorbance as a function of light frequency for each fluid within the target area (e.g., optical field of the receiver)." As disclosed in ¶[0004], the position of the fluid leak is determined by comparing the acquired image to a second portion of image data and spectral data; ¶[0025] discloses that the fluid detector is able to determine the frequency of each fluid associated with a leak at a location in the spectral data.).

It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the claimed invention as taught by Baek and Sarwar with Dudek in order to determine the frequency and location of the fluid leak between the images. One skilled in the art would have been motivated to modify Baek and Sarwar in this manner in order to incorporate vehicle-based hyperspectral imaging for detecting fluid leaks in fluid distribution networks. (Dudek, ¶[0002])

Regarding Claim 5, the combination of Baek, Sarwar and Dudek teaches the method according to claim 1, where Baek further teaches generating, by the processor, a notification indicating the fluid leak in response to identifying the fluid leak ([0230] "The processor 170 may perform object recognition and verification via an image signal processing in the under vehicle image 1210, and verify the spot 1215 due to leakage of engine oil. Once the spot 1215 is verified, the processor 170 may control output of an oil leakage message. This may assist the user in recognizing leakage of engine oil." ¶[0230] discloses alerting a user when there is a fluid leak related to the vehicle.).

Regarding Claim 6, claim 6 is considered a method claim substantially corresponding to claim 1. Please see the discussion of claim 1 above for a discussion of similar limitations.
Furthermore, Baek teaches a front overlay image comprising image data of a plurality of front images ([0040] "Also, the rear view camera 195b and the front view camera 195d may be located respectively near a trunk switch and at or near an emblem," and ¶[0007] discloses capturing images that are in front of the vehicle.), and a second overlay image comprising image data of a plurality of back images (¶[0008] discloses that the rear camera captures images behind the vehicle.), each front image received from a front camera fixed to a front portion of an automated vehicle, and each rear image received from a rear camera fixed to a rear portion of the automated vehicle (as disclosed in ¶[0038]-¶[0041], the prior art uses a front camera that faces the road and a rear camera that is positioned at the back of the vehicle in order to capture the road image in front of and behind the vehicle.).

Regarding Claim 10, it recites features similar to claim 5, and is rejected in the same manner, with the same art and reasoning applying.

Regarding Claim 11, it recites features similar to claim 1, and is rejected in the same manner, with the same art and reasoning applying.

Regarding Claim 15, it recites features similar to claim 5, and is rejected in the same manner, with the same art and reasoning applying.

Claims 2-4, 7-9 and 12-14 are rejected under 35 U.S.C. 103 as being unpatentable over Baek et al. (US PG-Pub US 20160101734 A1) in view of Sarwar et al. (US PG-Pub US 20190392656 A1) in view of Dudek (US PG-Pub US 20180259418 A1) in view of Weston et al. (US PG-Pub US 20240094046 A1).
Regarding Claim 2, while the combination of Baek, Sarwar and Dudek teaches the method according to claim 1, they do not explicitly teach: generating, by the processor, a first metadata stamp for the front images and a corresponding second metadata stamp for the rear images; and storing, by the processor into an image log, the front images, the first metadata stamp, the second metadata stamp, and the rear images.

Weston teaches generating, by the processor, a first metadata stamp for the front images and a corresponding second metadata stamp for the rear images; and storing, by the processor into an image log, the front images, the first metadata stamp, the second metadata stamp, and the rear images (¶[0043] "In such examples, the lost load detection circuitry 208 stores the load difference with the indication in case some load difference is identified. For example, a small load difference could be indicative of a start of a leak but may not be significant enough to be flagged. Additionally, in such examples, the lost load detection circuitry 208 stores a timestamp with the indication of the load difference and the location. In some examples, the lost load detection circuitry 208 is instantiated by processor circuitry executing lost load detection instructions and/or configured to perform operations such as those represented by the flowchart of FIG. 3". As disclosed in ¶[0043], the prior art determines a leak from the sensor data received in the vehicle and stores the timestamp of when the leak occurred.).

It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the claimed invention as taught by Baek, Sarwar and Dudek with Weston in order to determine a time at which a leak occurs. One skilled in the art would have been motivated to modify Baek, Sarwar and Dudek in this manner in order to detect lost loads. (Weston, ¶[0001])

Regarding Claim 3, the combination of Baek, Sarwar, Dudek and Weston teaches the method according to claim 2, where Weston further teaches selecting, by the processor, the rear images having the second metadata stamp according to the corresponding first metadata stamp of the front images associated with the rear images (¶[0034] "In the illustrated example, the cameras 112 include a rear camera, side cameras, and/or a vehicle 360° camera mounted on the vehicle 104. The cameras 112 can transmit data to the lost load control circuitry 102. In some examples, the lost load control circuitry 102 determines there is a leak underneath the vehicle 104 based on the data received from the cameras 112." ¶[0043] "In such examples, the lost load detection circuitry 208 stores the load difference with the indication in case some load difference is identified. For example, a small load difference could be indicative of a start of a leak but may not be significant enough to be flagged. Additionally, in such examples, the lost load detection circuitry 208 stores a timestamp with the indication of the load difference and the location. In some examples, the lost load detection circuitry 208 is instantiated by processor circuitry executing lost load detection instructions and/or configured to perform operations such as those represented by the flowchart of FIG. 3". As disclosed in ¶[0034], the image data acquired is from the front and rear cameras, and in ¶[0043] the prior art determines a leak from the sensor data received in the vehicle and stores the timestamp of when the leak occurred.).

It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the claimed invention as taught by Baek, Sarwar and Dudek with Weston in order to store the timestamp at which a leak occurs. One skilled in the art would have been motivated to modify Baek, Sarwar and Dudek in this manner in order to detect lost loads. (Weston, ¶[0001])

Regarding Claim 4, the combination of Baek, Sarwar, Dudek and Weston teaches the method according to claim 2, where Weston further teaches wherein the first metadata stamp and the second metadata stamp include at least one of a timestamp or a location stamp (¶[0045] "In some examples, the load location identification circuitry 210 identifies an area and/or time where a leak was initially identified based on the load differences and associated locations and timestamps stored in the database 222. In some examples, the load location identification circuitry 210 causes the load determination circuitry 206 to determine a rate at which a leak is occurring based on the data in the database 222." As disclosed in ¶[0045], the prior art determines a leak from the sensor data received in the vehicle and stores the timestamp of when and where the leak occurred.).

It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the claimed invention as taught by Baek, Sarwar and Dudek with Weston in order to store the timestamp and location at which a leak occurs. One skilled in the art would have been motivated to modify Baek, Sarwar and Dudek in this manner in order to detect lost loads. (Weston, ¶[0001])

Regarding Claims 7-9, they are substantially similar to claims 2-4 respectively, and are rejected in the same manner, with the same art and reasoning applying.

Regarding Claims 12-14, they are substantially similar to claims 2-4 respectively, and are rejected in the same manner, with the same art and reasoning applying.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HAN D HOANG whose telephone number is (571)272-4344. The examiner can normally be reached Monday-Friday 8-5.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, JOHN M VILLECCO, can be reached at 571-272-7319. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HAN HOANG/
Examiner, Art Unit 2661
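The independent claim at issue describes a concrete pipeline: capture road-surface images from front and rear cameras while driving, extract features from each, difference them, and flag a leak when a spot recurs across multiple image pairs (the "frequency of a spot" limitation the examiner maps to Dudek). A minimal sketch of that logic follows; every name, threshold, and the grayscale pixel-difference feature are illustrative assumptions, not the applicant's actual implementation or any cited reference's code:

```python
import numpy as np

def differential_spot_mask(front: np.ndarray, rear: np.ndarray,
                           threshold: float = 30.0) -> np.ndarray:
    """Mark pixels that differ strongly between front and rear road-surface
    images: a spot visible behind the vehicle but not ahead of it is a
    candidate leak. Inputs are same-size grayscale arrays (0-255)."""
    diff = np.abs(rear.astype(float) - front.astype(float))
    return diff > threshold

def leak_detected(front_images, rear_images,
                  min_frequency: float = 0.6) -> bool:
    """Approximate the claimed 'frequency of a spot': require the
    differential spot to recur in at least min_frequency of the
    front/rear image pairs before declaring a leak."""
    hits = sum(differential_spot_mask(f, r).any()
               for f, r in zip(front_images, rear_images))
    return bool(hits / max(len(front_images), 1) >= min_frequency)

# Toy example: a dark spot appears only in the rear (behind-vehicle) images.
front = [np.full((4, 4), 200, dtype=np.uint8)] * 3
rear = []
for _ in range(3):
    img = np.full((4, 4), 200, dtype=np.uint8)
    img[2, 2] = 80          # persistent spot on the road surface
    rear.append(img)

print(leak_detected(front, rear))  # True
```

In practice the claimed method applies an object recognition engine to the differential features rather than a raw threshold; the threshold here simply stands in for that step.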

Prosecution Timeline

Mar 29, 2023 — Application Filed
May 13, 2025 — Non-Final Rejection — §103
Jul 24, 2025 — Applicant Interview (Telephonic)
Jul 24, 2025 — Examiner Interview Summary
Aug 07, 2025 — Response Filed
Nov 14, 2025 — Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602835 — POINT CLOUD DATA TRANSMISSION DEVICE, POINT CLOUD DATA TRANSMISSION METHOD, POINT CLOUD DATA RECEPTION DEVICE, AND POINT CLOUD DATA RECEPTION METHOD (2y 5m to grant; granted Apr 14, 2026)
Patent 12602778 — INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM (2y 5m to grant; granted Apr 14, 2026)
Patent 12602918 — LEARNING DATA GENERATING APPARATUS, LEARNING DATA GENERATING METHOD, AND NON-TRANSITORY RECORDING MEDIUM HAVING LEARNING DATA GENERATING PROGRAM RECORDED THEREON (2y 5m to grant; granted Apr 14, 2026)
Patent 12592070 — IMAGE PROCESSING APPARATUS (2y 5m to grant; granted Mar 31, 2026)
Patent 12586364 — SINGLE IMAGE CONCEPT ENCODER FOR PERSONALIZATION USING A PRETRAINED DIFFUSION MODEL (2y 5m to grant; granted Mar 24, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 74%
With Interview: 93% (+19.3%)
Median Time to Grant: 3y 2m
PTA Risk: Moderate
Based on 162 resolved cases by this examiner. Grant probability derived from career allow rate.
