Prosecution Insights
Last updated: April 19, 2026
Application No. 18/377,963

MODULAR INFRASTRUCTURE INSPECTION PLATFORM

Status: Non-Final OA (§103)
Filed: Oct 09, 2023
Examiner: TARKO, ASMAMAW G
Art Unit: 2482
Tech Center: 2400 — Computer Networks
Assignee: Redzone Robotics Inc.
OA Round: 3 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 0m
Grant Probability With Interview: 81%

Examiner Intelligence

Career Allow Rate: 72% (284 granted / 395 resolved; +13.9% vs TC avg), above average
Interview Lift: +9.3% for resolved cases with interview (moderate lift)
Avg Prosecution: 3y 0m (typical timeline)
Total Applications: 419 across all art units (24 currently pending)
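The headline figures above are simple ratios of the career counts. As a sanity check, they can be reproduced directly; this is a minimal sketch assuming the dashboard rounds to whole percents and that the "vs TC avg" and "interview lift" deltas are percentage-point offsets (both are assumptions, not stated by the tool):

```python
# Sanity-check the dashboard's headline figures from the raw counts it reports.
# Assumes displayed percentages are whole-percent roundings of these ratios.
granted, resolved = 284, 395

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")           # 71.9%, shown as 72%

# "+13.9% vs TC avg" read as a percentage-point gap (an assumption):
print(f"Implied TC average: {allow_rate - 0.139:.1%}")  # about 58.0%

# "+9.3% interview lift" applied the same way:
print(f"With interview: {allow_rate + 0.093:.1%}")      # 81.2%, shown as 81%
```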

Statute-Specific Performance

§101: 3.4% (-36.6% vs TC avg)
§102: 23.9% (-16.1% vs TC avg)
§103: 58.2% (+18.2% vs TC avg)
§112: 4.4% (-35.6% vs TC avg)
Tech Center averages are estimates; based on career data from 395 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/19/2026 has been entered.

Remarks

Claims 1-20 were pending. Claims 1-2, 5, 8-9, 11-12, and 19-20 are amended. Claim 18 is cancelled. Claim 21 is added. Claims 1-17 and 19-21 are currently pending. The claim objections to claims 1, 11, and 20 are moot in view of applicant's amendment.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-17 and 19-21 are rejected under 35 U.S.C. 103 as being unpatentable over Stewart et al. (US 20190339210 A1, hereinafter “Stewart”) in view of Harmen et al. (US 20220028054 A1, hereinafter “Harmen”), and further in view of Starr et al. (US 11949989 B2, hereinafter “Starr”).

Regarding claim 1. (Currently Amended) Stewart discloses a device, comprising: a base infrastructure inspection unit (0051-0053; FIGS. 1A-1C; wherein a platform-mounted sensor arrangement for inspection; “[0053] FIGS. 1A-1C show views of a sensor arrangement 100 in accordance with embodiments described herein. FIG. 1A shows a top view of the sensor arrangement. According to embodiments described herein, the sensor arrangement 100 can float and/or operate under its own locomotion.
The sensor arrangement includes a support structure 102 to which a number of sensors are mounted. The support structure may comprise a floating platform, for example. The support structure 102 can comprise a single hull or multiple hulls. The support structure 102 can support one or more sensors. A cable 120 terminates at the rear 109 of the support structure 102. The cable 120 may be connected to a powered reel. The powered reel may be configured to provide locomotion to the sensor arrangement 100. For example, the reel can increase and/or decrease the length of cable payout. Mounted on the top 105 and near the front 103 of the sensor arrangement 100 is a camera 104. The camera 104 captures images above the flow line as the Platform 100 advances through a pipe. The camera 104 can be a low light, high definition CCTV, which captures images at a rate of 30 frames/sec, for example. The camera 104 may be a high-resolution digital camera with wide-angle lens. ...”); and a plurality of sensor units attached to the base infrastructure inspection unit (0053; Figures 1A-1C; “[0053] ... FIG. 1A shows a top view of the sensor arrangement. … the sensor arrangement 100 can float and/or operate under its own locomotion. The sensor arrangement includes a support structure 102 to which a number of sensors are mounted. The support structure may comprise a floating platform, for example. The support structure 102 can comprise a single hull or multiple hulls. The support structure 102 can support one or more sensors. A cable 120 terminates at the rear 109 of the support structure 102. The cable 120 may be connected to a powered reel. The powered reel may be configured to provide locomotion to the sensor arrangement 100. For example, the reel can increase and/or decrease the length of cable payout. Mounted on the top 105 and near the front 103 of the sensor arrangement 100 is a camera 104. The camera 104 captures images above the flow line as the Platform 100 advances through a pipe. 
The camera 104 can be a low light, high definition CCTV, which captures images at a rate of 30 frames/sec, for example. The camera 104 may be a high-resolution digital camera with wide-angle lens. ...”); the base infrastructure inspection unit comprising: a set of one or more processors (0059-0063 and 0137; Figures 2 and 38; “[0059] The system 200 includes a Data Preprocessor 204, a Machine Vision Processor 206, a Normalized Database 210, and a Data Quality Analysis Tool 208. The Machine Vision Processor 206 can be integral to the Data Preprocessor 204. Raw project data 201 is processed by the Data Preprocessor 204, which produces 3D pipe feature data, related contextual data, and/or metadata. The Data Preprocessor 204 includes a Quality Analysis Preprocessor Module that performs a number of operations, including pre-loading a reference geographic information system (GIS) datastore with all asset designations, locations, and/or nominal parameters for a given project designation...”); and a memory device having code executable by the one or more processors to: capture, using the plurality of sensor units, two or more data streams comprising infrastructure inspection data (0064-0065, 0119 and 0121-0122; Figures 1, 2 and 31; wherein inspection video data partitioned and distributed across a network, “[0065] Quality control (QC) is a set of activities for ensuring quality in products. The activities begin with identifying defects in the actual products produced. QC aims to identify (and correct) defects in the finished product. In the QC process implemented by the interactive software of the Data Quality Analysis Tool 208, the DA can review the entire CCTV video, verifying defects and the accuracy of corresponding fault coding, e.g., Pipeline Assessment Certification Program (PACP) codes that were identified in the field, while looking for additional defects that may have been missed by the certified field operator (CFO). 
...”); combine the infrastructure inspection data with metadata indicating synchronization between respective ones of the plurality of sensor units (0050-0052, 0055, 0056, 0059 and 0063; Figures 1 and 2; wherein parallel concurrently distributed metadata timestamping triggered data collected from the units, “[0055] It can be appreciated that the sensors 104, 106, 108, 110 mounted to the support structure 102 operate at different scan rates and, initially, are not synchronized in time or space. Each of the sensors 104, 106, 108, 110 may be coupled to a collection computer 130, 132, 134, 136 on the support structure 102. Data produced by the sensors 104, 106, 108, 110 is timestamped and synchronized using a “heartbeat” signal running between the different collection computers and any external computer, such as one on a support vehicle. The “heartbeat” signal is a network transmission control protocol (TCP), user datagram protocol (UDP), or other network protocol broadcast time signature in universal coordinated time (UTC-Z) used to synchronize the clocks on the collection computers 130, 132, 134, 136. Using the heartbeat signal may ensure that the logging software used to record the constituent sensor elements is recording with a similar clock. …”); and provide, using a network connection to a remote device, combined metadata and infrastructure inspection data based on a three-dimensional model (3D) model of the infrastructure (0051-0052, 0059-0060, 0063 and 0137; Figures 1-2 and 38; wherein other communication device using network interfaces, and bring together metadata and inspection data unified in aggregation within 3D model reconstruction visualization views, “[0052] ... reconstruction and processing parameters, aggregate models, and classifications are stored in an electronic platform, which provides for an extensible multi-dimensional data aggregate from which output is produced. 
The output comprises a 3D geo-referenced geometric reconstruction, performance deterioration model, and/or the rates of change therein. ... After data aggregation, final deliverables are produced in a form desired by the end customer.”, “[0137] The above-described methods can be implemented on a computer using well-known computer processors, memory units, storage devices, computer software, and other components. A high-level block diagram of such a computer is illustrated in FIG. 38. ... The computer 3800 may include one or more network interfaces 3850 for communicating with other devices via a network. ... The user interface 3860 may include I/O devices …”). Stewart failed to disclose a device, comprising: a plurality of sensor units attached to, and removable from the base infrastructure inspection unit in a modular manner; the base infrastructure inspection unit comprising: provide infrastructure inspection data for inclusion in a photorealistic image built on the infrastructure inspection data. Harmen, however, in the same field of endeavor, shows a device, comprising: a base infrastructure inspection unit (0082; Figure 5; “[0082] As may be appreciated, the described techniques permit for densely populating a model to produce a photo-realistic image or visual point cloud representation of an infrastructure asset. In one example, culling may be used to alter the transparency of the photorealistic image or part thereof, e.g., dynamically or via response to user input. This permits adding or removing data from the populated model or part thereof. 
In one example, culling or removal allows an end user to, e.g., via an interface element such as a slider or input element, to look through a front facing wall in a 3D structure to observe a rear facing wall.”); the base infrastructure inspection unit comprising: provide infrastructure inspection data for inclusion in a photorealistic image built on the infrastructure inspection data (0082; Claims 2 and 11; Figure 5; wherein the infrastructure inspection data for inclusion in a photorealistic image based on a three-dimensional model (3D) model of the infrastructure).

It would have been obvious to a person having ordinary skill in the art to combine the infrastructure inspection data for inclusion in a photorealistic image with the inspection system of Stewart in order to enhance the accuracy, efficiency, and communication of the inspection process and yield other industrial advantages.

Stewart in view of Harmen failed to show a device comprising a plurality of sensor units attached to, and removable from, the base infrastructure inspection unit in a modular manner. Starr, however, in the same field of endeavor, shows a device comprising a plurality of sensor units attached to, and removable from, the base infrastructure inspection unit in a modular manner (Column 4 lines 37-57; Figure 1; Claims 1 and 4-5; “(19) The pipe inspection robot 10 used by way of example for descriptive purposes includes a sensor component 12 and a chassis portion 14. The sensor component 12 is electrically and mechanically connected to the chassis portion 14. As shown in FIG. 1, the pipe inspection robot 10 may also include a riser portion 16 which is positioned between the sensor component 12 and the chassis portion 14 and is electrically and mechanically connected to each.
The riser portion 16 operates to increase the distance that the sensor component 12 is situated above the lowest portion of the pipe and may be utilized in larger pipe applications or environments, such as a large tunnel or chamber, to provide a desired vantage point for various sensing devices of the sensor component 12, including a camera 21. Additionally, riser portion 16 and sensor component 12 are modular, i.e., they may be coupled/decoupled to and from the pipe inspection robot 10. Functionality of the pipe inspection robot 10 may be implemented by a computing device and/or a computer program stored on a computer-readable medium, as further described herein.”).

It would have been obvious to a person having ordinary skill in the art to combine the modular sensor that can be coupled/decoupled, as shown by Starr, with the inspection system of Stewart in view of the infrastructure inspection data for inclusion in a photorealistic image, in order to give the flexibility to easily remove and replace the proper sensor that would be needed at any given time during an operation.

Regarding claim 2. (Currently Amended) Stewart further discloses the device of claim 1, wherein the memory device has code executable by the one or more processors to use a trigger to capture the two or more data streams at alternating periods (0055, 0063-0064 and 0066-0069; Figures 2-5; wherein triggering the partitioning of periodically distributed/different rates (alternating periods) of collection, “[0055] It can be appreciated that the sensors 104, 106, 108, 110 mounted to the support structure 102 operate at different scan rates and, initially, are not synchronized in time or space. Each of the sensors 104, 106, 108, 110 may be coupled to a collection computer 130, 132, 134, 136 on the support structure 102.
Data produced by the sensors 104, 106, 108, 110 is timestamped and synchronized using a “heartbeat” signal running between the different collection computers and any external computer, such as one on a support vehicle. …”, “[0066] FIG. 3A illustrates a process for collecting disparate data using a sensor arrangement configured to be deployed in a pipe in accordance with embodiments described herein. A sensor arrangement is deployed 302 within a pipe. According to embodiments, the sensor arrangement is a floating platform that is deployed in the pipe. Disparate data from sensors of the sensor arrangement is collected 304. The disparate data is synchronized 306. The synchronized data is stored 308 in a memory. The synchronized data is processed 310 to produce an output. The processed synchronized data is presented 312 to a user (e.g., a DA) via a user interface.”, “[0068] A file transfer protocol (FTP) data store 300B is queried 300C, 300D periodically by the processor front-end until a data upload is detected 300D. When the data upload is detected 300D, the project corresponding to the data upload is identified ...”). Regarding claim 3. 
(Previously Presented) Stewart discloses the device of claim 2, wherein to combine the infrastructure inspection data with metadata comprises including timing data, based on the trigger, in the metadata to produce time-referenced data of the two or more data streams (0050-0052 and 0056; Figures 1A-1C; “[0050] … systems and methods can comprise, or be implemented by, at least one processor, computer memory, and a software pre-processor employing numerical and computer vision algorithms for automatically transforming distinct datasets consisting of individually time-stamped measurements collected from a pipe interior combined with dynamic models to create a project-oriented normalized relational database….”, “[0052] … The output comprises a 3D geo-referenced geometric reconstruction, performance deterioration model, and/or the rates of change therein. … the outputs and/or deliverables are presented as multiple synchronous and contextual views to emphasize the relationships between the numerical and categorical attributes, geometry, location, and/or temporal dynamics. ...”). Regarding claim 4. (Original) Stewart discloses the device of claim 1, wherein the plurality of sensor units comprises visual sensor units (0053 and 0061; Figures 1A-1C; wherein ‘camera 104’ and ‘Machine Vision Processor 206’ are visual sensor units, “[0061] The Machine Vision Processor 206 is configured to perform image processing (e.g., noise filtering) of data produced by various sensors, such as the sonar 106 and LIDAR sensors 108, 110 (see FIG. 1) on data that has been converted into an appropriate visual context. …”). Regarding claim 5. 
(Currently Amended) Stewart discloses the device of claim 4, wherein the plurality of sensor units comprise one or more of a sonar unit, a lidar unit, and a laser profiler (0048, 0054 and 0061; Figures 1A-1C; “[0048] … Measuring sedimentation levels and volume can be done more efficiently without bypass pumping and service interruption with the addition of a sonar sensor. Profiling sensors such as sonar, Light Detection and Ranging LIDAR, and ring lasers may be used for precise measurement of minimum interior pipe diameter used for relining operations. ...”, “[0061] The Machine Vision Processor 206 is configured to perform image processing (e.g., noise filtering) of data produced by various sensors, such as the sonar 106 and LIDAR sensors 108, 110 (see FIG. 1) on data that has been converted into an appropriate visual context. …”). Regarding claim 6. (Original) Stewart discloses the device of claim 4, wherein: the plurality of sensor units comprises two or more cameras disposed on the base infrastructure inspection unit such that overlapping views are obtained from the two or more cameras (0058; Figure 2; “[0058] Advantageous features of the system 200 include fully automated processing of inspection data and inspection database population of results upon arrival from the field, unique software tools allowing for automated and semi-automated QC of data by statistical analyses and other methods, and a Co-registered Quad View presentation of the raw sensor and calculated data with user-controlled viewpoints.”); and to capture comprises capturing stereo-overlapping images from the two or more cameras (0064 and 0123-0124; Figures 2 and 34; co-registered camera views, “[0064] The second process is a set of tasks performed in interactive software implemented by the Data Quality Analysis Tool 208 shown in FIG. 2. The Data Quality Analysis Tool 208 may be referred to a quality control module herein. 
This interactive software provides a 3D view of the integrated pipe data combined with co-registered selectable 2D sensor slices and a unique 2D interactive selectable view of the relevant pipe parameters, along with a view of the project database. …”, “[0123] ... The co-registered view allows for user-controlled speed of translation through the pipe, …”). Regarding claim 7. (Original) Stewart discloses the device of claim 1, comprising a delivery unit configured for attachment with the base infrastructure inspection unit (0066; Figure 3A; a deployment platform, “[0066] FIG. 3A illustrates a process for collecting disparate data using a sensor arrangement configured to be deployed in a pipe ... A sensor arrangement is deployed 302 within a pipe. … the sensor arrangement is a floating platform that is deployed in the pipe. Disparate data from sensors of the sensor arrangement is collected 304. The disparate data is synchronized 306. The synchronized data is stored 308 in a memory. The synchronized data is processed 310 to produce an output. The processed synchronized data is presented 312 to a user (e.g., a DA) via a user interface.”). Regarding claim 8. (Currently Amended) Stewart further discloses the device of claim 7, wherein the delivery unit comprises one or more of a float system (0053 and 0066; Figures 1A-1C and 3A; a floating platform, “[0053] FIGS. 1A-1C show views of a sensor arrangement 100 ... FIG. 1A shows a top view of the sensor arrangement. … the sensor arrangement 100 can float and/or operate under its own locomotion. The sensor arrangement includes a support structure 102 to which a number of sensors are mounted. The support structure may comprise a floating platform, ...”). Starr further shows the delivery unit comprises the tractor system (Column 4 lines 7-37; Figure 1). 
The motivation used in the rejection of claim 1 to combine the Starr prior art with the Stewart in view of Harmen prior art still applies to the combination of the prior art in the rejection of claim 8.

Regarding claim 9. (Currently Amended) Claim 9 has similar limitations to those treated in the above rejections, more specifically in the rejection of claim 1, is met by the references as discussed above, and is rejected for the same reasons of obviousness as used in the rejection of claim 1 above.

Regarding claim 10. (Original) Starr further shows the device of claim 8, wherein the delivery unit comprises a tractor unit having tracks that cover substantially the entire width of the tractor unit (Figure 1). The motivation used in the rejection of claim 1 to combine the Starr prior art with the Stewart in view of Harmen prior art still applies to the combination of the prior art in the rejection of claim 10.

Regarding claims 11-12. (Currently Amended) Method claims 11-12 are drawn to the method of using the corresponding device claimed in claims 1-3. Therefore, method claims 11-12, which correspond to device claims 1-3, are rejected for the same reasons of obviousness as used above.

Regarding claim 13. (Previously Presented) Method claim 13 is drawn to the method of using the corresponding device claimed in claim 5. Therefore, method claim 13, which corresponds to device claim 5, is rejected for the same reasons of obviousness as used above.

Regarding claims 14-15. (Previously Presented) Method claims 14-15 are drawn to the method of using the corresponding device claimed in claim 6. Claim 15, however, further recites deriving distance information for one or more points within the stereo-overlapping images.
Stewart further discloses the method comprising deriving distance information for one or more points within the stereo-overlapping images (0058, 0135 and 0123-0124; “[0058] Advantageous features of the system 200 include fully automated processing of inspection data and inspection database population of results upon arrival from the field, unique software tools allowing for automated and semi-automated QC of data by statistical analyses and other methods, and a Co-registered Quad View presentation of the raw sensor and calculated data with user-controlled viewpoints.”, “[0135] According to embodiments described herein, the pivot 3730 contains a helical spring, with the force on the spring electro-mechanically measured to indicate the distance (depth) between the pivot 3730 and the pipe invert. The helical spring may allow the spoon 3740 to be lifted into a safe position if it is pulled back on the platform. For example, in cases in which the pipe is blocked and/or has collapsed, the spoon 3740 may be pulled back onto the platform to avoid damage to the spoon 3740.”). Therefore, method claims 14-15, which correspond to device claim 6, are rejected for the same reasons of obviousness as used above.

Regarding claim 16. (Previously Presented) Method claim 16 is drawn to the method of using the corresponding device claimed in claim 7. Therefore, method claim 16, which corresponds to device claim 7, is rejected for the same reasons of obviousness as used above.

Regarding claim 17. (Previously Presented) Method claim 17 is drawn to the method of using the corresponding device claimed in claim 8. Therefore, method claim 17, which corresponds to device claim 8, is rejected for the same reasons of obviousness as used above.

Regarding claim 19. (Currently Amended) Method claim 19 is drawn to the method of using the corresponding device claimed in claim 9. Therefore, method claim 19, which corresponds to device claim 9, is rejected for the same reasons of obviousness as used above.

Regarding claim 20.
(Currently Amended) System claim 20 is drawn to the system corresponding to the method of using the device claimed in claims 1 and 7. Claim 20 further recites a system comprising: a base infrastructure inspection unit having a first instance of a universal connector; a delivery unit having a second instance of the universal connector and being configured for attachment with the base infrastructure inspection unit via the universal connector; a plurality of modular sensor units attached to the base infrastructure inspection unit, each of the plurality of modular sensor units being removable from the base infrastructure inspection unit to reconfigure the system; and a server.

Stewart further discloses a server (0063, 0069 and 0137; Figures 2-3 and 38; “[0069] The raw field data file designations are compared to the entries in the GIS Datastore 330 and either set aside for manual examination if no match is found 335 or prepared for processing using the stored nominal asset parameters 340. ... A relational database file store 355 (e.g., Normalized Database 210 in FIG. 2) is updated in third normal form with the project data. ...”).
Starr, however, in the same field of endeavor, shows a system comprising: a base infrastructure inspection unit having a first instance of a universal connector (Column 4 lines 37-57; Figure 1; ‘sensor component 12’); a delivery unit having a second instance of the universal connector and being configured for attachment with the base infrastructure inspection unit via the universal connector (Column 4 lines 37-57; Figure 1; ‘the chassis portion 14’, and ‘the pipe inspection robot 10 may also include a riser portion 16 which is positioned between the sensor component 12 and the chassis portion 14 and is electrically and mechanically connected to each.’); a plurality of modular sensor units attached to the base infrastructure inspection unit, each of the plurality of modular sensor units being removable from the base infrastructure inspection unit to reconfigure the system (Column 4 lines 37-57; Fig. 1). Therefore, system claim 20 corresponds to method claims 1 and 7, and is rejected for the same reasons of obviousness as used above.

Regarding claim 21. (New) System claim 21 is drawn to the system corresponding to the device claimed in claim 1. Therefore, system claim 21 corresponds to device claim 1 and is rejected for the same reasons of obviousness as used above.

Response to Arguments

Applicant’s arguments with respect to claims 1-17 and 19-21 have been considered but are moot based on the new ground of rejection.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ASMAMAW TARKO, whose telephone number is (571) 272-9205. The examiner can normally be reached Monday-Friday, 9:00 AM-5:00 PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chris Kelley, can be reached at (571) 272-7331. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ASMAMAW G TARKO/
Patent Examiner, Art Unit 2482

Prosecution Timeline

Oct 09, 2023
Application Filed
Dec 07, 2024
Non-Final Rejection — §103
Apr 14, 2025
Response Filed
Jul 15, 2025
Final Rejection — §103
Jan 19, 2026
Request for Continued Examination
Jan 27, 2026
Response after Non-Final Action
Jan 31, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12529288: SYSTEMS AND METHODS FOR ESTIMATING RIG STATE USING COMPUTER VISION (2y 5m to grant; granted Jan 20, 2026)
Patent 12511768: METHOD AND APPARATUS FOR DEPTH IMAGE ENHANCEMENT (2y 5m to grant; granted Dec 30, 2025)
Patent 12506865: SYSTEMS AND METHODS FOR REDUCING A RECONSTRUCTION ERROR IN VIDEO CODING BASED ON A CROSS-COMPONENT CORRELATION (2y 5m to grant; granted Dec 23, 2025)
Patent 12498482: CAMERA APPARATUS (2y 5m to grant; granted Dec 16, 2025)
Patent 12469164: VEHICLE EXTERNAL DETECTION DEVICE (2y 5m to grant; granted Nov 11, 2025)
Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 72% (81% with interview, +9.3%)
Median Time to Grant: 3y 0m
PTA Risk: High
Based on 395 resolved cases by this examiner. Grant probability derived from career allow rate.
