Prosecution Insights
Last updated: April 19, 2026
Application No. 18/828,429

DEVICE, SYSTEM, AND METHOD FOR SENSOR PROVISIONING

Non-Final OA — §103, §DP
Filed: Sep 09, 2024
Examiner: TRAN, JIMMY H
Art Unit: 2451
Tech Center: 2400 — Computer Networks
Assignee: Ndustrial Io Inc.
OA Round: 1 (Non-Final)
Grant Probability: 79% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 10m
With Interview: 96%

Examiner Intelligence

Career Allow Rate: 79% — above average (547 granted / 689 resolved; +21.4% vs TC avg)
Interview Lift: +17.0% (strong; among resolved cases with interview)
Typical Timeline: 2y 10m avg prosecution; 27 currently pending
Career History: 716 total applications across all art units

Statute-Specific Performance

§101: 15.7% (-24.3% vs TC avg)
§103: 48.8% (+8.8% vs TC avg)
§102: 11.4% (-28.6% vs TC avg)
§112: 13.0% (-27.0% vs TC avg)

Tech Center averages are estimates. Based on career data from 689 resolved cases.

Office Action

§103, §DP
DETAILED ACTION

This action is in response to the communication filed on 9/9/2024. Claims 1-20 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 3/3/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement.
See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

At least claims 1, 19, and 20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 15, and 16 of U.S. Patent No. 11,109,204; claims 1, 19, and 20 of U.S. Patent No. 11,546,744; and claim 1 of U.S. Patent No. 12,096,323.
Although the claims at issue are not identical, they are not patentably distinct from each other because:

11,109,204 – A person of ordinary skill in the art would find the current claims an obvious simplification of the parent’s broader GUI-driven process, as both achieve automated sensor metadata generation and labeling for deployment.

11,546,744 – A person of ordinary skill in the art would find the current claims an obvious narrower variation of the parent’s broader system for GUI-driven sensor provisioning instructions from georeferenced maps, zones, and sensor representations for determining serial numbers, template locations, and labels.

12,096,323 – A person of ordinary skill in the art would find the current claims nearly identical, differing mainly in label details and omission of printer transmission. These are minor, obvious modifications.

Claim Objections

Claim 1 is objected to because of the following informalities: on line 3 of the claim, “program instructions, when executed are configured for” should be --program instructions that, when executed --. Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Lawson et al. (US 2017/0103137) in view of Bandyopadhyay et al. (US 2012/0130632) and Ganz (US 2011/0167267).
Regarding claim 1, Lawson discloses a computing system comprising: one or more processors; and one or more memories including program instructions, when executed are configured for:

receiving initial sensor provisioning instructions from a graphical user interface (GUI) (Lawson discloses receiving provisioning instructions (e.g., submitted applications/configs) from a GUI (publishing interface with tagging tools); see [0040] “Publish component 204 can be configured to receive submission of an industrial automation application from a remote client device (e.g., a development workstation 220 having a cloud interface 224) via a cloud platform and publish the industrial automation application to one or more application libraries 228”);

storing the initial sensor provisioning instructions (Lawson discloses storing the provisioning instructions (submitted applications/configs stored in libraries); see [0040] “the one or more application libraries 228 can be stored in memory 218”); and

provisioning a first plurality of sensors using the initial sensor provisioning instructions (Lawson discloses provisioning a plurality of sensors (industrial devices including sensors) using the instructions (delivering applications/configs); see [0041] “Retrieval component 208 can retrieve the subset of industrial applications identified by the search component 206 and deliver the identified applications to the originator of the search request. This can include sending the industrial applications to the requesting client device 222 over the cloud platform, or sending only a set of indicators identifying the subset of industrial applications”).
However, the prior art does not explicitly disclose the following: wherein: provisioning the first plurality of sensors comprises: determining a serial number, and a template location for each sensor of the first plurality of sensors based on a first zone name and a location relative to georeferenced map data of an associated sensor representation within a first plurality of sensor representations.

Bandyopadhyay, in the same field of endeavor, discloses techniques for locating, tracking, and/or monitoring the status of personnel and/or assets ("trackees"), both indoors and outdoors. In particular, Bandyopadhyay teaches tracking multiple personnel/assets equipped with sensors (e.g., INUs and CSMs) in a system that determines and corrects locations using zone-based features and georeferenced building data.

Bandyopadhyay teaches the following limitation: provisioning the first plurality of sensors comprises: determining a serial number (Bandyopadhyay [0112] “Each building may be associated with a name or other identifier (e.g., comprising any number of text or numeric characters) in a building database for purposes of queries and/or display”), and a template location for each sensor of the first plurality of sensors based on a first zone name (Bandyopadhyay [0023-0024] “the trajectory of personnel and/or assets in a building may be limited and characterized by features in the building such as hallways, stairwells, elevators, etc. While tracking personnel and/or assets in buildings, the tracking data input may be matched to known features in the building to increase the position accuracy and reduce and/or remove the errors inherent in the methods used to obtain the tracking data…map building methods may be implemented to generate features and/or landmarks of a building using sections of tracking data, and then use these features to match and correct other sections of tracking data. Map building may include generation and matching functionalities as well as the functionality of sensor fusion methods”) and a location relative to georeferenced map data of an associated sensor representation within a first plurality of sensor representations (Bandyopadhyay [0108] “Aerial Imagery may also include a Georeference for each pixel of a building outline, yielding the building outline as a series of points, or lines, or a polygon that is georeferenced. The building may then be georeferenced by its outline, or by clicking (or otherwise selecting) a point inside the building outline, such as the center, to be its unique georeference. This is useful for database queries, and for grouping and searching buildings near a global location”).

Therefore, it would have been obvious to a person of ordinary skill in the art at the time the invention was effectively filed to modify the prior art with the teaching of Bandyopadhyay to enhance sensor provisioning efficiency in an industrial tracking system by integrating cloud-based automation and unique label identifiers to improve accuracy and reduce deployment errors as predictable uses of known techniques.

However, the prior art does not explicitly disclose the following: generating a print file including label information to be applied to each sensor of the first plurality of sensors; and the label information comprises a representation of a unique identifier.

Ganz discloses a technique where unique alphanumeric registration codes are printed on tags or labels attached to physical toys at the point of manufacture or packaging, enabling secure online provisioning and registration of each toy into a virtual system upon user entry of the code.
In particular, Ganz teaches the following: generating a print file including label information to be applied to each sensor of the first plurality of sensors (Ganz discloses label information (tag with code) is applied to items (toys analogous to sensors); see [0056] “The toy includes a tag attached to the toy body or the toy packaging (or alternatively, another indicator and/or a storage device) indicating a web site address and a registration code”); and the label information comprises a representation of a unique identifier (Ganz discloses a unique identifier (registration code); see [0047] “an online "virtual world" where the user of a toy can register the toy using a unique registration number provided with the toy at purchase”).

Therefore, it would have been obvious to a person of ordinary skill in the art at the time the invention was effectively filed to modify the prior art with the teaching of Ganz to enhance secure and efficient registration of multiple physical sensors in a network by associating unique codes with items for system integration and tracking.

Regarding claim 2, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 1, wherein the label information further comprises a sensor type (Bandyopadhyay [0217] “Inertial methods are generally considered to be those that use sensors such as accelerometers, gyroscopes, magnetic field sensors, pressure sensors, and other sensors to calculate the trajectory of a target being tracked without the need for external references”. Therefore, it would have been obvious to one of ordinary skill to modify Ganz’s label (with type-indicating code) to include Bandyopadhyay’s sensor type, as both facilitate device provisioning by identifying type via metadata, improving efficiency in a sensor network, combining known elements for predictable results).
Regarding claim 3, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 1, wherein: provisioning the first plurality of sensors further comprises generating metadata for each sensor of the first plurality of sensors (Lawson [0053] “the developer can set values for the metadata tags' data fields using tagging functionality provided by the publishing interface. Some metadata tag values may also be set automatically generated during development of application”); and the metadata comprises a sensor type (Bandyopadhyay [0072] “The INU can include a combination of digital or analog accelerometers, gyroscopes, and magnetic field sensors. In one configuration, for example, the INU may include a MEMS three-axis accelerometer, a one and two axis MEMS gyroscope, and a MEMS 3-axis magnetic field sensor”) and a sensor deployment location (Bandyopadhyay [0235] “A stair flag may indicate whether a tracking device or system is encountering stairs. It may also indicate if the stairs are being traversed in an upward or downward direction”).

Regarding claim 4, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 3, wherein the sensor deployment location includes at least one of a georeferenced identifier, a building identifier, a room identifier, a zone identifier, and a wireless access point identifier (Bandyopadhyay discloses a sensor deployment location including zone identifiers through hallway and stairwell detection, where hallways function as zones; [0192] “connectivity between landmarks of different types may be detected by checking for intersection regions, such as checking if a stairwell polygon intersects with any hallway”).
Regarding claim 5, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 3, wherein the sensor deployment location includes an elevation identifier (Bandyopadhyay [0114] “Accordingly, to make a floor assignment process more feasible, it may be useful to include building elevation data (e.g., number of floors, basement data, elevation of each floor, or other building elevation data) in the building data that is tagged to (or associated with) a building”).

Regarding claim 6, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 5, wherein: the one or more memories including program instructions, when executed are further configured to provide a provisioning template to the GUI and the GUI is configured to receive the georeferenced map data (Bandyopadhyay [0082] “Mapping application 130 may provide a Graphical User Interface (GUI) (or other interface) for, among other things, providing graphical displays of position (or tracking) estimates of personnel and/or assets (including, but not limited to, estimates based on INU, GPS, or fused sensor data) on maps (or other displays) of various kinds including those generated based on collected trajectory data. The GUI may further display identification and status information of personnel and/or assets as determined by sensors connected to the CSM, including the INU”); and the initial sensor provisioning instructions comprise the template location for each sensor of the first plurality of sensors (Bandyopadhyay [0572] “According to an aspect of the invention, rooms can be detected using the Room Event. When the Room Event is triggered by a series of tracking points originating from a matched hallway and returning to the same hallway, the Room Tracking Points may be first rotated clockwise around the first point by the slope of the hallway they are along”).
Regarding claim 7, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 6, wherein the provisioning template is an architectural template representing a facility to be monitored by the first plurality of sensors (Bandyopadhyay [0131] “building data (e.g., building features and their description) registered in the building database may be stored in a universal format such as, for example, XML, or in other formats. The level of detail may be customized depending on the intended use of the building database. For tracking, landmark data may be customized to the characteristics of tracking data available from inertial methods, signal-based methods, or other methods, or a fusion of methods”).

Regarding claim 8, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 7, wherein the facility is at least one of a cold storage warehouse, an office complex, an apartment building, and an outdoor complex (Bandyopadhyay [0149] “open spaces (e.g., gymnasiums, warehouse floors, exhibit halls, lobbies, etc.)”).

Regarding claim 9, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 1, wherein: the GUI is configured to receive the georeferenced map data (Bandyopadhyay [0157] “for each floor plan, marking the geolocation of a number of points (e.g., two points) on the floor plan can provide a georeference to determine the geolocation of each point on the floor plan. This combined with the boundary may describe the extent of the floor plan in global co-ordinates”); and the initial sensor provisioning instructions comprise the georeferenced map data and a map location for each sensor of the first plurality of sensors (Bandyopadhyay [0082] “Mapping application 130 may provide a Graphical User Interface (GUI) (or other interface) for, among other things, providing graphical displays of position (or tracking) estimates of personnel and/or assets (including, but not limited to, estimates based on INU, GPS, or fused sensor data) on maps (or other displays) of various kinds including those generated based on collected trajectory data. The GUI may further display identification and status information of personnel and/or assets as determined by sensors connected to the CSM, including the INU. In this regard, a user of computer 120 (e.g., an incident commander at an emergency scene) can monitor, among other things, the location and status information of personnel and/or assets that have been outfitted with a tracking system”).

Regarding claim 10, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 9, wherein the georeferenced map data comprises elevation data (Bandyopadhyay [0117] “Elevation of each floor. Knowledge of the elevation of each floor may help in the process of assigning floor numbers to 3D position estimates. Aerial Imagery software often includes rulers to determine distances between points. This may be used to find the approximate distance between the ground and approximate start of the first floor, and so on, for all of the floors. The total elevation of the building can also be recorded in the building data”).
Regarding claim 11, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 1, wherein each sensor of the first plurality of sensors comprises a wireless interface (Bandyopadhyay [0069] “the CSM may include a radio transceiver for communicating the data wirelessly to one or more computing devices such as, for example, a computer 120 which may serve as a "base station" or "command center" at the particular location or environment”).

Regarding claim 12, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 11, wherein each sensor of the first plurality of sensors comprises at least one of a temperature sensor, a humidity sensor, and a pressure sensor (Bandyopadhyay [0067] “Each tracking system may comprise, for example, an Inertial Navigation Unit (INU), a Communications Sensor Module (CSM), and/or other sensors or devices that may acquire physiological data (e.g., heart rate, respiration rate, etc.) from a user, environmental information (e.g., temperature)”).

Regarding claim 13, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 11, wherein each sensor of the first plurality of sensors is at least one of a security sensor, a chemical sensor, a biological sensor, an acoustic sensor, and an optical sensor (Bandyopadhyay [0067] “other sensors or devices that may acquire physiological data (e.g., heart rate, respiration rate, etc.) from a user, environmental information (e.g., temperature, atmospheric pressure, background radiation, etc.), or other information”).
Regarding claim 14, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 11, wherein each sensor of the first plurality of sensors is at least one of a flow sensor, a position sensor, a voltage sensor, a current sensor, a radio field (RF) sensor, and a proximity sensor (Bandyopadhyay [0067] “Each tracking system may comprise, for example, an Inertial Navigation Unit (INU), a Communications Sensor Module (CSM), and/or other sensors or devices that may acquire physiological data (e.g., heart rate, respiration rate, etc.) from a user, environmental information (e.g., temperature, atmospheric pressure, background radiation, etc.), or other information”).

Regarding claim 15, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 1, wherein the system is a subsystem implemented within a facility monitoring system (Lawson [0034] “cloud platform 102 can be a private cloud operated internally by an industrial enterprise. An exemplary private cloud can comprise a set of servers hosting the industrial application provisioning system 104 and residing on a corporate network protected by a firewall”), and the facility monitoring system is configured for: receiving data from each sensor of the first plurality of sensors (Lawson [0037] “Client device 116 can also be a cloud-capable industrial device, such as an industrial controller (e.g., programmable logic controllers or other types of programmable automation controllers); a field device such as sensor or a meter”); and influencing one or more environmental control systems (Bandyopadhyay [0067] “Each tracking system may comprise, for example, an Inertial Navigation Unit (INU), a Communications Sensor Module (CSM), and/or other sensors or devices that may acquire physiological data (e.g., heart rate, respiration rate, etc.) from a user, environmental information (e.g., temperature, atmospheric pressure, background radiation, etc.), or other information”).
Regarding claim 16, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 1, wherein: the one or more memories including program instructions, when executed are further configured for receiving sensor status data for each sensor, wherein the sensor status data for each sensor is received from the first plurality of sensors after installation of the first plurality of sensors per the initial sensor provisioning instructions (Bandyopadhyay discloses a system architecture that includes a base station with mapping software (i.e., memories with program instructions) that receives real-time or historic tracking data, including sensor readings and status, from sensors (INU and CSM) after the sensors are deployed on a trackee (i.e., after installation per provisioning instructions, as the sensors are equipped and then send data during operation); see [0067-0071]); and the sensor status data for each sensor includes a health status for each sensor and the health status for each sensor includes at least one of a battery health indicator and a sensor failure indicator (Bandyopadhyay [0069] “The INU, CSM, and/or other components comprising a given tracking system may each be powered (individually or collectively) by one or more batteries (or other power source(s))”).

Regarding claim 17, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 16, wherein the one or more memories including program instructions, when executed are further configured for: generating a sensor status overlay template using the sensor status data for each sensor and the initial sensor provisioning instructions, wherein the sensor status overlay template illustrates per sensor status data positioned about the template location for each sensor location (Bandyopadhyay [0071] “the INU may use inertial sensors and magnetic or electro-magnetic field sensors to generate data that can be used to determine location, motion and orientation of a trackee. This may be accomplished by combining a variety of motion sensing components with a microprocessor or microcontroller which provides both I/O support for the peripheral sensors and computational capabilities for signal processing functions”); and transmitting the sensor status overlay template to the GUI (Bandyopadhyay [0082] “mapping software application 130, may be loaded into memory and run on an operating system of computer 120. Mapping application 130 may comprise software module(s) which may enable the features and functionality and implement the various methods (or algorithms) described in detail herein”).

Regarding claim 18, Lawson-Bandyopadhyay-Ganz discloses the computing system of claim 1, wherein the one or more memories including program instructions, when executed are further configured for: transmitting at least a portion of the initial sensor provisioning instructions to the GUI (Bandyopadhyay [0082] “Mapping application 130 may provide a Graphical User Interface (GUI) (or other interface) for, among other things, providing graphical displays of position (or tracking) estimates of personnel and/or assets (including, but not limited to, estimates based on INU, GPS, or fused sensor data) on maps (or other displays) of various kinds including those generated based on collected trajectory data”); receiving updated sensor provisioning instructions, wherein the updated sensor provisioning instructions include a delta provisioning report (Bandyopadhyay [0293] “the tracking data may comprise data acquired in real-time (e.g., one or more tracking points may be provided with every new update) while personnel and/or assets are outfitted with tracking systems (e.g., 110a, 100b, . . . 100n) (FIG. 1) and are being monitored”); and provisioning a second plurality of sensors using the updated sensor provisioning instructions (Bandyopadhyay [0013] “The INU and CSM may establish a wireless personal area network (WPAN) on each trackee, allowing for the addition of other distributed wireless sensors on the trackee as needed”).

Regarding claims 19 and 20, these claims do not teach or further define over the limitations of claim 1. Therefore, claims 19 and 20 are rejected under the same rationale as set forth for claim 1.

Conclusion

For the reasons above, claims 1-20 have been rejected and remain pending.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JIMMY H TRAN, whose telephone number is (571) 270-5638. The examiner can normally be reached Monday-Friday, 9am-5pm PST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chris Parry, can be reached at 571-272-8328. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JIMMY H TRAN/
Primary Examiner, Art Unit 2451
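To make the disputed claim 1 workflow concrete, the flow the rejection maps across Lawson, Bandyopadhyay, and Ganz (receive provisioning instructions from a GUI, derive a serial number and template location per sensor from a zone name and a georeferenced map position, then generate a print file of labels carrying unique identifiers) can be sketched in Python. This is an illustrative sketch only; every name here (`SensorRep`, `provision`, `make_print_file`) and every formatting choice is hypothetical and not taken from the application or the cited references.

```python
# Hypothetical sketch of the claim 1 provisioning flow; names and
# formats are illustrative, not from the application.
import uuid
from dataclasses import dataclass


@dataclass
class SensorRep:
    """A sensor representation placed on a georeferenced map."""
    lat: float
    lon: float


@dataclass
class ProvisionedSensor:
    serial_number: str
    template_location: str
    label: str  # label information including a unique identifier


def provision(zone_name: str, reps: list[SensorRep]) -> list[ProvisionedSensor]:
    """Determine a serial number and template location for each sensor
    representation, based on the zone name and its map-relative position."""
    out = []
    for i, rep in enumerate(reps, start=1):
        serial = f"{zone_name}-{i:04d}"  # serial derived from zone + index
        loc = f"{zone_name}@({rep.lat:.5f},{rep.lon:.5f})"  # template location
        uid = uuid.uuid4().hex[:8]  # unique identifier for the label
        out.append(ProvisionedSensor(serial, loc, f"{serial} | {uid}"))
    return out


def make_print_file(sensors: list[ProvisionedSensor]) -> str:
    """Generate a print file: one label line per provisioned sensor."""
    return "\n".join(s.label for s in sensors)
```

A terminal-disclaimer or claim-amendment strategy would turn on how this flow differs from the parent claims; the sketch is only meant to make the claimed data dependencies (zone name + map position in, serial/location/label out) easy to see.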

Prosecution Timeline

Sep 09, 2024: Application Filed
Feb 03, 2026: Non-Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12587469: AUTOMATIC APPLICATION-BASED MULTIPATH ROUTING FOR AN SD-WAN SERVICE (2y 5m to grant; granted Mar 24, 2026)
Patent 12568042: Application-Aware BGP Path Selection And Forwarding (2y 5m to grant; granted Mar 03, 2026)
Patent 12549391: SUBSCRIPTION-BASED MODEL WITH PROTECTION AGAINST BILLING AVOIDANCE (2y 5m to grant; granted Feb 10, 2026)
Patent 12542790: ACTION RESPONSE FRAMEWORK FOR DATA SECURITY INCIDENTS (2y 5m to grant; granted Feb 03, 2026)
Patent 12542765: REMOTE SERVER ISOLATION UTILIZING ZERO TRUST ARCHITECTURE (2y 5m to grant; granted Feb 03, 2026)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 79%
With Interview: 96% (+17.0%)
Median Time to Grant: 2y 10m
PTA Risk: Low
Based on 689 resolved cases by this examiner. Grant probability derived from career allow rate.
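The headline figures appear to combine additively: 79% is the career allow rate (547 of 689 resolved cases, rounded), and the with-interview figure adds the observed +17.0% lift on top. A minimal sketch under that assumption (the additive-lift model is an inference about the dashboard's arithmetic, not a documented method):

```python
# Assumed reconstruction of the projection arithmetic; the additive
# combination of allow rate and interview lift is an inference.
granted, resolved = 547, 689
career_allow_rate = granted / resolved  # about 0.794, shown as 79%
interview_lift = 0.17                   # lift among interviewed cases
with_interview = round(career_allow_rate, 2) + interview_lift  # shown as 96%
```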
