Prosecution Insights
Last updated: April 19, 2026
Application No. 18/463,255

ANOMALY DETECTION FOR GUIDED PROCEDURES IN AN AUGMENTED REALITY ENVIRONMENT

Non-Final OA: §101, §102, §112
Filed
Sep 07, 2023
Examiner
COBB, MICHAEL J
Art Unit
2615
Tech Center
2600 — Communications
Assignee
Squint Inc.
OA Round
1 (Non-Final)
Grant Probability: 76% (Favorable)
OA Rounds: 1-2
To Grant: 2y 7m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 76% — above average (329 granted / 432 resolved; +14.2% vs TC avg)
Interview Lift: +37.9% (strong; based on resolved cases with interview)
Typical Timeline: 2y 7m avg prosecution; 19 applications currently pending
Career History: 451 total applications across all art units

Statute-Specific Performance

§101: 10.0% (-30.0% vs TC avg)
§103: 42.0% (+2.0% vs TC avg)
§102: 4.4% (-35.6% vs TC avg)
§112: 34.7% (-5.3% vs TC avg)
Deltas are measured against a Tech Center average estimate • Based on career data from 432 resolved cases
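The per-statute deltas are plain differences against the Tech Center average. A minimal sketch, assuming (as all four published deltas imply) an estimated TC average of 40.0% for each statute:

```python
# Reproduce the "vs TC avg" deltas from the table above.
# Assumption: the Tech Center average is 40.0% for each statute, which is
# what the published deltas imply (e.g., 10.0% + 30.0% = 40.0% for §101).
examiner_rates = {"101": 10.0, "103": 42.0, "102": 4.4, "112": 34.7}
tc_average = 40.0  # estimated TC average implied by the deltas

deltas = {statute: round(rate - tc_average, 1)
          for statute, rate in examiner_rates.items()}
print(deltas)  # {'101': -30.0, '103': 2.0, '102': -35.6, '112': -5.3}
```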

Office Action

Rejections: §101, §102, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of the Claims

Claims 1-20 are currently pending in the present application, with claims 1, 15, and 18 being independent.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Given the broadest reasonable interpretation of claims 1, 15, and 18 in light of the disclosure and/or the plain and ordinary meaning of the words themselves, the scope of the claimed limitations is unclear. For instance, it is not immediately clear what constitutes the sequence of events associated with carrying out the procedure. Is the sequence of events related to the user performing the steps in the procedure, or to the steps of the procedure themselves? How are the anomalies in the sequence of events detected? Are they detected according to some expected sequence of events, or relative to the steps in the procedure? As currently claimed, it is unclear what constitutes the anomaly and how that anomaly is detected.
The examiner respectfully requests the applicant clarify the scope of the claimed invention. Claims depending thereon do not cure all of the noted deficiencies and are therefore also rejected using substantially similar rationale as to that for the claims from which they depend.

Given the broadest reasonable interpretation of claims 2, 16, and 19 in light of the disclosure and/or the plain and ordinary meaning of the words themselves, the scope of the claimed limitations is unclear. For instance, it is not immediately clear what constitutes event data or aggregated event data. Is event data related to the sequence of events associated with carrying out the procedure? If so, how is that data being aggregated? Is it over multiple procedures? The examiner respectfully requests the applicant clarify the scope of the claimed limitation. Claims depending thereon do not cure the noted deficiency and are accordingly also rejected using substantially similar rationale as to that set forth for the claims from which they depend.

Given the broadest reasonable interpretation of claim 4 in light of the disclosure and/or the plain and ordinary meaning of the words themselves, the scope of the claimed limitation is unclear. For instance, it is not immediately clear if the step is related to the steps of the procedure or to a different set. In addition, is the augmented reality view different from or the same as the augmented reality view previously set forth? Also, wouldn’t the augmented reality view be of an environment? The examiner respectfully requests the applicant clarify the scope of the claimed limitation.

Given the broadest reasonable interpretation of claim 6 in light of the disclosure and/or the plain and ordinary meaning of the words themselves, the scope of the claimed limitation is unclear. For instance, it is not immediately clear what is meant by the performance of a step and whether the step is related to the steps of the procedure.
The examiner respectfully requests the applicant clarify the scope of the claimed limitation.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claims recite a method for detecting anomalies associated with performance of procedures guided by an augmented reality system. As drafted, the claimed limitations are mental processes that can be performed by a human mind. For instance, the limitation “obtaining a procedure for performing in association with an augmented reality device, the procedure comprising steps to be performed, wherein at least a subset of the steps are associated with one or more virtual objects presented in an augmented reality view of the augmented reality device that provide guidance associated with the steps” is a process that, under its broadest reasonable interpretation, is simply choosing a procedure to be performed. The subsequent steps merely track the user’s interactions to see if they are following the procedure, and output something if the user deviates from the procedure list. Displaying the steps/procedure on an augmented reality device is merely extra-solution activity. The process is a mental process. Thus, the claim, as drafted, falls at least within the “Mental Processes” grouping of abstract ideas. It is noted that a similar argument could be made for the claim to additionally fall within one or more groupings, such as “Certain Methods of Organizing Human Activity”.
This judicial exception is not integrated into a practical application because the method is recited at a high level of generality, such that it amounts to no more than mere instructions to apply the exception using a generic computer component. The additional elements (e.g., an augmented reality device) do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Accordingly, claims 1, 15, and 18 are directed to an abstract idea.

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using an augmented reality device to perform the claimed limitations amounts to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. Thus, claims 1, 15, and 18 are not patent eligible.

Similar mapping and rationale can be performed for each of the dependent claims. For instance, with respect to claims 12-14, a notification is generated and/or sent. The dependent claims do not cure the deficiency noted with respect to the claims from which they depend. Accordingly, claims 1-20 as currently drafted are not patent eligible.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 7-10, 12-15, and 18 are rejected under 35 U.S.C. 102(a)(1)/(a)(2) as being anticipated by Neeter (US PG Publication 2021/0019215).

Regarding claim 1, Neeter teaches a method for detecting anomalies associated with performance of procedures guided by an augmented reality system (see for instance, paragraph 64 and figs. 1, 3A, and 3B), the method comprising:

obtaining a procedure for performing in association with an augmented reality device, the procedure comprising steps to be performed, wherein at least a subset of the steps are associated with one or more virtual objects presented in an augmented reality view of the augmented reality device that provide guidance associated with the steps (“At step 310 a user interface can be rendered on one or more displays of the augmented reality device...to guide the user on the procedure or routine to be performed in the real-world physical scene...the set of tasks can include one or more activities and actions to be performed by the user, where the action to be performed by the user can include...one or more interactions between the user and one or more virtual and/or actual objects,” see for instance, paragraph 66);

tracking, by the augmented reality device, a sequence of events associated with carrying out the procedure, the sequence of events including tracked user interactions with the augmented reality device and tracked state data derived from image data and motion data captured by the augmented reality device (“At step 312, the system received data from the augmented reality environment indicating that the user is undertaking a first activity in the set of tasks by performing one or more actions,” see for instance, paragraph 66. “If the error detection engine determines that the user of the augmented reality device deviated or is deviating from actions to be performed for the first activity, as defined by the set of tasks,” see for instance, paragraph 66. “Object recognition can be used to identify virtual objects and/or actual objects in the augmented reality environment based on images of the virtual objects and/or actual objects and attributes of the identified virtual and/or actual objects can be extracted from the identified virtual and/or actual objects,” see for instance, paragraph 31. “For example, a virtual object or an actual object in the augmented reality environment can be a thermometer or gauge, and a temperature of the environment can be determined by identifying the object as thermometer using one or more of the sensors (e.g., via machine vision) to extract the temperature from the image(s) of the thermometer,” see for instance, paragraph 31. “The event recognition engine can be executed to interface with the environment engine and the task management engine to process the feedback from one or more sensors,” see for instance, paragraph 52.);

detecting one or more anomalies in the sequence of events associated with carrying out the procedure (“At step 314, the event recognition engine identifies the actions and the error detection engine uses the trained machine learning models to determine whether the one or more actions are correct and are occurring at the correct time in the routine based on the order sequence defined by the set of tasks. As an example, the first activity in the routine can be to locate a control panel,” see for instance, paragraph 66); and

generating one or more outputs indicative of the one or more anomalies (“The user can be alerted of the error,” see for instance, paragraph 66 and fig. 3).
Regarding claim 6, Neeter teaches the method of claim 1 and further teaches wherein the tracked user interactions comprises selection of a control element of the augmented reality device presented in the augmented reality view of an environment in association with performance of a step (see for instance, paragraph 70 and fig. 3).

Regarding claim 7, Neeter teaches the method of claim 1 and further teaches wherein the tracked state data includes at least one of: a position, a velocity, an acceleration, an orientation, an angular velocity, and an angular acceleration associated with the augmented reality device, a set of feature points detected from image analysis of images captured by the augmented reality device, and a detected environment map detected from the images captured by the augmented reality device (see for instance, paragraphs 66-71 and fig. 3).

Regarding claim 8, Neeter teaches the method of claim 1 and further teaches wherein tracking the sequence of events comprises at least one of: capturing one or more user events triggered in response to a user input; and capturing one or more polling events triggered based on a polling mechanism (see for instance, paragraphs 66-71 and fig. 3).

Regarding claim 9, Neeter teaches the method of claim 1 and further teaches wherein detecting the one or more anomalies includes: applying a set of anomaly detection rules associated with the procedure to the sequence of events (see for instance, paragraphs 66 and 70).

Regarding claim 10, Neeter teaches the method of claim 1 and further teaches wherein detecting the one or more anomalies includes: applying a classification model associated with the procedure to the sequence of events to detect when the sequence deviates with statistical significance from historical event sequences associated with the procedure (see for instance, paragraphs 63, 66, and 69-71).
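Claim 9 recites applying a set of anomaly detection rules to the tracked event sequence. As a purely illustrative sketch (not the application's or Neeter's implementation; all names and rules here are hypothetical), a rule-based detector might compare tracked events against the expected step order:

```python
from typing import List

def detect_anomalies(expected_steps: List[str], tracked_events: List[str]) -> List[str]:
    """Flag out-of-order steps, unrecognized events, and skipped steps.

    Purely illustrative; the rule set and names are hypothetical.
    """
    anomalies = []
    next_expected = 0  # index of the next procedure step we expect to observe
    for event in tracked_events:
        if next_expected < len(expected_steps) and event == expected_steps[next_expected]:
            next_expected += 1  # event matches the expected step; advance
        elif event in expected_steps:
            anomalies.append(f"out-of-order step: {event}")
        else:
            anomalies.append(f"unrecognized event: {event}")
    # Expected steps that never appeared in the tracked sequence were skipped.
    anomalies.extend(f"skipped step: {s}" for s in expected_steps
                     if s not in tracked_events)
    return anomalies

procedure = ["locate control panel", "open panel", "read gauge"]
print(detect_anomalies(procedure, ["locate control panel", "read gauge"]))
# ['out-of-order step: read gauge', 'skipped step: open panel']
```

A classification model, as in claim 10, would replace these hand-written rules with a statistical comparison against historical event sequences.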
Regarding claim 12, Neeter teaches the method of claim 1 and further teaches wherein generating the one or more outputs comprises: generating a notification in a user interface of the augmented reality device indicative of the one or more anomalies (see for instance, paragraphs 66-70).

Regarding claim 13, Neeter teaches the method of claim 1 and further teaches wherein generating the one or more outputs comprises: sending a notification indicative of the one or more anomalies to an augmented reality server to an administrative client (see for instance, paragraphs 66-70).

Regarding claim 14, Neeter teaches the method of claim 1 and further teaches wherein generating the one or more outputs comprises: generating a notification to an augmented reality server coupled to the augmented reality device to enable the augmented reality server to control an action of a machine associated with the procedure, the action comprising at least one of: a calibration action, a reset action, a shutdown action, and a safety action (see for instance, paragraphs 66-70).

Regarding claims 15 and 18, claim 15 is the computer-readable storage medium claim and claim 18 is the device claim of the method claim 1, and they are accordingly rejected using substantially similar rationale as to that set forth with respect to claim 1.
In addition, Neeter teaches a non-transitory computer-readable storage medium storing instructions for detecting anomalies associated with performance of procedures guided by an augmented reality system, the instructions when executed by a processor causing the processor to perform steps (see for instance, claim 21 and paragraphs 6 and 79), and an augmented reality device comprising: one or more cameras for capturing image data; one or more motion sensors for capturing motion data; one or more processors; and a non-transitory computer-readable storage medium storing instructions for detecting anomalies associated with performance of procedures guided by an augmented reality system, the instructions when executed by the one or more processors causing the one or more processors to perform steps (see for instance, paragraphs 49-52 and 79-85).

Allowable Subject Matter

Claims 2-6, 11, 16, 17, 19, and 20 would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHAEL J COBB, whose telephone number is (571) 270-3875. The examiner can normally be reached Monday - Friday, 11am - 7pm ET.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington, can be reached at 571-272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MICHAEL J COBB/
Primary Examiner, Art Unit 2615

Prosecution Timeline

Sep 07, 2023
Application Filed
Jan 24, 2026
Non-Final Rejection — §101, §102, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597182
DATA INTERPOLATION PLATFORM FOR GENERATING PREDICTIVE AND INTERPOLATED PRICING DATA
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12586321
AUTOMATED MEASUREMENT OF INTERIOR SPACES THROUGH GUIDED MODELING OF DIMENSIONS
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12579736
METHOD AND DEVICE FOR GENERATING THREE-DIMENSIONAL IMAGE BY USING PLURALITY OF CAMERAS
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12561105
ONLINE ELECTRONIC WHITEBOARD CONTENT SYNCHRONIZATION AND SHARING SYSTEM
Granted Feb 24, 2026 (2y 5m to grant)

Patent 12561859
Method and System for Visualizing a Graph
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 76%
With Interview: 99% (+37.9%)
Median Time to Grant: 2y 7m
PTA Risk: Low
Based on 432 resolved cases by this examiner. Grant probability derived from career allow rate.
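A minimal sketch of where these headline numbers come from. The implied without-interview rate assumes the +37.9% interview lift is additive percentage points on top of a without-interview baseline, which is an assumption, not something the page states:

```python
# Career allow rate from the examiner's resolved cases.
granted, resolved = 329, 432
allow_rate = granted / resolved * 100
print(round(allow_rate))  # 76 -> displayed as the baseline Grant Probability

# Implied without-interview rate, under the additive-lift assumption above.
with_interview = 99.0  # displayed with-interview probability
lift = 37.9            # displayed interview lift
implied_without = with_interview - lift
print(round(implied_without, 1))  # 61.1
```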
