DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of the Claims
Claims 1-20 are currently pending in the present application, with claims 1, 15, and 18 being independent.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Given the broadest reasonable interpretation of claims 1, 15, and 18 in light of the disclosure and/or the plain and ordinary meaning of the words themselves, the scope of the claimed limitations is unclear. For instance, it is not immediately clear as to:
What constitutes the sequence of events associated with carrying out the procedure? Is the sequence of events related to the user performing the steps in the procedure, or to the steps of the procedure themselves?
How are the anomalies in the sequence of events detected? Are they detected relative to some expected sequence of events, or relative to the steps in the procedure? As currently claimed, it is unclear what constitutes an anomaly and how that anomaly is detected.
The examiner respectfully requests the applicant clarify the scope of the claimed invention. Claims depending thereon do not cure all of the noted deficiencies and are therefore also rejected using substantially similar rationale as to that for the claims from which they depend.
Given the broadest reasonable interpretation of claims 2, 16, and 19 in light of the disclosure and/or the plain and ordinary meaning of the words themselves, the scope of the claimed limitations is unclear. For instance, it is not immediately clear as to what constitutes event data or aggregated event data. Is event data related to the sequence of events associated with carrying out the procedure? If so, how is that data being aggregated? Is it over multiple procedures? The examiner respectfully requests the applicant clarify the scope of the claimed limitation.
Claims depending thereon do not cure the noted deficiency and are accordingly also rejected using substantially similar rationale as to that set forth for the claims from which they depend.
Given the broadest reasonable interpretation of claim 4 in light of the disclosure and/or the plain and ordinary meaning of the words themselves, the scope of the claimed limitation is unclear. For instance, it is not immediately clear whether the step is related to the steps of the procedure or to a different set of steps. In addition, is the augmented reality view the same as or different from the augmented reality view previously set forth? Also, would the augmented reality view not be of an environment? The examiner respectfully requests the applicant clarify the scope of the claimed limitation.
Given the broadest reasonable interpretation of claim 6 in light of the disclosure and/or the plain and ordinary meaning of the words themselves, the scope of the claimed limitation is unclear. For instance, it is not immediately clear what is meant by the performance of a step and whether the step is related to the steps of the procedure. The examiner respectfully requests the applicant clarify the scope of the claimed limitation.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claims recite obtaining a procedure for performing in association with an augmented reality device; tracking a sequence of events associated with carrying out the procedure; detecting one or more anomalies in the sequence of events; and generating one or more outputs indicative of the one or more anomalies.
As drafted, the claimed limitations are mental processes that can be performed in the human mind. For instance, the limitation “obtaining a procedure for performing in association with an augmented reality device, the procedure comprising steps to be performed, wherein at least a subset of the steps are associated with one or more virtual objects presented in an augmented reality view of the augmented reality device that provide guidance associated with the steps” is a process that, under its broadest reasonable interpretation, is simply choosing a procedure to be performed. The subsequent steps merely track the user’s interactions to determine whether the user is following the procedure, and output something if the user deviates from the procedure. Displaying the steps/procedure on an augmented reality device is merely extra-solution activity. The process is a mental process. Thus, the claim, as drafted, falls at least within the “Mental Processes” grouping of abstract ideas. It is noted that a similar argument could be made for the claims to additionally fall within one or more other groupings, such as “Certain Methods of Organizing Human Activity”.
This judicial exception is not integrated into a practical application because the method is recited at a high level of generality, such that it amounts to no more than mere instructions to apply the exception using a generic computer component. The additional elements (e.g., an augmented reality device) do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Accordingly, claims 1, 15, and 18 are directed to an abstract idea.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using an augmented reality device to perform the claimed limitations amounts to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. Thus, claims 1, 15, and 18 are not patent eligible.
Similar mapping and rationale can be performed for each of the dependent claims. For instance, with respect to claims 12-14, a notification is generated and/or sent. The dependent claims do not cure the deficiency noted with respect to claims from which they depend. Accordingly, claims 1-20 as currently drafted are not patent eligible.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 7-10, 12-15, and 18 are rejected under 35 U.S.C. 102(a)(1)/(a)(2) as being anticipated by Neeter (US PG Publication 2021/0019215).
Regarding claim 1, Neeter teaches a method for detecting anomalies associated with performance of procedures guided by an augmented reality system (see for instance, paragraph 64 and figs. 1, 3A, and 3B), the method comprising:
obtaining a procedure for performing in association with an augmented reality device, the procedure comprising steps to be performed, wherein at least a subset of the steps are associated with one or more virtual objects presented in an augmented reality view of the augmented reality device that provide guidance associated with the steps (At step 310 a user interface can be rendered on one or more displays of the augmented reality device...to guide the user on the procedure or routine to be performed in the real-world physical scene...the set of tasks can include one or more activities and actions to be performed by the user, where the action to be performed by the user can include...one or more interactions between the user and one or more virtual and/or actual objects, see for instance, paragraph 66);
tracking, by the augmented reality device, a sequence of events associated with carrying out the procedure, the sequence of events including tracked user interactions with the augmented reality device and tracked state data derived from image data and motion data captured by the augmented reality device (At step 312, the system received data from the augmented reality environment indicating that the user is undertaking a first activity in the set of tasks by performing one or more actions, see for instance, paragraph 66. If the error detection engine determines that the user of the augmented reality device deviated or is deviating from actions to be performed for the first activity, as defined by the set of tasks, see for instance, paragraph 66. Object recognition can be used to identify virtual objects and/or actual objects in the augmented reality environment based on images of the virtual objects and/or actual objects and attributes of the identified virtual and/or actual objects can be extracted from the identified virtual and/or actual objects, see for instance, paragraph 31. For example, a virtual object or an actual object in the augmented reality environment can be a thermometer or gauge, and a temperature of the environment can be determined by identifying the object as thermometer using one or more of the sensors (e.g., via machine vision) to extract the temperature from the image(s) of the thermometer, see for instance, paragraph 31. The event recognition engine can be executed to interface with the environment engine and the task management engine to process the feedback from one or more sensors, see for instance, paragraph 52.);
detecting one or more anomalies in the sequence of events associated with carrying out the procedure (“At step 314, the event recognition engine identifies the actions and the error detection engine uses the trained machine learning models to determine whether the one or more action are correct and are occurring at the correct time in the routine based on the order sequence defined by the set of tasks. As an example, the first activity in the routine can be to locate a control panel”, see for instance, paragraph 66); and
generating one or more outputs indicative of the one or more anomalies (The user can be alerted of the error, see for instance, paragraph 66 and fig. 3).
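For context on the mapped claim, the four recited steps can be sketched as a simple in-order walk of the procedure's steps. This is a purely illustrative sketch by the editor; the names (Event, Procedure, detect_anomalies) and the skip-detection logic are invented for illustration and appear in neither the application nor Neeter.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """One tracked event: a user interaction or a state-data sample."""
    kind: str          # e.g. "user_interaction" or "state"
    step_index: int    # which procedure step the event occurred during
    payload: dict = field(default_factory=dict)

@dataclass
class Procedure:
    """A guided procedure: an ordered list of step names."""
    steps: list

def detect_anomalies(procedure: Procedure, events: list) -> list:
    """Flag events that fall outside an in-order walk of the steps."""
    anomalies = []
    expected = 0
    for event in events:
        if event.step_index == expected:
            continue
        if event.step_index == expected + 1 and event.step_index < len(procedure.steps):
            expected += 1            # user advanced to the next step
        else:
            anomalies.append(event)  # out-of-order or out-of-range step
    return anomalies
```

Under this sketch, an event sequence that progresses through the steps in order yields no anomalies, while a skipped step is flagged.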
Regarding claim 6, Neeter teaches the method of claim 1 and further teaches wherein the tracked user interactions comprises selection of a control element of the augmented reality device presented in the augmented reality view of an environment in association with performance of a step (see for instance, paragraph 70 and fig. 3).
Regarding claim 7, Neeter teaches the method of claim 1 and further teaches wherein the tracked state data includes at least one of: a position, a velocity, an acceleration, an orientation, an angular velocity, and an angular acceleration associated with the augmented reality device, a set of feature points detected from image analysis of images captured by the augmented reality device, and a detected environment map detected from the images captured by the augmented reality device (see for instance, paragraphs 66-71 and fig. 3).
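The state data enumerated in claim 7 can be pictured as a per-frame record. The field names below are the editor's invention for illustration only, not a structure from the application or Neeter.

```python
from dataclasses import dataclass

@dataclass
class TrackedState:
    """Hypothetical per-frame state sample of the kind claim 7 enumerates."""
    position: tuple              # (x, y, z) of the AR device
    velocity: tuple
    acceleration: tuple
    orientation: tuple           # quaternion (w, x, y, z)
    angular_velocity: tuple
    angular_acceleration: tuple
    feature_points: list         # 2-D feature points from image analysis
    environment_map: dict        # environment map detected from the images

# Example sample with placeholder values.
sample = TrackedState(
    position=(0.0, 1.6, 0.0),
    velocity=(0.0, 0.0, 0.0),
    acceleration=(0.0, 0.0, 0.0),
    orientation=(1.0, 0.0, 0.0, 0.0),
    angular_velocity=(0.0, 0.0, 0.0),
    angular_acceleration=(0.0, 0.0, 0.0),
    feature_points=[(120, 340), (256, 198)],
    environment_map={},
)
```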
Regarding claim 8, Neeter teaches the method of claim 1 and further teaches wherein tracking the sequence of events comprises at least one of: capturing one or more user events triggered in response to a user input; and capturing one or more polling events triggered based on a polling mechanism (see for instance, paragraphs 66-71 and fig. 3).
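The two capture mechanisms claim 8 recites (user events triggered by user input, and polling events triggered by a polling mechanism) can be sketched as follows; the class and method names are the editor's invention, not from the application or Neeter.

```python
class EventTracker:
    """Sketch of the two capture mechanisms recited in claim 8."""

    def __init__(self):
        self.events = []

    def on_user_input(self, action: str):
        # User event: captured in response to a user input (push model).
        self.events.append({"type": "user", "action": action})

    def poll(self, read_state):
        # Polling event: captured on a timer via a polling mechanism (pull model).
        self.events.append({"type": "poll", "state": read_state()})
```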
Regarding claim 9, Neeter teaches the method of claim 1 and further teaches wherein detecting the one or more anomalies includes: applying a set of anomaly detection rules associated with the procedure to the sequence of events (see for instance, paragraphs 66 and 70).
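Claim 9's rule-based detection can be pictured as applying a set of predicates to the event sequence. The rules shown (a skipped step, an overlong step) are hypothetical examples by the editor, not rules disclosed in the application or Neeter.

```python
def apply_rules(rules, events):
    """Apply a set of anomaly-detection rules (predicates) to an event
    sequence; any event matched by any rule is flagged as an anomaly."""
    return [e for e in events if any(rule(e) for rule in rules)]

# Hypothetical rules: a step was skipped, or a step took too long.
rules = [
    lambda e: e.get("skipped_step", False),
    lambda e: e.get("duration_s", 0) > 300,
]
events = [
    {"step": 1, "duration_s": 45},
    {"step": 3, "skipped_step": True, "duration_s": 20},
]
anomalies = apply_rules(rules, events)
```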
Regarding claim 10, Neeter teaches the method of claim 1, and further teaches wherein detecting the one or more anomalies includes: applying a classification model associated with the procedure to the sequence of events to detect when the sequence deviates with statistical significance from historical event sequences associated with procedure (see for instance, paragraphs 63, 66, and 69-71).
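Claim 10's "deviates with statistical significance from historical event sequences" can be illustrated with a simple z-score test on run duration. This stand-in is the editor's; the application's classification model and the threshold are not disclosed here.

```python
from statistics import mean, stdev

def deviates(observed_total, historical_totals, z_threshold=3.0):
    """Flag the observed run when its total duration lies more than
    z_threshold standard deviations from the historical mean -- a simple
    stand-in for the claimed classification model."""
    mu = mean(historical_totals)
    sigma = stdev(historical_totals)
    return abs(observed_total - mu) / sigma > z_threshold

# Total seconds taken by eight hypothetical past runs of the procedure.
history = [100, 104, 98, 102, 101, 99, 103, 100]
```

A run totaling 300 seconds would be flagged against this history, while one totaling 101 seconds would not.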
Regarding claim 12, Neeter teaches the method of claim 1 and further teaches wherein generating the one or more outputs comprises: generating a notification in a user interface of the augmented reality device indicative of the one or more anomalies (see for instance, paragraphs 66-70).
Regarding claim 13, Neeter teaches the method of claim 1 and further teaches wherein generating the one or more outputs comprises: sending a notification indicative of the one or more anomalies to an augmented reality server or an administrative client (see for instance, paragraphs 66-70).
Regarding claim 14, Neeter teaches the method of claim 1 and further teaches wherein generating the one or more outputs comprises: generating a notification to an augmented reality server coupled to the augmented reality device to enable the augmented reality server to control an action of a machine associated with the procedure, the action comprising at least one of: a calibration action, a reset action, a shutdown action, and a safety action (see for instance, paragraphs 66-70).
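The server-side dispatch described for claim 14 can be sketched as mapping an anomaly notification to one of the enumerated machine actions. The severity thresholds below are the editor's invention; the claims do not specify how an action is selected.

```python
ACTIONS = ("calibration", "reset", "shutdown", "safety")

def choose_action(notification: dict) -> str:
    """Map an anomaly notification to one of the claimed machine actions.
    The severity cutoffs are hypothetical, chosen only for illustration."""
    severity = notification.get("severity", 0)
    if severity >= 9:
        return "shutdown"    # most severe: stop the machine
    if severity >= 7:
        return "safety"      # engage a safety action
    if severity >= 4:
        return "reset"       # recoverable fault: reset
    return "calibration"     # minor drift: recalibrate
```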
Regarding claims 15 and 18, claim 15 is the computer-readable storage medium claim and claim 18 is the device claim corresponding to method claim 1, and they are accordingly rejected using substantially similar rationale as to that set forth with respect to claim 1. In addition, Neeter teaches a non-transitory computer-readable storage medium storing instructions for detecting anomalies associated with performance of procedures guided by an augmented reality system, the instructions when executed by a processor causing the processor to perform steps (see for instance, claim 21 and paragraphs 6 and 79) and an augmented reality device comprising: one or more cameras for capturing image data; one or more motion sensors for capturing motion data; one or more processors; and a non-transitory computer-readable storage medium storing instructions for detecting anomalies associated with performance of procedures guided by an augmented reality system, the instructions when executed by the one or more processors causing the one or more processors to perform steps (see for instance, paragraphs 49-52 and 79-85).
Allowable Subject Matter
Claims 2-6, 11, 16, 17, 19, and 20 would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHAEL J COBB whose telephone number is (571)270-3875. The examiner can normally be reached Monday - Friday, 11am - 7pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington can be reached at 571-272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MICHAEL J COBB/Primary Examiner, Art Unit 2615