Prosecution Insights
Last updated: April 19, 2026
Application No. 18/550,339

DIGITAL COMPANION FOR PERCEPTUALLY ENABLED TASK GUIDANCE

Non-Final OA (§102, §103)
Filed: Sep 13, 2023
Examiner: ROWLAND, STEVE
Art Unit: 3715
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Siemens Aktiengesellschaft
OA Round: 1 (Non-Final)
Grant Probability: 78% (Favorable)
OA Rounds: 1-2
To Grant: 2y 8m
With Interview: 95%

Examiner Intelligence

Career Allow Rate: 78% (823 granted / 1059 resolved; +7.7% vs TC avg; above average)
Interview Lift: +17.6% (strong) in resolved cases with interview
Avg Prosecution: 2y 8m typical timeline (24 currently pending)
Total Applications: 1083 across all art units (career history)
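The headline numbers in this card can be reproduced from the raw counts above. A minimal sketch, assuming the dashboard simply divides grants by resolved cases and applies the interview lift as a flat adjustment (the exact methodology is not stated):

```python
# Raw counts from the examiner's career history card.
granted = 823       # applications granted
resolved = 1059     # total resolved cases (granted + abandoned)

allow_rate = granted / resolved      # career allow rate
interview_lift = 0.176               # lift observed in resolved cases with an interview

# ~77.7% (displayed as 78%); ~95.3% with interview (displayed as 95%)
print(f"{allow_rate:.1%} -> {allow_rate + interview_lift:.1%} with interview")
```

The +7.7% vs TC avg figure likewise implies an estimated Tech Center average allow rate of roughly 70%.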

Statute-Specific Performance

§101: 17.2% (-22.8% vs TC avg)
§103: 32.0% (-8.0% vs TC avg)
§102: 28.7% (-11.3% vs TC avg)
§112: 13.5% (-26.5% vs TC avg)
Deltas are relative to the Tech Center average estimate • Based on career data from 1059 resolved cases
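The per-statute deltas are internally consistent with a single Tech Center baseline. A small check, assuming each delta is the examiner's rate minus the TC average (an interpretation of the chart, not something the page states):

```python
# Examiner statute-specific rates (%) and their "vs TC avg" deltas, from the card above.
examiner_rate = {"101": 17.2, "103": 32.0, "102": 28.7, "112": 13.5}
delta_vs_tc   = {"101": -22.8, "103": -8.0, "102": -11.3, "112": -26.5}

# Implied TC average per statute: rate - delta.
tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1) for s in examiner_rate}
print(tc_avg)  # every statute implies the same 40.0% Tech Center baseline
```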

Office Action

§102, §103
Detailed Action

Priority

Applicant's claim for the benefit of a prior-filed application under 35 U.S.C. 119(e) is acknowledged. Applicant has not complied with one or more conditions for receiving the benefit of an earlier filing date under 35 U.S.C. 119(e) as follows: Provisional application No. 61/168426 (as listed in the Application Data Sheet of 09/13/2023) expired as of 04/11/2010; thus, the corresponding claim to priority is untimely. To correct the priority claim to a different provisional application, Applicant should refer to the procedure set forth in 37 CFR 1.78(c).

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

(a) A person shall be entitled to a patent unless—
(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-9, 11-15 and 17-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Neeter (US 2021/0019215 A1).

Regarding claim 1, Neeter discloses a computer-implemented method for a digital companion (Abstract) comprising a computer processor (Fig. 6) receiving information representative of human knowledge (354: load a set of tasks), converting the received information into a computer-readable form including at least one task-based process to be performed (356: trained models corresponding to tasks set), constructing a digital twin of a scene for performing the task-based process (¶ [0039]: scene scanning and generation engine can facilitate the scanning of a real-world physical scene and the generation of a virtual, augmented, and/or mixed reality scene corresponding to the real-world physical scene), receiving environmental information from a real-world scene for performing the task-based process (374: receive data that the user is undertaking a next activity in the set of tasks by performing one or more actions), evaluating the received environmental information to detect an error in the performance of the task-based process (376), and providing guidance to a user based on the detected error (378 and ¶ [0066]: error detection engine can cause the augmented reality device or another device located at the real-world physical location to alert the user of the error and/or provide the user with feedback to correct the error via the augmented reality device or the other device).

Regarding claim 2, Neeter discloses converting the received information representative of human knowledge into a knowledge graph (¶ [0063]: Bayesian network and/or knowledge base).
Regarding claim 3, Neeter discloses constructing a process model representative of execution of the task-based process (356: trained models corresponding to tasks set), constructing a scene model representative of the real-world scene for performing the task-based process (¶ [0039]: scene scanning and generation engine can facilitate the scanning of a real-world physical scene and the generation of a virtual, augmented, and/or mixed reality scene corresponding to the real-world physical scene), and constructing a user model representative of a worker performing tasks in the task-based process (¶ [0052]: use the data from the sensors to generate an outline or skeleton of the user … outline or skeleton can be used to track a pose of the user, locations of the user, movements of the user and/or interactions between the user and objects).

Regarding claim 4, Neeter discloses wherein the scene model is a digital twin of the real-world scene for performing the task-based process (¶ [0039]: scene scanning and generation engine can facilitate the scanning of a real-world physical scene and the generation of a virtual, augmented, and/or mixed reality scene corresponding to the real-world physical scene).

Regarding claim 5, Neeter discloses updating the digital twin of the real-world scene periodically based on the received environmental information (¶ [0043]: the scanned object can be updated with a CAD model in the virtual reality and/or augmented reality environment to include a virtual object that can be manipulated by the users).

Regarding claim 6, Neeter discloses wherein the environmental information from the real-world scene comprises data generated from one or more sensors located in the real-world scene (¶ [0039]: environment data (e.g., temperature data, precipitation data, humidity data, telemetry data, wind data, speed data, acceleration data, smell data, vibration data, etc.) can be captured at the physical scene during scanning).
Regarding claim 7, Neeter discloses providing the guidance to the user in a head-mounted display using augmented reality (¶ [0066]: error detection engine can cause the augmented reality device or another device located at the real-world physical location to alert the user of the error and/or provide the user with feedback to correct the error via the augmented reality device or the other device).

Regarding claim 8, Neeter discloses providing the guidance to the user by communicating information to the user (¶ [0066]: error detection engine can cause the augmented reality device or another device located at the real-world physical location to alert the user of the error and/or provide the user with feedback to correct the error via the augmented reality device or the other device).

Regarding claim 9, Neeter discloses receiving information regarding the user, and customizing the guidance provided to the user based on the user information (¶ [0066]: error detection engine can cause the augmented reality device or another device located at the real-world physical location to alert the user of the error and/or provide the user with feedback to correct the error via the augmented reality device or the other device).

Regarding claim 11, Neeter discloses wherein the information regarding the user is obtained from a physiological sensor associated with the user (¶ [0090]: monitor and track the movement of user's eyes).

Regarding claim 12, Neeter discloses storing in a knowledge graph, each step in the task-based process, and linking to each step and at least one entity required to execute the step (¶ [0063]: Bayesian network and/or knowledge base).

Regarding claim 13, Neeter discloses for each step, storing information relating to pre-dependencies for performing the task (claim 2: the set of tasks have a hierarchical tree structure that define an ordered sequence of the tasks).
Regarding claim 14, Neeter discloses wherein constructing the digital twin of the scene comprises receiving a captured image from the scene process (¶ [0039]: scene scanning and generation engine can facilitate the scanning of a real-world physical scene and the generation of a virtual, augmented, and/or mixed reality scene corresponding to the real-world physical scene), and classifying entity objects in the captured image process (¶ [0041]: engine can be executed to identify and match a particular real world, physical object captured in a real-world, physical scene against defined virtual objects).

Regarding claim 15, Neeter discloses wherein each classified entity object is associated with a unique identifier identifying the entity object based on a semantic model of the system (¶ [0041]: one or more images of the real-world physical object can be used to retrieve 3D models from a database of computer aided design (CAD) models to aid in the identification of the real-world physical object and a corresponding virtual object).

Regarding claim 17, Neeter discloses analyzing the digital twin to mark each object as to whether it is expected in the scene (¶ [0041]: engine can be executed to identify and match a particular real world, physical object captured in a real-world, physical scene against defined virtual objects).

Regarding claim 18, Neeter discloses a system for providing a digital companion (Abstract) comprising a computer processor in communication with a non-transitory memory (Fig. 6), the non-transitory memory storing instructions that when executed by the computer processor cause the processor to instantiate a knowledge transfer module for receiving information representative of human knowledge and convert the information into a machine-readable form (354: load a set of tasks), create a knowledge base comprising a process model representative of a step-based process performed using the human knowledge (356: trained models corresponding to tasks set), create a perception grounding module that identifies entities in a physical world and builds a digital twin of the physical world (¶ [0039]: scene scanning and generation engine can facilitate the scanning of a real-world physical scene and the generation of a virtual, augmented, and/or mixed reality scene corresponding to the real-world physical scene), create a perception attention module for evaluating the digital twin of the physical world to detect an error in execution of the step-based process (374: receive data that the user is undertaking a next activity in the set of tasks by performing one or more actions), and create a user engagement module for communication of a detected error to a user operating in the physical world (378 and ¶ [0066]: error detection engine can cause the augmented reality device or another device located at the real-world physical location to alert the user of the error and/or provide the user with feedback to correct the error via the augmented reality device or the other device).
Regarding claim 19, Neeter discloses the knowledge base comprising a process model representative of the step-based process (356: trained models corresponding to tasks set), a scene model representative of the physical world (¶ [0039]: scene scanning and generation engine can facilitate the scanning of a real-world physical scene and the generation of a virtual, augmented, and/or mixed reality scene corresponding to the real-world physical scene), and a user model representative of the user (¶ [0052]: use the data from the sensors to generate an outline or skeleton of the user … outline or skeleton can be used to track a pose of the user, locations of the user, movements of the user and/or interactions between the user and objects).

Regarding claim 20, Neeter discloses a communicating device to communicate the detected error to the user (378 and ¶ [0066]: error detection engine can cause the augmented reality device or another device located at the real-world physical location to alert the user of the error and/or provide the user with feedback to correct the error via the augmented reality device or the other device).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. If this application names joint inventors, Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Neeter in view of Jones (US 2013/0194607 A1).

Regarding claim 10, Jones suggests—where Neeter does not disclose—wherein the information regarding the user is obtained from a login of the user to the system (¶ [0030]: feedback can be automatically formulated to include … specific user based on … login information). It would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the invention to combine the disclosures of Neeter and Jones in order to personalize the feedback, thus making the system more user-friendly.

Allowable Subject Matter

Claim 16 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion

The prior art considered pertinent to applicant's disclosure and not relied upon is made of record on the attached PTO-892 form.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEVE ROWLAND whose telephone number is (469) 295-9129. The examiner can normally be reached on M-Th 10-8. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Dmitry Suhol, can be reached at (571) 272-4430. The fax number for the organization where this application or proceeding is assigned is (571) 273-8300.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

Applicant may choose, at his or her discretion, to correspond with the Examiner via Internet e-mail. A paper copy of any and all e-mail correspondence will be placed in the appropriate patent application file. E-mail communication must be authorized in advance. Without a written authorization by applicant in place, the USPTO will not respond via e-mail to any correspondence which contains information subject to the confidentiality requirement as set forth in 35 U.S.C. 122. Authorization may be perfected by submitting, on a separate paper, the following (or similar) disclaimer:

"Recognizing that Internet communications are not secure, I hereby authorize the USPTO to communicate with me concerning any subject matter of this application by electronic mail. I understand that a copy of these communications will be made of record in the application file."

See MPEP 502.03 for more information.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/STEVE ROWLAND/
Primary Examiner, Art Unit 3715

Prosecution Timeline

Sep 13, 2023
Application Filed
Dec 03, 2025
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589308
GENERATIVE NARRATIVE GAME EXPERIENCE WITH PLAYER FEEDBACK
2y 5m to grant Granted Mar 31, 2026
Patent 12586441
SELECTIVE REDEMPTION OF GAMING ESTABLISHMENT TICKET VOUCHERS
2y 5m to grant Granted Mar 24, 2026
Patent 12582874
APPARATUS FOR ARTIFICIAL INTELLIGENCE EXERCISE RECOMMENDATION BY ANALYZING DATA COLLECTED BY POSTURE MEASUREMENT SENSOR AND DRIVING METHOD THEREOF
2y 5m to grant Granted Mar 24, 2026
Patent 12579757
UPDATING A VIRTUAL REALITY ENVIRONMENT
2y 5m to grant Granted Mar 17, 2026
Patent 12569763
VIRTUAL OBJECT CONTROL METHOD AND RELATED APPARATUS
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 78%
With Interview (+17.6%): 95%
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 1059 resolved cases by this examiner. Grant probability derived from career allow rate.
