Prosecution Insights
Last updated: April 19, 2026
Application No. 18/385,851

PRESENTING AN ENVIRONMENT BASED ON USER MOVEMENT

Status: Non-Final OA (§102)
Filed: Oct 31, 2023
Examiner: HARRISON, CHANTE E
Art Unit: 2615
Tech Center: 2600 — Communications
Assignee: Apple Inc.
OA Round: 3 (Non-Final)
Grant Probability: 69% (Favorable)
Expected OA Rounds: 3-4
Median Time to Grant: 3y 4m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 69% — above average (497 granted / 725 resolved; +6.6% vs TC avg)
Interview Lift: +28.8% (strong), across resolved cases with an interview
Typical Timeline: 3y 4m average prosecution; 30 applications currently pending
Career History: 755 total applications across all art units

Statute-Specific Performance

§101: 8.9% (-31.1% vs TC avg)
§103: 40.3% (+0.3% vs TC avg)
§102: 31.8% (-8.2% vs TC avg)
§112: 15.2% (-24.8% vs TC avg)
Tech Center averages are estimates • Based on career data from 725 resolved cases
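The per-statute deltas above can be checked arithmetically: subtracting each reported delta from the examiner's rejection rate recovers the implied Tech Center baseline. A minimal sketch (variable names are illustrative, not from the dashboard):

```python
# Examiner's statute-specific rejection rates and deltas vs the
# Tech Center average, as reported above (all values in percent).
examiner_rate = {"101": 8.9, "103": 40.3, "102": 31.8, "112": 15.2}
delta_vs_tc   = {"101": -31.1, "103": 0.3, "102": -8.2, "112": -24.8}

# Implied TC baseline per statute: examiner rate minus delta.
tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1) for s in examiner_rate}
print(tc_avg)  # {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```

Every statute implies the same 40.0% baseline, which suggests the deltas may be computed against a single TC-wide estimate rather than per-statute averages.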

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

1. This action is responsive to communications: RCE & Amendment, filed on 12/17/2025.

2. Claims 1-7 and 10-24 are pending in the case. Claims 1, 13 and 19 are independent claims. Claims 1-2, 10, 12-14 and 19 have been amended. Claims 8-9 are cancelled.

Response to Arguments

Applicant's arguments with respect to the claims, filed December 17, 2025, have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-4, 6-7, 10-11, 13-16, 18-22 and 24 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Yuki Yamamoto et al., US 2016/0379413 A1 ("Yamamoto").

Independent claim 1: Yamamoto discloses an electronic device, comprising: one or more processors (Fig. 3 "307"); and memory storing one or more programs (Fig. 3 "306"; Fig. 14; Para 110) configured to be executed by the one or more processors, the one or more programs including instructions for: presenting, via a display device (Fig. 2 "100"), a virtual object (Fig. 2 "101R, 101L"); detecting, via one or more sensors, user movement that occurs in a physical environment ("The status information acquisition unit 304 acquires information about the position and orientation or information about the orientation of the user's head, for example, in order to track the user's head movements" – Para 59; "The outward-facing camera 312 and the distance sensor may be used to detect the position, orientation, and shape of the body of the user wearing the head-mounted display 100" – Para 64); determining that at least a portion of the virtual object is presented in front of a real object in the physical environment from a perspective of a user viewing the display device (i.e., the virtual object is in front of the real object – Fig. 8); and in response to detecting the user movement and determining that at least the portion of the virtual object is presented in front of the real object in the physical environment from the perspective of the user viewing the display device: determining whether the detected user movement is directed to the real object or the portion of the virtual object presented in front of the real object from the perspective of the user viewing the display device ("acquire other information as status information about the user wearing the head-mounted display 100, such as the user's operating status (whether or not the user is wearing the head-mounted display 100), the user's behavioral status (a movement status such as being still, walking, or running, gestures made with the hands or fingers, the open/closed status of the eyelids, the gaze direction, and the size of the pupils)" – Para 60; "as long as the hand remains within a fixed distance from the head-mounted display 100, the physical body is tracked and a three-dimensional model M_I of the physical body is placed in the virtual world and continually displayed. If the hand moves away farther than the fixed distance, the physical body is no longer detected, and the display of the three-dimensional model M_I also disappears" – Para 101); in accordance with a determination that the detected user movement is directed to the real object, modifying a visual appearance of the virtual object, wherein modifying the visual appearance of the virtual object comprises presenting at least a portion of the real object ("Note that in the examples illustrated in FIGS. 15 to 17, a virtual object does not exist in front of the corresponding region R_V in the virtual space V corresponding to the three-dimensional model M_I of the object, and thus the entire three-dimensional model M_I is displayed as illustrated in FIG. 17" – Para 114); and in accordance with a determination that the detected user movement is directed to the portion of the virtual object presented in front of the real object from the perspective of the user viewing the display device, maintaining the presentation of at least the portion of the virtual object in front of the real object from the perspective of the user viewing the display device ("However, if a virtual object exists in front of the object in the real world (in the examples illustrated in FIGS. 15 to 17, the PET bottle), the three-dimensional model M_I is displayed obscured behind the virtual object" – Para 114).

Claim 2: Yamamoto discloses the electronic device of claim 1, wherein the determination that the detected user movement is directed to the portion of the virtual object presented in front of the real object from the perspective of the user viewing the display device includes a determination that a distance between the portion of the virtual object and a location of the user movement does not exceed a threshold distance ("For example, as the user brings his or her hand holding a physical body such as a cup close to the head-mounted display 100, the physical body is detected as an object automatically, and as long as the hand remains within a fixed distance from the head-mounted display 100, the physical body is tracked and a three-dimensional model M_I of the physical body is placed in the virtual world and continually displayed. If the hand moves away farther than the fixed distance, the physical body is no longer detected, and the display of the three-dimensional model M_I also disappears" – Para 101).

Claim 3: Yamamoto discloses the electronic device of claim 1, wherein the one or more programs further include instructions for: detecting, via the one or more sensors, a user pose that occurs in the physical environment, wherein the determination that the detected user movement is directed to the real object includes a determination that the detected user pose corresponds to a feature of the real object (Para 64 and Para 101, quoted above).

Claim 4: Yamamoto discloses the electronic device of claim 1, wherein the one or more programs further include instructions for: detecting, via the one or more sensors, a user pose that occurs in the physical environment, wherein the determination that the detected user movement is directed to the real object includes a determination that the detected user pose does not correspond to a feature of the virtual object (Para 64 and Para 101, quoted above).

Claim 6: Yamamoto discloses the electronic device of claim 1, wherein the one or more programs further include instructions for: detecting, via the one or more sensors, a user gaze, wherein the determination that the detected user movement is directed to the real object includes a determination that the detected user gaze is directed to the real object (Para 60, quoted above; "Herein, the object in the real world is part of the body of the user wearing the head-mounted display… The head-mounted display 100, upon detecting an object in the real world using the outward-facing camera 312 and the distance sensor, tracks the detected object, places a three-dimensional model of the detected object in the virtual world, and conducts a rendering process" – Para 100).

Claim 7: Yamamoto discloses the electronic device of claim 1, wherein modifying the visual appearance of the virtual object includes: in accordance with a determination that the detected user movement is directed to the real object with a first level of confidence, modifying the visual appearance of the virtual object by a first magnitude (Figs. 9 & 10); and in accordance with a determination that the detected user movement is directed to the real object with a second level of confidence different from the first level of confidence, modifying the visual appearance of the virtual object by a second magnitude different from the first magnitude (Figs. 9 & 10).

Claim 10: Yamamoto discloses the electronic device of claim 1, wherein determining whether the detected user movement is directed to the real object or the portion of the virtual object presented in front of the real object from the perspective of the user viewing the display device includes predicting where the detected user movement will stop (Para 60 and Para 101, quoted above).

Claim 11: Yamamoto discloses the electronic device of claim 1, wherein modifying the visual appearance of the virtual object includes ceasing to present at least a portion of the virtual object (Figs. 11, 15).

Independent claim 13: the claim is similar in scope to claim 1. Therefore, the rejection of claim 1 applies herein.

Independent claim 19: the claim is similar in scope to claim 1. Therefore, the rejection of claim 1 applies herein.

Claims 14-16, 18, 20-22 and 24: the corresponding rationale as applied in the rejection of claims 1-4 and 6-11 applies herein.

Claims 5, 12, 17 and 23 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHANTE HARRISON, whose telephone number is (571) 272-7659. The examiner can normally be reached Monday - Friday, 8:00 am to 5:00 pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Alicia Harrington, can be reached at 571-272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHANTE E HARRISON/
Primary Examiner, Art Unit 2615

Prosecution Timeline

Oct 31, 2023 — Application Filed
Jun 11, 2024 — Non-Final Rejection (§102)
Jul 18, 2024 — Examiner Interview Summary
Jul 18, 2024 — Applicant Interview (Telephonic)
Jul 29, 2024 — Response Filed
Aug 28, 2024 — Final Rejection (§102)
Oct 15, 2024 — Notice of Allowance
Dec 16, 2024 — Response after Non-Final Action
Dec 30, 2024 — Response after Non-Final Action
Mar 31, 2025 — Response after Non-Final Action
May 28, 2025 — Response after Non-Final Action
May 28, 2025 — Response after Non-Final Action
May 28, 2025 — Response after Non-Final Action
May 29, 2025 — Response after Non-Final Action
May 29, 2025 — Response after Non-Final Action
Sep 22, 2025 — Response after Non-Final Action
Oct 16, 2025 — Response after Non-Final Action
Oct 22, 2025 — Response after Non-Final Action
Dec 17, 2025 — Request for Continued Examination
Jan 15, 2026 — Response after Non-Final Action
Mar 19, 2026 — Examiner Interview (Telephonic)
Mar 24, 2026 — Non-Final Rejection (§102, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597213
GESTURE BASED TACTILE INTERACTION IN EXTENDED REALITY USING FORM FACTOR OF A PHYSICAL OBJECT
2y 5m to grant • Granted Apr 07, 2026
Patent 12592043
Systems, Methods, and Graphical User Interfaces for Displaying and Manipulating Virtual Objects in Augmented Reality Environments
2y 5m to grant • Granted Mar 31, 2026
Patent 12592045
AUGMENTED REALITY SYSTEM AND METHOD
2y 5m to grant • Granted Mar 31, 2026
Patent 12586322
OPTICAL DEVICE FOR AUGMENTED REALITY HAVING GHOST IMAGE PREVENTION FUNCTION
2y 5m to grant • Granted Mar 24, 2026
Patent 12561891
GRAPHICS PROCESSORS
2y 5m to grant • Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 69%
With Interview: 97% (+28.8%)
Median Time to Grant: 3y 4m
PTA Risk: High
Based on 725 resolved cases by this examiner. Grant probability derived from career allow rate.
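A minimal sketch of how these projection figures may follow from the career data above, assuming (per the footnote) that the base probability is the career allow rate and that the interview lift is additive in percentage points. The dashboard does not state its exact formula, so treat this as an illustration only:

```python
# Examiner's career record, as reported in Examiner Intelligence.
granted = 497
resolved = 725
interview_lift_pct = 28.8  # percentage-point lift with an interview

# Base grant probability: the career allow rate.
career_allow_rate = 100 * granted / resolved  # about 68.55%
print(round(career_allow_rate))               # 69

# Projection with an examiner interview, assuming an additive
# percentage-point lift, capped at 100%.
with_interview = min(career_allow_rate + interview_lift_pct, 100.0)
print(round(with_interview))                  # 97
```

Under this additive assumption the reported 69% and 97% are mutually consistent with the 497/725 record and the +28.8% lift.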
