Prosecution Insights
Last updated: April 19, 2026
Application No. 18/654,595

TRACKING AND DRIFT CORRECTION

Final Rejection — §102, §112, §DP
Filed: May 03, 2024
Examiner: DAVIS, DAVID DONALD
Art Unit: 2627
Tech Center: 2600 — Communications
Assignee: Apple Inc.
OA Round: 2 (Final)

Grant Probability: 70% (Favorable)
OA Rounds: 3-4
To Grant: 3y 2m
With Interview: 79%

Examiner Intelligence

Career Allow Rate: 70% (above average; 631 granted / 900 resolved; +8.1% vs TC avg)
Interview Lift: +9.1% (moderate), comparing resolved cases with vs without an interview
Avg Prosecution: 3y 2m typical timeline; 41 applications currently pending
Career History: 941 total applications across all art units
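
As a rough cross-check, these figures fit together with simple arithmetic. The sketch below assumes the interview lift is measured against the career allow rate; the 62% Tech Center average and the 198-of-250 with-interview split are hypothetical placeholders chosen only to reproduce the displayed +8.1% and +9.1%.

```python
# Minimal sketch of the examiner stats above. The 631/900 counts come from
# the page; tc_avg and the with-interview split are HYPOTHETICAL values,
# chosen only so the output matches the displayed figures.

granted, resolved = 631, 900
allow_rate = granted / resolved                      # 0.701 -> shown as 70%
print(f"Career allow rate: {allow_rate:.1%}")        # -> 70.1%

tc_avg = 0.62                                        # assumed TC 2600 average
print(f"vs TC avg: {allow_rate - tc_avg:+.1%}")      # -> +8.1%

# Interview lift, read here as the with-interview allow rate minus the
# career rate (the 198/250 split is illustrative, not reported data).
with_interview_rate = 198 / 250                      # 0.792
print(f"Interview lift: {with_interview_rate - allow_rate:+.1%}")  # -> +9.1%
```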

Statute-Specific Performance

§101: 1.2% (-38.8% vs TC avg)
§103: 41.6% (+1.6% vs TC avg)
§102: 40.8% (+0.8% vs TC avg)
§112: 10.6% (-29.4% vs TC avg)
Tech Center averages are estimates. Based on career data from 900 resolved cases.
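
To make the deltas concrete, the sketch below subtracts an estimated Tech Center average from each per-statute rate. The rates are taken from the list above; the 40.0% TC averages are not reported on the page and are back-computed from the displayed deltas.

```python
# The "vs TC avg" deltas above are the examiner's per-statute rate minus
# the Tech Center average estimate. Rates come from the page; the 40.0%
# TC averages below are back-computed from the displayed deltas.

examiner_rates = {"101": 0.012, "103": 0.416, "102": 0.408, "112": 0.106}
tc_averages    = {"101": 0.400, "103": 0.400, "102": 0.400, "112": 0.400}

for statute, rate in examiner_rates.items():
    delta = rate - tc_averages[statute]
    print(f"§{statute}: {rate:.1%} ({delta:+.1%} vs TC avg)")
```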

Office Action

§102, §112, §DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on September 12, 2024; September 23, 2024; December 16, 2024; and April 11, 2025 have been considered by the examiner.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.

Claim Rejections - 35 USC § 112

Claims 17-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Specifically, in line 3 of claim 17, “the image sensor” is indefinite because it lacks antecedent basis.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 2-4, 10-12 and 18-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 1 of U.S. Patent No. 11,036,284 in view of Lyons et al (WO 2018/106675).

Regarding claims 2, 10 and 18, Lyons et al is silent as to: The method of claim 1, wherein the image 210 comprises a marker displayed by the touch screen display of the second device 255, wherein relative 3D position and orientation of the second device 255 to the first device 160-1/184/270 is determined based on the marker.

With respect to claim 1, U.S. Patent No. 11,036,284 sets forth: obtaining an image of a physical environment using the image sensor, the image comprising a marker displayed on a second display of a second device; determining a relative position and orientation of the second device to the first device based on the marker; and generating a control signal based on the relative position and orientation of the second device.

It would have been obvious to a person having ordinary skill in the art at the time the invention was effectively filed to provide Lyons et al with obtaining an image of a physical environment using the image sensor, the image comprising a marker displayed on a second display of a second device; determining a relative position and orientation of the second device to the first device based on the marker; and generating a control signal based on the relative position and orientation of the second device, as set forth in U.S. Patent No. 11,036,284. The rationale is as follows: one of ordinary skill in the art at the time the invention was effectively filed would have been motivated to obtain an image of a physical environment using the image sensor, the image comprising a marker displayed on a second display of a second device; determine a relative position and orientation of the second device to the first device based on the marker; and generate a control signal based on the relative position and orientation of the second device, so as to provide a simplified way of tracking a device.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 5-9 and 13-17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Lyons et al (WO 2018/106675).

As per claim 1, Lyons et al discloses: A method comprising: at a first device 160-1/184/270 comprising a processor 172, a computer-readable storage medium 176 {figure 1}, an image sensor 175 {[0036] In an exemplary embodiment, sensor 175 may be at least an audio sensor such as a microphone, a visual sensor such as a camera (video or picture) . . . }, and a first display: obtaining an image 210 of a physical environment 250 using the image sensor 175, the physical environment 250 comprising a second device 255 different than the first device 160-1/184/270, the second device 255 having a touch screen display; displaying, on the first display, a view of an environment based on the image 210, the view comprising a representation 215 of the second device 255 with a user interface object positioned at a location corresponding to the location of the touch screen display {[0052] According to the present disclosure, the physical device 255 does not need to match the virtual devices 215 and 220; only the interface of the physical device 255 matches the interface of a virtual device (215 or 220) when the physical device 255 is coupled to the respective virtual device (215 or 220). For example, the physical device 255 may be a tablet, the first virtual device 215 may be a remote control and the second virtual device 220 may be a smart phone. In this example, when the physical device 255 is coupled to the first virtual device 215, the user interface of the physical device 255 matches the buttons of the remote control 215. And when the physical device 255 is coupled to the second virtual device 220, the user interface of the physical device 255 matches the user interface of the smart phone 220.}; obtaining data indicative of a touch event on the second device 255 corresponding to the location of the user interface object {figure 2A}; and initiating a user interface interaction associated on the first device 160-1/184/270 based on the touch event on the second device 255 {[0049] As a result, the user is able to interact with the virtual world through the physical device, experiencing a sense of touch and utilizing the physical device as a user interface. As the user moves to select a second virtual device, the VR system dynamically and automatically warps or transforms the VR scene to better align the physical device with the second virtual device and couples the physical device with the second virtual device. The user is then allowed to interact with the second virtual device through the physical device.}.

As per claims 5 and 13, Lyons et al discloses: The method of claim 1, wherein the environment is a computer-generated reality (CGR) environment {[0050] FIG. 2A also illustrates a virtual space 210 representing a VR scene displayed by HMD 270 including two virtual devices (a first virtual device 215 and a second virtual device 220) and a virtual arm/hand 225 corresponding to the user's arm 265 in the physical space.}.

As per claims 6 and 14, Lyons et al discloses: The method of claim 5, wherein the interaction corresponds to an interaction with the CGR environment {[0050] FIG. 2A also illustrates a virtual space 210 representing a VR scene displayed by HMD 270 including two virtual devices (a first virtual device 215 and a second virtual device 220) and a virtual arm/hand 225 corresponding to the user's arm 265 in the physical space.}.

As per claims 7 and 15, Lyons et al discloses: The method of claim 1, wherein obtaining the data indicative of the touch event comprises the first device 160-1/184/270 receiving the data from the second device 255, the second device 255 sending the data based on detecting a touch screen input {figure 1 & [0052] In this example, when the physical device 255 is coupled to the first virtual device 215, the user interface of the physical device 255 matches the buttons of the remote control 215.}.

As per claims 8 and 16, Lyons et al discloses: The method of claim 1, wherein the user interface object comprises a button {[0052] In this example, when the physical device 255 is coupled to the first virtual device 215, the user interface of the physical device 255 matches the buttons of the remote control 215.}.

As per claim 9, Lyons et al discloses: A first device 160-1/184/270 comprising: a non-transitory computer-readable storage medium 176 {figure 1}; and one or more processors 172 coupled to the non-transitory computer-readable storage medium 176, wherein the non-transitory computer-readable storage medium 176 comprises program instructions that, when executed on the one or more processors 172, cause the first device 160-1/184/270 to perform operations comprising: obtaining an image 210 of a physical environment 250 using the image sensor 175 {[0036] In an exemplary embodiment, sensor 175 may be at least an audio sensor such as a microphone, a visual sensor such as a camera (video or picture) . . . }, the physical environment 250 comprising a second device 255 different than the first device 160-1/184/270, the second device 255 having a touch screen display; displaying, on the first display, a view of an environment based on the image 210, the view comprising a representation of the second device 255 with a user interface object positioned at a location corresponding to the location of the touch screen display {[0052] According to the present disclosure, the physical device 255 does not need to match the virtual devices 215 and 220; only the interface of the physical device 255 matches the interface of a virtual device (215 or 220) when the physical device 255 is coupled to the respective virtual device (215 or 220). For example, the physical device 255 may be a tablet, the first virtual device 215 may be a remote control and the second virtual device 220 may be a smart phone. In this example, when the physical device 255 is coupled to the first virtual device 215, the user interface of the physical device 255 matches the buttons of the remote control 215. And when the physical device 255 is coupled to the second virtual device 220, the user interface of the physical device 255 matches the user interface of the smart phone 220.}; obtaining data indicative of a touch event on the second device 255 corresponding to the location of the user interface object {figure 2A}; and initiating a user interface interaction associated on the first device 160-1/184/270 based on the touch event on the second device 255 {[0049] As a result, the user is able to interact with the virtual world through the physical device, experiencing a sense of touch and utilizing the physical device as a user interface. As the user moves to select a second virtual device, the VR system dynamically and automatically warps or transforms the VR scene to better align the physical device with the second virtual device and couples the physical device with the second virtual device. The user is then allowed to interact with the second virtual device through the physical device.}.

As per claim 17, Lyons et al discloses: A non-transitory computer-readable storage medium 176 {figure 1} storing program instructions executable via one or more processors 172 to perform operations comprising: obtaining an image 210 of a physical environment 250 using the image sensor 175 {[0036] In an exemplary embodiment, sensor 175 may be at least an audio sensor such as a microphone, a visual sensor such as a camera (video or picture) . . . }, the physical environment 250 comprising a second device 255 different than the first device 160-1/184/270, the second device 255 having a touch screen display; displaying, on the first display, a view of an environment based on the image 210, the view comprising a representation of the second device 255 with a user interface object positioned at a location corresponding to the location of the touch screen display {[0036] In an exemplary embodiment, sensor 175 may be at least an audio sensor such as a microphone, a visual sensor such as a camera (video or picture) . . . }; obtaining data indicative of a touch event on the second device 255 corresponding to the location of the user interface object {figure 2A}; and initiating a user interface interaction associated on the first device 160-1/184/270 based on the touch event on the second device 255 {[0049] As a result, the user is able to interact with the virtual world through the physical device, experiencing a sense of touch and utilizing the physical device as a user interface. As the user moves to select a second virtual device, the VR system dynamically and automatically warps or transforms the VR scene to better align the physical device with the second virtual device and couples the physical device with the second virtual device. The user is then allowed to interact with the second virtual device through the physical device.}.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID D DAVIS, whose telephone number is (571) 272-7572. The examiner can normally be reached Monday - Friday, 8 a.m. - 4 p.m.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ke Xiao, can be reached at 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DAVID D DAVIS/
Primary Examiner, Art Unit 2627

Prosecution Timeline

May 03, 2024: Application Filed
Feb 25, 2025: Response after Non-Final Action
Jun 10, 2025: Non-Final Rejection — §102, §112, §DP
Sep 11, 2025: Applicant Interview (Telephonic)
Sep 11, 2025: Examiner Interview Summary
Sep 12, 2025: Response Filed
Dec 19, 2025: Final Rejection — §102, §112, §DP
Mar 25, 2026: Examiner Interview Summary
Mar 25, 2026: Applicant Interview (Telephonic)

Precedent Cases

Applications with similar technology granted by this examiner

Patent 12602106
Ambience-Driven User Experience
2y 5m to grant · Granted Apr 14, 2026
Patent 12602128
DISPLAY DEVICE HAVING PIXEL DRIVE CIRCUITS AND SENSOR DRIVE CIRCUITS
2y 5m to grant · Granted Apr 14, 2026
Patent 12602121
TOUCH DEVICE FOR PASSIVE RESONANT STYLUS, DRIVING METHOD FOR THE SAME AND TOUCH SYSTEM
2y 5m to grant · Granted Apr 14, 2026
Patent 12596265
Aiming Device with a Diffractive Optical Element and Reflective Image Combiner
2y 5m to grant · Granted Apr 07, 2026
Patent 12592178
Display Device Including an Electrostatic Discharge Circuit for Discharging Static Electricity
2y 5m to grant · Granted Mar 31, 2026
Study what changed to get past this examiner. Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 70%
With Interview: 79% (+9.1%)
Median Time to Grant: 3y 2m
PTA Risk: Moderate
Based on 900 resolved cases by this examiner. Grant probability derived from career allow rate.
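
As a sanity check on these projections, here is a minimal sketch of the arithmetic, assuming the with-interview figure is simply the baseline grant probability plus the +9.1% interview lift; the 100% cap is an added assumption, not something the page states.

```python
# Sketch of the projection arithmetic: the baseline grant probability is
# the career allow rate, and the with-interview figure adds the observed
# +9.1% lift. Capping at 100% is an added assumption, not page data.

baseline = 631 / 900               # career allow rate, ~70%
interview_lift = 0.091             # from the examiner stats above
with_interview = min(baseline + interview_lift, 1.0)

print(f"Grant probability: {baseline:.0%}")        # -> 70%
print(f"With interview:    {with_interview:.0%}")  # -> 79%
```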
