Prosecution Insights
Last updated: April 19, 2026
Application No. 18/439,679

SYSTEMS AND METHODS OF DISPLAYING USER INTERFACES BASED ON TILT

Non-Final OA: §102, §103
Filed: Feb 12, 2024
Examiner: HUYNH, LINDA TANG
Art Unit: 2172
Tech Center: 2100 — Computer Architecture & Software
Assignee: Apple Inc.
OA Round: 1 (Non-Final)
Grant Probability: 36% (At Risk)
Expected OA Rounds: 1-2
Median Time to Grant: 3y 8m
Grant Probability With Interview: 68%

Examiner Intelligence

Career Allow Rate: 36% (100 granted / 274 resolved; -18.5% vs TC avg)
Interview Lift: strong, +31.9% (allow rate with vs. without an interview among resolved cases)
Avg Prosecution: 3y 8m typical timeline; 30 applications currently pending
Total Applications: 304 across all art units (career history)
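As a sanity check on how these headline figures relate, the sketch below reproduces them from the displayed counts. The without-interview baseline is an assumption inferred from the page (68% minus the +31.9% lift implies roughly 36.1%); it is not shown directly.

```python
# Minimal sketch reproducing the examiner statistics above from the displayed
# counts. The without-interview baseline is inferred (68% - 31.9% ~= 36.1%),
# not shown anywhere on the page.

granted = 100                         # "100 granted / 274 resolved"
resolved = 274
allow_rate_with_interview = 0.68      # displayed "With Interview" rate
allow_rate_without_interview = 0.361  # assumption, back-solved from the +31.9% lift

career_allow_rate = granted / resolved  # ~36.5%, displayed rounded as 36%
interview_lift = allow_rate_with_interview - allow_rate_without_interview

print(f"career allow rate: {career_allow_rate:.1%}")  # 36.5%
print(f"interview lift:    {interview_lift:+.1%}")    # +31.9%
```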

Statute-Specific Performance

§101: 9.8% (-30.2% vs TC avg)
§103: 53.4% (+13.4% vs TC avg)
§102: 13.4% (-26.6% vs TC avg)
§112: 18.6% (-21.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 274 resolved cases.
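Each "vs TC avg" delta above is simply the examiner's per-statute rate minus the Tech Center average estimate; notably, all four displayed deltas back-solve to the same 40.0% average. A short sketch using only the values shown above:

```python
# Each "vs TC avg" delta is the examiner's per-statute rate minus the Tech
# Center average estimate. The 40.0% figure is back-solved from the displayed
# rows (e.g. 53.4% - 13.4% = 40.0%), not independently sourced.

TC_AVG_ESTIMATE = 0.40

examiner_rates = {"§101": 0.098, "§103": 0.534, "§102": 0.134, "§112": 0.186}

for statute, rate in examiner_rates.items():
    print(f"{statute}: {rate:.1%} ({rate - TC_AVG_ESTIMATE:+.1%} vs TC avg)")
```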

Office Action

DETAILED ACTION

This Office Action is sent in response to Applicant's Communication received 02/12/2024 for 18439679. Claims 1-24 are presented.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 08/27/2024 was filed before the mailing date of a first action. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the IDS is being considered by the examiner.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-3, 6-7, 9-11, 14-15, 17-19, and 22-23 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Le et al. (US 20210102820 A1).

As to claim 1, Le discloses a method comprising: at an electronic device in communication with a display and one or more input devices [Fig. 2, para 0037-0038, 0040, 0044-0045, device with display and search function, navigation function, and measurement component receiving input]: while presenting, via the display, an extended reality environment, detecting, via the one or more input devices, a first input corresponding to a request to navigate to a first destination via a navigation application [Fig. 1A, para 0027-0028, 0038, 0044, device display displays graphical user interface (read: extended reality environment; note the device user interface falls under the broadest reasonable interpretation of an extended reality environment including objects generated by a computer, consistent with Applicant's specification [para 0003, 0023]) and navigation function runs search (read: first input) defining route to destination]; in response to detecting the first input, initiating navigation to the first destination via the navigation application, including displaying, via the display, one or more virtual objects in the extended reality environment, wherein the one or more virtual objects are associated with navigating to the first destination [Fig. 1A, para 0027-0029, 0038, 0044, navigation function defines navigation route to destination and presents map view including map area with point of interest objects (read: virtual objects) to traverse to destination]; while displaying the one or more virtual objects, detecting, via the one or more input devices, a second input that includes movement of a viewpoint of the electronic device [Figs. 3A-3C, para 0034, 0053-0054, 0059, 0088, device displays map view and detects user input inclining device changing device field of view]; and in response to detecting the second input: in accordance with a determination that the second input satisfies one or more first criteria, including a criterion that is satisfied when the movement of the viewpoint exceeds a threshold movement or results in an elevation of the electronic device being within a range of elevations, replacing display, via the display, of the one or more virtual objects with a virtual user interface that includes a current route to the first destination in the extended reality environment [Figs. 1A-1B, 5A-6, para 0030, 0034-0035, 0088, graphical user interface transitions from displaying map view to augmented reality view (read: virtual user interface) including object corresponding to navigation route to destination when determining tilt degree of user input is considered (read: satisfies) at a large degree (read: first criteria)]; and in accordance with a determination that the second input does not satisfy the one or more first criteria, maintaining display of the one or more virtual objects in the extended reality environment [Figs. 1A-1B, 5A-7, para 0030, 0034, 0088, graphical user interface presents map view when determining tilt degree of user input is less than a large degree (read: does not satisfy)].

As to claim 2, Le discloses the method of claim 1, wherein the first input includes interaction with a virtual search user interface associated with the navigation application [para 0038, 0040, 0044, user defines route to destination with navigation function with search run by search function displayed in graphical user interface].

As to claim 3, Le discloses the method of claim 1, wherein the one or more virtual objects associated with navigating to the first destination include: a first element that is configured to point in a direction of the first destination relative to a current location of the electronic device in the extended reality environment [Fig. 1A, para 0028-0029, map view includes point of interest object (read: first element) including arrow symbol signifying turn (read: direction) along route from user present position displayed in graphical user interface]; one or more textual elements that provide visual cues for navigating to the first destination from a location corresponding to the electronic device in the extended reality environment [Fig. 1A, para 0028-0029, map view includes text content (read: textual element) with information about point of interest object along route from user present position to destination displayed in graphical user interface]; and a first option [para 0034, virtual button].

As to claim 6, Le discloses the method of claim 1, wherein the determination that the movement of the viewpoint of the electronic device exceeds the threshold movement is in accordance with a determination that a vertical component of the movement of the viewpoint of the user exceeds a threshold angle relative to a reference [Figs. 3A-3C, para 0034, 0053-0054, 0059, 0088, change device field of view as function of degree of tilt considered large (read: threshold angle) along vertical axis (read: vertical component) with regard to reference].

As to claim 7, Le discloses the method of claim 1, wherein the virtual user interface that includes the current route to the first destination in the extended reality environment includes: a representation of a map of a physical environment surrounding the electronic device [Fig. 1B, para 0030-0031, graphical user interface displaying augmented reality view includes map area (read: map representation) presenting map relative to device at current position (read: physical environment)]; a visual indication of a location corresponding to the electronic device in the physical environment overlaid on the representation of the map of the physical environment [Fig. 1B, para 0030-0031, augmented reality view includes arrow (read: visual indication) indicating device location at current position on map at current location]; a representation of the current route that is displayed between the visual indication and a first user interface object corresponding to the first destination and is overlaid on the representation of the map [Figs. 1A-1B, 4A, para 0029-0031, 0075, augmented reality view includes marked route on map extending from current device location to point of interest object (read: first user interface object) including destination]; and a plurality of visual indications of a plurality of locations in the physical environment overlaid on the representation of the map [Fig. 1B, para 0031-0032, 0035, augmented reality view includes point of interest objects (read: visual indications) placed at locations on map corresponding to physical locations relative to current position].

As to claim 9, Le discloses an electronic device comprising: one or more processors; memory; and one or more programs stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing a method [Fig. 13, para 0128, device includes processor and memory storing instructions executed by processor] comprising limitations substantially similar to those recited in claim 1, and it is rejected under similar rationale.

As to claims 10-11 and 14-15, Le discloses the electronic device of claim 9 comprising limitations substantially similar to those recited in claims 2-3 and 6-7, respectively, and they are rejected under similar rationale.

As to claim 17, Le discloses a non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to perform a method [Fig. 13, para 0128, memory stores instructions executed by device processor] comprising limitations substantially similar to those recited in claim 1, and it is rejected under similar rationale.

As to claims 18-19 and 22-23, Le discloses the non-transitory computer readable storage medium of claim 17 comprising limitations substantially similar to those recited in claims 2-3 and 6-7, respectively, and they are rejected under similar rationale.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 4-5, 8, 12-13, 16, 20-21, and 24 are rejected under 35 U.S.C. 103 as being unpatentable over Le as applied to claims 1, 9, and 17 above, and further in view of Chung et al. (US 20170176208 A1).

As to claim 4, Le discloses the method of claim 3, further comprising: while navigating to the first destination and while the one or more virtual objects are displayed in the extended reality environment, detecting a third input that includes [movement] of the viewpoint within the extended reality environment [Figs. 1A, 3A, 3C, 4A-4B, para 0031, 0058-0059, 0075-0077, graphical user interface displays map view including navigation instructions to destination and device detects rotation (read: third input) changing field of view of device displaying map]; and in response to detecting the third input, in accordance with a determination that the third input causes the location corresponding to the electronic device to change in the extended reality environment, updating display, via the display, of the one or more virtual objects [Figs. 3A, 3C, 4A-4B, para 0050, 0058-0059, 0075-0078, rotate graphical user interface displaying map view based on determining device rotation changing device orientation at present position], including: changing the direction in which the first element is pointed in the extended reality environment [Figs. 4A-4B, para 0075-0078, graphical user interface displays rotated orientation of arrow symbol turning along route]; or updating text in the one or more textual elements to provide updated visual cues for navigating to the first destination based on a change in the location corresponding to the electronic device in the extended reality environment. However, Le does not specifically disclose wherein "[movement] of the viewpoint" is "translation of the viewpoint". Chung discloses translation of the viewpoint in the extended reality environment [Figs. 8A-8C, para 0118, screen interface (read: extended reality environment) displays live video (read: viewpoint) of device moving forward]. Le and Chung are analogous art to the claimed invention, being from a similar field of endeavor of map user interface systems. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the movement of the viewpoint as disclosed by Le with the movement including a translation as disclosed by Chung with a reasonable expectation of success. One of ordinary skill in the art would be motivated to modify Le as described above to easily distinguish map directions [Chung, para 0005].

As to claim 5, Le discloses the method of claim 3, wherein: the first option is selectable to concurrently display, with the one or more virtual objects, one or more second options in the extended reality environment [para 0034-0035, actuate virtual button to display augmented reality view with map area and additional point of interest objects (read: second options) in graphical user interface]. However, Le does not specifically disclose the one or more second options include: a map option that is selectable to display the virtual user interface that includes the current route to the first destination in the extended reality environment; a first respective textual element indicating an estimated time of arrival at the first destination; a second respective textual element indicating a distance between the location corresponding to the user and the first destination; or a second option that is selectable to cease the navigation to the first destination. Chung discloses the one or more second options include: a map option that is selectable to display the virtual user interface that includes the current route to the first destination in the extended reality environment; a first respective textual element indicating an estimated time of arrival at the first destination; a second respective textual element indicating a distance between the location corresponding to the user and the first destination [Figs. 10A-10C, para 0109, 0124-0126, display object including distance information from current device location to destination, note single second option selected from group]; or a second option that is selectable to cease the navigation to the first destination. Le and Chung are analogous art to the claimed invention, being from a similar field of endeavor of map user interface systems. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the second options as disclosed by Le with an option of a textual element indicating a distance between a user location and destination as disclosed by Chung with a reasonable expectation of success. One of ordinary skill in the art would be motivated to modify Le as described above to easily distinguish map directions [Chung, para 0005].

As to claim 8, Le discloses the method of claim 1, further comprising: while navigating to the first destination and while the one or more virtual objects are displayed in the extended reality environment, detecting a third input that includes [movement] of the viewpoint in the extended reality environment … [Figs. 1A, 3A, 3C, 4A-4B, para 0031, 0058-0059, 0075-0077, graphical user interface displays map view including navigation instructions to destination and device detects rotation (read: third input) changing field of view of device displaying map]; and in response to detecting the third input, replacing display, via the display, of the one or more virtual objects [Figs. 3A, 3C, 4A-4B, para 0050, 0058-0059, 0075-0078, rotate map view based on determining device rotation] with: a first virtual object that is associated with the navigation application, wherein the first virtual object includes a first visual indication corresponding to the first destination and information corresponding to the first destination [Figs. 4A-4B, para 0075-0078, display legend location (read: first virtual object) with rotated arrow symbol (read: first visual indication) with orientation (read: information) toward point of interest object along destination route]; and a textual indication that indicates a location corresponding to the electronic device in the extended reality environment … of the first destination [Fig. 4B, para 0029, 0076-0078, graphical user interface displays rotated map view including text content (read: textual indication) providing direction to point of interest (read: location) from device location along route to destination]. However, Le does not specifically disclose translation of the viewpoint in the extended reality environment to within a threshold distance of the first destination; and a textual indication that indicates a location corresponding to the electronic device in the extended reality environment is within the threshold distance of the first destination. Chung discloses: translation of the viewpoint in the extended reality environment to within a threshold distance of the first destination [Figs. 10A-10C, para 0123-0126, screen interface displays live video (read: viewpoint) about current location of device within specified distance (read: threshold distance) from destination in screen interface]; and a textual indication that indicates a location corresponding to the electronic device in the extended reality environment is within the threshold distance of the first destination [para 0123, 0126, 0139-0140, screen interface displays display object (read: textual indication) including text information about distance from current device location to destination]. Le and Chung are analogous art to the claimed invention, being from a similar field of endeavor of map user interface systems. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the movement of the viewpoint and textual indication as disclosed by Le with the movement as a translation of the viewpoint to within a threshold distance of the destination and a textual indication indicating a location within the threshold distance of the destination as disclosed by Chung with a reasonable expectation of success. One of ordinary skill in the art would be motivated to modify Le as described above to easily distinguish map directions [Chung, para 0005].

As to claims 12, 13, and 16, Le and Chung, combined at least for the reasons above, disclose the electronic device of claim 9 comprising limitations substantially similar to those recited in claims 4, 5, and 8, respectively, and they are rejected under similar rationale.

As to claims 20, 21, and 24, Le and Chung, combined at least for the reasons above, disclose the non-transitory computer readable storage medium of claim 17 comprising limitations substantially similar to those recited in claims 4, 5, and 8, respectively, and they are rejected under similar rationale.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Mabbutt et al. (US 20140267400 A1) generally discloses an augmented reality environment displaying additional map elements based on device tilt.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to LINDA HUYNH, whose telephone number is (571) 272-5240 and email is linda.huynh@uspto.gov. The examiner can normally be reached M-F between 9am-5pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Queler, can be reached at (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/LINDA HUYNH/
Primary Examiner, Art Unit 2172
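For readers mapping the rejection back to the claim language, the disputed claim-1 limitation reduces to a tilt/elevation-gated view switch. Below is a minimal, hypothetical sketch of that control flow; the names and threshold values are illustrative only and come from neither Le nor the application.

```python
# Hypothetical sketch of the claim-1 control flow: replace the in-environment
# navigation objects with a route user interface when viewpoint movement
# exceeds a threshold OR device elevation falls in a configured range.
# Names and numeric values are illustrative, not from Le or the application.

from dataclasses import dataclass

TILT_THRESHOLD_DEG = 45.0        # stands in for the claimed "threshold movement"
ELEVATION_RANGE_M = (0.3, 1.2)   # stands in for the claimed "range of elevations"

@dataclass
class ViewpointInput:
    tilt_deg: float       # vertical component of the viewpoint movement
    elevation_m: float    # device elevation after the movement

def select_view(inp: ViewpointInput) -> str:
    """Return which UI to display after a viewpoint-movement input."""
    exceeds_threshold = inp.tilt_deg > TILT_THRESHOLD_DEG
    in_elevation_range = ELEVATION_RANGE_M[0] <= inp.elevation_m <= ELEVATION_RANGE_M[1]
    if exceeds_threshold or in_elevation_range:
        return "route_ui"         # replace virtual objects with current-route UI
    return "virtual_objects"      # criteria not satisfied: maintain display

assert select_view(ViewpointInput(tilt_deg=60.0, elevation_m=2.0)) == "route_ui"
assert select_view(ViewpointInput(tilt_deg=10.0, elevation_m=2.0)) == "virtual_objects"
```

The disjunctive criterion (threshold movement or elevation range) is modeled as a single boolean test, tracking the claim's "one or more first criteria" phrasing; the examiner reads Le's large-tilt map-to-AR transition onto only the tilt branch.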

Prosecution Timeline

Feb 12, 2024: Application Filed
Dec 30, 2025: Non-Final Rejection — §102, §103
Feb 06, 2026: Interview Requested
Feb 19, 2026: Examiner Interview Summary
Feb 19, 2026: Applicant Interview (Telephonic)

Precedent Cases

Applications granted by the same examiner involving similar technology

Patent 12578837: USER INTERFACES FOR MANAGING SHARING OF CONTENT IN THREE-DIMENSIONAL ENVIRONMENTS
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12547310: INFORMATION PROCESSING DEVICE
Granted Feb 10, 2026 (2y 5m to grant)

Patent 12541287: INTEGRATED ENERGY DATA SCIENCE PLATFORM
Granted Feb 03, 2026 (2y 5m to grant)

Patent 12524136: EVENT TRANSCRIPT PRESENTATION
Granted Jan 13, 2026 (2y 5m to grant)

Patent 12524124: RECORDING FOLLOWING BEHAVIORS BETWEEN VIRTUAL OBJECTS AND USER AVATARS IN AR EXPERIENCES
Granted Jan 13, 2026 (2y 5m to grant)
Study what changed in these applications to get past this examiner. Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 36%
With Interview: 68% (+31.9%)
Median Time to Grant: 3y 8m
PTA Risk: Low

Based on 274 resolved cases by this examiner. Grant probability is derived from the career allow rate.

Free tier: 3 strategy analyses per month