Prosecution Insights
Last updated: April 19, 2026
Application No. 18/369,628

Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments

Non-Final Office Action (§102, §103)
Filed: Sep 18, 2023
Examiner: ANDERSON, BRODERICK C
Art Unit: 2178
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Apple Inc.
OA Round: 1 (Non-Final)
Grant Probability: 74% (Favorable)
Expected OA Rounds: 1-2
Expected Time to Grant: 3y 1m
Grant Probability with Interview: 93%

Examiner Intelligence

Career Allow Rate: 74% (above average): 190 granted / 258 resolved, +18.6% vs Tech Center average
Interview Lift: +19.1% (strong), measured across resolved cases with interview
Typical Timeline: 3y 1m average prosecution; 20 applications currently pending
Career History: 278 total applications across all art units

Statute-Specific Performance

§101: 9.8% (-30.2% vs TC avg)
§103: 60.1% (+20.1% vs TC avg)
§102: 18.4% (-21.6% vs TC avg)
§112: 7.1% (-32.9% vs TC avg)
Based on career data from 258 resolved cases; deltas are relative to the estimated Tech Center average.
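The examiner metrics above are simple ratios over the examiner's resolved cases. A quick sketch of how the displayed numbers relate (the Tech Center average here is inferred from the displayed +18.6% delta, not taken from the page):

```python
# Sketch: how the dashboard's headline examiner statistics relate.
# The counts (190 granted / 258 resolved) come from the page; the Tech
# Center average is back-computed from the displayed +18.6% delta.

granted, resolved = 190, 258

career_allow_rate = granted / resolved       # fraction of resolved cases granted
print(round(career_allow_rate * 100))        # displayed as "74%"

tc_average = career_allow_rate - 0.186       # implied Tech Center average
print(round(tc_average * 100, 1))            # roughly 55%
```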

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) was filed on 1/31/2024. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Specification

The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Drawings

The drawings filed 9/18/2023 were accepted.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1, 4, 6, 15, 20, and 25-29 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Solomon (US20150321558A1; filed 12/3/2012).

With regards to claim 1, Solomon discloses a method, comprising: at a device that includes or is in communication with one or more display generation components and one or more input devices (Solomon, Fig. 5: device with a touch display 2 and physical switches 30-34): while displaying via the one or more display generation components an application user interface, detecting a first input to an input device of the one or more input devices, the input device provided on a housing of the device that includes the one or more display generation components (Solomon, Fig. 5: the user's hand presses switch 33 while display components are shown); in response to detecting the first input to the input device provided on the housing of the device (Solomon, Fig. 1: physical switches 30-34 are on the housing of the device): replacing display of at least a portion of the application user interface by displaying a home menu user interface via the one or more display generation components (Solomon, paragraph 79: “These individual control switches are known as ‘direct access’ switches in the present disclosure. In other words, it allows the driver to jump directly to some application display or top menu, without requiring touch(es) on the touch screen;” the “top menu” is interpreted as the home menu, and the applications are interpreted as the other applications, such as the navigation or music applications described in paragraphs 56-57 and shown in Fig. 1; paragraph 80: “One of the ‘direct access’ switch is a top menu switch 30 which can be pressed to trigger (jump to) the display of the main menu of applications”); and while displaying via the one or more display generation components the home menu user interface (Solomon, paragraph 80: “display of the main menu of applications”), detecting a second input to the input device provided on the housing of the device (Solomon, paragraph 80: “top menu switch 30… also supports a toggle switching function (also called toggle access);” Fig. 1: input button 30 is a physical button); in response to detecting the second input to the input device provided on the housing of the device: dismissing the home menu user interface (Solomon, paragraphs 26-27: “toggle access switches, the method comprising the steps: A—switching, upon a first actuation of one of the toggle access switch, from a first display mode, currently displayed at the time of the first toggle access switch actuation, to a second display mode, B—returning to the first display mode upon a second subsequent actuation of the same toggle access switch;” the return to the first display mode upon toggling the button is interpreted as returning from the home menu (top menu) to the previous application).

With regards to claim 4, which depends on claim 1, Solomon discloses wherein the input device is a hardware button or a solid state button (Solomon, Fig. 1: switches 30-34 are hardware switches or buttons; paragraph 52: “individual control switches 30-34 which are formed as push buttons”).

With regards to claim 6, which depends on claim 1, Solomon discloses including, in response to detecting the first input to the input device, dismissing the application user interface prior to, or concurrently with displaying of the home menu user interface (Solomon, paragraph 80: “One of the ‘direct access’ switch is a top menu switch 30 which can be pressed to trigger (jump to) the display of the main menu of applications”).

With regards to claim 15, which depends on claim 1, Solomon discloses displaying in the home menu user interface representations of software applications executable on the device (Solomon, paragraph 80: “a top menu switch 30 which can be pressed to trigger (jump to) the display of the main menu of applications, as illustrated in FIG. 1, which is also an application itself;” Fig. 1: the applications are shown with their icons, and the user selecting one with a finger touch); detecting a seventh input directed to a respective representation of a software application in the representations of software applications executable on the device displayed in the home menu user interface (Solomon, Fig. 1: user is pressing an icon for an application; paragraph 79: “direct access and toggle access are alternative to touch actions available in a hierarchical menu”); and in response to detecting the seventh input directed to the respective representation of the software application: displaying an application user interface of the software application (Solomon, Fig. 1: user is pressing an icon for an application; paragraph 79: “direct access and toggle access are alternative to touch actions available in a hierarchical menu”).

With regards to claim 20, which depends on claim 1, Solomon discloses while displaying a first section of the home menu user interface, detecting a twelfth input to the input device provided on the housing of the device, and in response to detecting the twelfth input to the input device provided on the housing of the device: dismissing the home menu user interface (Solomon, paragraph 80: “top menu switch 30… also supports a toggle switching function (also called toggle access);” pressing it again will toggle the home menu and return to the application); and detecting a thirteenth input to the input device provided on the housing of the device, and in response to detecting the thirteenth input to the input device provided on the housing of the device: displaying the first section of the home menu user interface based on the thirteenth input (Solomon, paragraph 80: “top menu switch 30… also supports a toggle switching function (also called toggle access);” pressing it again will toggle the home menu and return to the home menu).
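The toggle-access behavior the rejection maps onto claims 1 and 20 (one press of a housing-mounted button replaces the application UI with the home menu; a second press restores the prior UI) can be sketched as follows. This is a minimal illustration; the class and method names are invented here, not drawn from the claims or from Solomon.

```python
# Illustrative sketch of toggle-access: a single housing-mounted button
# that swaps between the current application UI and a home menu.
# All identifiers are hypothetical.

class Device:
    def __init__(self, current_app: str):
        self.displayed = current_app   # what the display components show
        self._previous = None          # UI to restore when the menu is dismissed

    def press_menu_button(self):
        if self.displayed == "home_menu":
            # second input: dismiss the home menu, return to the prior UI
            self.displayed = self._previous
        else:
            # first input: replace the application UI with the home menu
            self._previous = self.displayed
            self.displayed = "home_menu"

d = Device("navigation")
d.press_menu_button()
assert d.displayed == "home_menu"   # first press shows the home menu
d.press_menu_button()
assert d.displayed == "navigation"  # second press toggles back
```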
With regards to claim 25, which depends on claim 1, Solomon discloses detecting a first number of inputs to the input device provided on the housing of the device within a first period of time, and in response to detecting the first number of inputs to the input device provided on the housing of the device within the first period of time, displaying an application management user interface (Solomon, paragraph 106: “It may be accessible by one of the above mentioned toggle switches or accessible by a double press on the menu button 30. In this case, the menu switch is toggle switch for Top Menu display and base dashboard display;” the determination of a double press or single press of the switch is interpreted as detecting a number of inputs within a time period, and the “application management user interface” could be interpreted as another name for the home menu, or the “base dashboard display”).

With regards to claim 26, which depends on claim 1, Solomon discloses while displaying, via the one or more display generation components, a system user interface (Solomon, paragraph 57: “a most frequent settings application 53”), detecting a respective input to the input device provided on the housing of the device, the respective input being a same type of input as the first input to the input device, and in response to detecting the respective input to the input device provided on the housing of the device while displaying the system user interface: replacing display of at least a portion of the system user interface by displaying the home menu user interface via the one or more display generation components (paragraph 80: “One of the ‘direct access’ switch is a top menu switch 30 which can be pressed to trigger (jump to) the display of the main menu of applications;” since the frequent settings application 53 (which is interpreted as a system user interface) is treated as an application, pressing the top menu switch will have the same behavior as when it is pressed as described in the rejection to claim 1).

With regards to claim 27, which depends on claim 1, Solomon discloses after dismissing the home menu user interface, and while the home menu user interface is not displayed (Solomon, paragraph 80: “One of the ‘direct access’ switch is a top menu switch 30… supports a toggle switching function;” examiner is interpreting this as the user toggling the top menu switch and returning to the previously used application), detecting a fourteenth input to the input device provided on the housing of the device (Solomon, paragraph 80: “a top menu switch 30 which can be pressed;” the fourteenth input is interpreted as just another pressing of the top menu switch); in response to detecting the fourteenth input to the input device provided on the housing of the device: redisplaying the home menu user interface via the one or more display generation components (Solomon, paragraph 80: “One of the ‘direct access’ switch is a top menu switch 30… supports a toggle switching function;” since it’s a toggle, this press would just display the home menu (top menu) again instead of the current application).

Claim 28 recites substantially similar limitations to claim 1 and is thus rejected along the same rationale. Claim 29 recites substantially similar limitations to claim 1 and is thus rejected along the same rationale.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 2-3, 12-14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Solomon in view of Schwarz et al (US20200225813A1; filed 3/11/2019).

With regards to claim 2, which depends on claim 1, Solomon does not disclose wherein the device is a head mounted device that includes the input device and the one or more display generation components, and the method includes generating a user interface that is visible to a user when the head mounted device is positioned on a head of the user, covering the user's eyes. However, Schwarz et al teaches wherein the device is a head mounted device that includes the input device and the one or more display generation components, and the method includes generating a user interface that is visible to a user when the head mounted device is positioned on a head of the user, covering the user's eyes (Schwarz et al, abstract: “a mixed-reality display device presents a mixed-reality environment to a user;” Fig. 8: user is wearing a head mounted device to view the GUI; Figs. 1-6: a user interface is shown; paragraph 5: “Such virtual menus may fulfill a variety of purposes, such as allowing users to select applications to execute in the mixed-reality environment.”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Schwarz et al such that the device was a head mounted mixed reality device. This would have enabled the invention to display the menus as virtual objects for the user to see and interact with (Schwarz et al, paragraph 2: “‘Augmented reality’ typically refers to virtual experiences where virtual objects are visually placed within the real world, such that a user experiences virtual content and the real world simultaneously”).

With regards to claim 3, which depends on claim 1, Solomon discloses wherein the home menu user interface is presented substantially in a central portion of … the device (Solomon, Fig. 1: “top menu” takes up the entire screen of the device). However, Solomon does not disclose a field of view of a user. Schwarz et al teaches a field of view of a user (Schwarz et al, abstract: “a mixed-reality display device presents a mixed-reality environment to a user;” paragraph 28: “the user will have the ability to move the menu around in the user's field of view in order to be able to see any portion of the field of view freely”). It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Schwarz et al such that the device was a head mounted mixed reality device. This would have enabled the invention to display the menus as virtual objects for the user to see and interact with (Schwarz et al, paragraph 2: “‘Augmented reality’ typically refers to virtual experiences where virtual objects are visually placed within the real world, such that a user experiences virtual content and the real world simultaneously”).
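Claim 3 recites presenting the home menu substantially in a central portion of a field of view. As a rough illustration only (dimensions arbitrary, function name invented here), centering a menu rectangle within a viewport reduces to:

```python
# Illustrative sketch: top-left placement that centers a menu rectangle
# in a viewport / field of view. All names and numbers are hypothetical.

def center_rect(view_w: int, view_h: int, menu_w: int, menu_h: int):
    """Return the top-left corner that centers a menu_w x menu_h menu."""
    return ((view_w - menu_w) // 2, (view_h - menu_h) // 2)

print(center_rect(1920, 1080, 800, 600))  # -> (560, 240)
```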
With regards to claim 12, which depends on claim 1, Solomon discloses wherein dismissing the home menu user interface includes replacing display of the home menu user interface (Solomon, paragraph 80: “One of the ‘direct access’ switch is a top menu switch 30… supports a toggle switching function;” examiner is interpreting this as the user toggling the top menu switch and returning to the previously used application). However, Solomon does not disclose, yet Schwarz et al teaches, presentation of a passthrough portion of a physical environment of the device via the one or more display generation components (Schwarz et al, abstract: “a mixed-reality display device presents a mixed-reality environment to a user”). It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Schwarz et al such that the device was a head mounted mixed reality device, and that dismissing the menu interface would result in the previous interface. This would have enabled the invention to display the menus as virtual objects for the user to see and interact with (Schwarz et al, paragraph 2: “‘Augmented reality’ typically refers to virtual experiences where virtual objects are visually placed within the real world, such that a user experiences virtual content and the real world simultaneously”).

With regards to claim 13, which depends on claim 1, Solomon discloses wherein dismissing the home menu user interface includes ceasing to display… environment in which the home menu user interface is displayed (Solomon, paragraph 80: “One of the ‘direct access’ switch is a top menu switch 30… supports a toggle switching function;” examiner is interpreting this as the user toggling the top menu switch and returning to the previously used application). However, Solomon does not disclose a virtual environment. Schwarz et al teaches a virtual environment (Schwarz et al, abstract: “a mixed-reality display device presents a mixed-reality environment to a user”). It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Schwarz et al such that the device was a head mounted mixed reality device, and that dismissing the menu interface would result in dismissing the environment that includes the menu. This would have enabled the invention to display the menus as virtual objects for the user to see and interact with (Schwarz et al, paragraph 2: “‘Augmented reality’ typically refers to virtual experiences where virtual objects are visually placed within the real world, such that a user experiences virtual content and the real world simultaneously”).

With regards to claim 14, which depends on claim 13, Solomon discloses detecting a sixth input on a representation of a first… environment displayed in the home menu user interface; and in response to detecting the sixth input on the representation of the first… environment displayed in the home menu user interface: replacing any currently displayed… environment with the first… environment (Solomon, paragraph 80: “a top menu switch 30 which can be pressed to trigger (jump to) the display of the main menu of applications, as illustrated in FIG. 1, which is also an application itself;” Fig. 1: the applications are shown with their icons, and the user selecting one with a finger touch). However, Solomon does not disclose a virtual environment. Schwarz et al teaches a virtual environment (Schwarz et al, abstract: “a mixed-reality display device presents a mixed-reality environment to a user”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Schwarz et al such that the device was a head mounted mixed reality device, and that dismissing the menu interface would result in dismissing the environment that includes the menu. This would have enabled the invention to display the menus as virtual objects for the user to see and interact with (Schwarz et al, paragraph 2: “‘Augmented reality’ typically refers to virtual experiences where virtual objects are visually placed within the real world, such that a user experiences virtual content and the real world simultaneously”).

Claim(s) 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Solomon in view of Schwarzhuber (US20080088608A1; filed 10/17/2007).

With regards to claim 5, which depends on claim 4, Solomon does not disclose detecting a rotational input to the hardware button; and in response to detecting the rotational input, performing a second operation different from displaying or dismissing the home menu user interface. However, Schwarzhuber teaches detecting a rotational input to the hardware button; and in response to detecting the rotational input, performing a second operation different from displaying or dismissing the home menu user interface (Schwarzhuber, paragraph 6: “at least one pushbutton which is arranged behind a section of the rotating wheel, and in which case the at least one pushbutton can be depressed by pressing the section of the rotating wheel”). It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Schwarzhuber such that the input included a rotatable input device for performing additional operations. This would have enabled the invention to integrate additional inputs while saving space (Schwarzhuber, paragraph 4: “the individual interface must be designed and arranged to save as much space as possible.” Paragraph 7: “In addition to the space-saving integration of the pushbuttons by means of which cursor control is provided, behind the rotating wheel the control process is also simplified, allowing faster control actions since there is no longer any need, as in the case of the prior art, to switch backwards and forwards between the rotating wheel and the cursor control, since these were physically separate.”).

Claim(s) 7-11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Solomon in view of Schwarzhuber, and further in view of Toprani (US20110289427A1; filed 5/21/2010).

With regards to claim 7, which depends on claim 5, Solomon does not disclose, yet Toprani teaches, prior to detecting the first input to the input device of the one or more input devices, generating and displaying a first user interface object associated with the application user interface; and in response to detecting the first input to the input device: maintaining display of the first user interface object concurrently with dismissing the application user interface (Toprani, abstract: “The persistent overlay can be formed of selected visual information from a first page and can remain viewable over subsequently displayed pages;” Figs. 2-3: the portion of the GUI is made into a persistent overlay 230). It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon, Schwarzhuber, and Toprani such that an object of the user interface could be made persistent even when switching to other user interfaces.
This would have enabled a user to more efficiently manage visual information on a small display (Toprani, paragraph 5: “a method is described for efficiently managing visual information by an electronic device having a display.”).

With regards to claim 8, which depends on claim 7, Solomon does not disclose, yet Toprani teaches, prior to detecting the first input, generating and displaying the first user interface object associated with the application user interface by extracting the first user interface object from the application user interface based on a third input directed to the application user interface (Toprani, abstract: “The persistent overlay can be formed of selected visual information from a first page and can remain viewable over subsequently displayed pages;” Fig. 2: selection tool 228 is shown selecting the object). It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon, Schwarzhuber, and Toprani such that an object of the user interface could be made persistent even when switching to other user interfaces. This would have enabled a user to more efficiently manage visual information on a small display (Toprani, paragraph 5: “a method is described for efficiently managing visual information by an electronic device having a display.”).

With regards to claim 9, which depends on claim 7, Solomon discloses in response to detecting the second input, dismissing both… object and the home menu user interface (Solomon, paragraph 22: “switching, upon an actuation of one of the individual control switch, from a current display arrangement displayed at the time of the switch actuation, to a display mode dedicated to the application corresponding to the actuated switch”). Solomon and Schwarzhuber do not disclose dismissing… the home menu user interface. Toprani teaches dismissing… the home menu user interface (Toprani, paragraph 31: “In some cases, an icon can be removed completely from GUI 108 or a new icon can be added.”). It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon, Schwarzhuber, and Toprani such that the persistent object can be removed freely. This would have enabled a user to more efficiently manage visual information on a small display (Toprani, paragraph 5: “a method is described for efficiently managing visual information by an electronic device having a display.”).

With regards to claim 10, which depends on claim 7, Solomon discloses detecting a fourth input directed to a representation of a second application displayed on the home menu user interface, and in response to detecting the fourth input, displaying an application user interface of the second application (Solomon, paragraph 9: “the user can easily select a new application or switch to another application;” Figs. 1 and 5 both show a user touching the screen to select an application). However, Solomon does not disclose while displaying via the one or more display generation components the home menu user interface and the first user interface object… displaying an application user interface of the second application concurrently with displaying the first user interface object. Toprani teaches while displaying via the one or more display generation components the home menu user interface and the first user interface object, detecting a fourth input directed to a representation of a second application displayed on the home menu user interface, and in response to detecting the fourth input, displaying an application user interface of the second application concurrently with displaying the first user interface object (Toprani, paragraph 17: “FIGS. 4A-4B shows the persistent overlay of FIG. 3 overlaying subsequently presented visual content in accordance with the described embodiments.” Figs. 3-4 show the object over a home menu and a separate application). It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon, Schwarzhuber, and Toprani such that an object of the user interface could be made persistent even when switching to other user interfaces and applications. This would have enabled a user to more efficiently manage visual information on a small display (Toprani, paragraph 5: “a method is described for efficiently managing visual information by an electronic device having a display.”).

With regards to claim 11, which depends on claim 10, Solomon and Schwarzhuber do not disclose detecting a fifth input to move the first user interface object onto the application user interface of the second application; and in response to detecting the fifth input, performing an operation in the second application based on the first user interface object. However, Toprani teaches detecting a fifth input to move the first user interface object onto the application user interface of the second application (Toprani, paragraph 25: “Other features can be used to minimize, drag and drop, expand the persistent overlay, and so on;” paragraph 45: “persistent overlay 230 can be moved about to any appropriate location”); and in response to detecting the fifth input, performing an operation in the second application based on the first user interface object (Toprani, Fig. 4A: persistent overlay is displayed and can be moved while the second application is displayed; paragraph 45: “persistent overlay 230 can be moved about to any appropriate location;” the “operation in the second application” is interpreted as the moving of the persistent overlay because it is performed over the user interface of the second application).
It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon, Schwarzhuber, and Toprani such that an object of the user interface could be made persistent even when switching to other user interfaces and applications. This would have enabled a user to more efficiently manage visual information on a small display (Toprani, paragraph 5: “a method is described for efficiently managing visual information by an electronic device having a display.”). Claim(s) 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Solomon in view of van Os (US20100011304A1; filed 7/9/2008). With regards to claim 16, which depends on claim 1, Solomon does not disclose displaying in the home menu user interface a first representation of a first person, and a second representation of a second person, the first representation and the second representation for initiating communication with the first person and the second person, respectively; detecting an eighth input directed to the first representation of the first person; and in response to detecting the eighth input directed to the first representation of the first person: displaying a communication user interface for initiating a communication session with the first person. However, van Os teaches displaying in the home menu user interface a first representation of a first person, and a second representation of a second person (van Os, abstract: “An icon can be created for a contact (e.g., an individual(s) or an entity) and presented on a user interface of a mobile device, such as a “home screen.”” The icon creation can be repeated multiple times, see Fig. 
4, icons 405 and 410), the first representation and the second representation for initiating communication with the first person and the second person (van Os, abstract: “The icon can be used to retrieve and display contact information”), respectively; detecting an eighth input directed to the first representation of the first person; and in response to detecting the eighth input directed to the first representation of the first person: displaying a communication user interface for initiating a communication session with the first person (van Os, paragraph 38: “The user can use the icon to navigate directly to the address book application residing on the mobile device 100. A contact screen presented by the address book application can show status on SMS messages, phone calls, emails, etc., received from the contact. In some implementations, touching the contacts icon will open a user interface that bundles appropriate services or applications related to the contact.”). It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and van Os such that contact icons could be created and added to the home menu. This would have enabled a user to quickly find and communicate with contacts (van Os, paragraph 3: “While having all contact information in one place can be convenient, quickly finding an often needed contact can sometimes be difficult and inconvenient.” Paragraph 6: “User created icons can allow convenient access to all information and applications related to a contact”). Claim(s) 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Solomon in view of Chaudhri (US20110252373A1; filed 9/22/2010) and Schwarz et al. 
With regards to claim 17, which depends on claim 1, Solomon does not disclose detecting a ninth input directed to a representation of a collection displayed in the home menu user interface; and in response to detecting the ninth input directed to the representation of the collection: displaying representations of one or more virtual three-dimensional environments or one or more augmented reality environments.

Chaudhri teaches detecting a ninth input directed to a representation of a collection displayed in the home menu user interface; and in response to detecting the ninth input directed to the representation of the collection: displaying representations (Chaudhri, paragraph 233: "In some embodiments, the device displays a folder view of a folder associated with a folder icon (e.g., 5004-7) in response to detecting a request to activate a folder icon (e.g., tap gesture 5076 in FIG. 5S);" see Fig. 5S and 5T for corresponding figures; paragraph 234: “the folder view (e.g., 5078 in FIG. 5T) includes the selectable user interface objects (e.g., 5002-4 and 5002-13) that were added to the folder”).

It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Chaudhri such that the user can collapse application icons into folders and expand them to access them again. This would have enabled a user to efficiently access applications (Chaudhri, paragraph 51: “multifunction devices with displays are provided with faster, more efficient methods and interfaces for managing folders, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices.”).

Schwarz et al teaches one or more virtual three-dimensional environments or one or more augmented reality environments (Schwarz et al, abstract: “a mixed-reality display device presents a mixed-reality environment to a user”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon, Chaudhri, and Schwarz et al such that the device was a head mounted mixed reality device, and that dismissing the menu interface would result in dismissing the environment that includes the menu. This would have enabled the invention to display the menus as virtual objects for the user to see and interact with (Schwarz et al, paragraph 2: “‘Augmented reality’ typically refers to virtual experiences where virtual objects are visually placed within the real world, such that a user experiences virtual content and the real world simultaneously”).

Claim(s) 18-19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Solomon in view of Chaudhri.

With regards to claim 18, which depends on claim 1, Solomon does not disclose while displaying the home menu user interface, detecting a tenth input; and in response to detecting the tenth input: scrolling through the home menu user interface based on the tenth input so that first content in at least a portion of the home menu user interface is replaced with second content.

Chaudhri teaches while displaying the home menu user interface, detecting a tenth input; and in response to detecting the tenth input: scrolling through the home menu user interface based on the tenth input so that first content in at least a portion of the home menu user interface is replaced with second content (Chaudhri, paragraph 275: “the arrangement of selectable user interface objects can be scrolled in one or two dimensions;” paragraph 311: “It should be understood that, although the arrangement of selectable user interface objects may be scrolled, paged through, or otherwise translated across the display (e.g., touch screen 112), these operations do not entail any rearrangement of the selectable user interface objects”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Chaudhri such that the user can navigate the home menu of the device via scrolling. This would have enabled a user to efficiently navigate the menu (Chaudhri, paragraph 369: “the device scrolls through the arrangement of selectable user interface objects in the navigation area”).

With regards to claim 19, which depends on claim 1, Solomon does not disclose while displaying the home menu user interface having a first section, detecting an eleventh input; and in response to detecting the eleventh input: displaying a second section of the home menu user interface based on the eleventh input, the first section being different from the second section.

Chaudhri teaches while displaying the home menu user interface having a first section, detecting an eleventh input; and in response to detecting the eleventh input: displaying a second section of the home menu user interface based on the eleventh input, the first section being different from the second section (Chaudhri, paragraph 311: “It should be understood that, although the arrangement of selectable user interface objects may be scrolled, paged through, or otherwise translated across the display (e.g., touch screen 112), these operations do not entail any rearrangement of the selectable user interface objects”).

It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Chaudhri such that the user can navigate the home menu of the device via scrolling. This would have enabled a user to efficiently navigate the menu (Chaudhri, paragraph 369: “the device scrolls through the arrangement of selectable user interface objects in the navigation area”).

Claim(s) 21 is/are rejected under 35 U.S.C. 103 as being unpatentable over Solomon in view of Kaufman et al (US20080046462A1; filed 10/26/2007).
With regards to claim 21, which depends on claim 20, Solomon discloses displaying the first section of the home menu user interface based on the thirteenth input… a display of the home menu user interface (Solomon, paragraph 80: “a top menu switch 30 which can be pressed to trigger (jump to) the display of the main menu of applications, as illustrated in FIG. 1”).

However, Solomon does not disclose in accordance with a determination that a time difference between detecting the twelfth input and detecting the thirteenth input is within a time threshold, displaying the first section of the… menu user interface based on the thirteenth input, and in accordance with a determination that the time difference exceeds the time threshold, resetting a display of the… menu user interface to a predetermined section.

Kaufman et al teaches in accordance with a determination that a time difference between detecting the twelfth input and detecting the thirteenth input is within a time threshold, displaying the first section of the… menu user interface based on the thirteenth input (Kaufman et al, paragraph 13: “It utilizes a hierarchical “context stack” for maintaining (and suspending) the working state of a particular table (comprising selected record, display “mode”, pending form-field entries, … Browse-mode scroll position…) while “drilling down” across relationships to work with related information (in a possibly constrained working context) and returning relevant changes to the parent-context table, and a corresponding UI convention for displaying and navigating this stack”), and in accordance with a determination that the time difference exceeds the time threshold, resetting a display of the… menu user interface to a predetermined section (Kaufman et al, paragraph 170: “SESSION TIMEOUT: Because the system maintains a “user session” in which various context, sequence, and configuration information is tracked, and which… can expire after a (configurable) period of disuse”).
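For readers tracing the claim 21 limitation at issue here, it reduces to a timing conditional: restore the prior menu section if the next input arrives within a threshold, otherwise reset to a default. A minimal sketch follows; the function name, threshold value, and section labels are invented for illustration and appear in neither the application nor Kaufman:

```python
# Hypothetical sketch of the claim 21 timing limitation. All names and
# values below are illustrative, not taken from the application or the
# cited art.

DEFAULT_SECTION = "home"   # the "predetermined section" to reset to
TIME_THRESHOLD = 30.0      # seconds; illustrative threshold

def section_to_display(last_section: str, elapsed: float) -> str:
    """Choose the menu section to show when a new input is detected."""
    if elapsed <= TIME_THRESHOLD:
        # Within the threshold: display based on the prior input's context.
        return last_section
    # Threshold exceeded: reset the display to the predetermined section.
    return DEFAULT_SECTION

print(section_to_display("apps", 12.0))  # apps
print(section_to_display("apps", 45.0))  # home
```

This mirrors the session-timeout rationale the rejection draws from Kaufman: context is retained only while the session is live.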
It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Kaufman et al such that a user’s position in a menu is recorded during a session. This would have enabled the menu navigation to retain its context for the user without consuming excessive system resources (Kaufman et al, paragraph 170: “a “user session” in which various context, sequence, and configuration information is tracked, and which (because it consumes system resources) can expire after a (configurable) period of disuse”).

Claim(s) 22-24 is/are rejected under 35 U.S.C. 103 as being unpatentable over Solomon in view of Darby et al (US20210223949A1; filed 4/5/2021).

With regards to claim 22, which depends on claim 1, Solomon discloses displaying via the one or more display generation components the application user interface comprises displaying a first application user interface of a media content playing application (Solomon, paragraph 57: “applications like… a music player application 56”), and the method includes: detecting the first input to the input device while playing media content using the media content playing application and displaying the first application user interface of the media content playing application (Solomon, paragraph 57: “a music player application 56;” Solomon, paragraph 79: “allows the driver to jump directly to some application display or top menu”); in response to detecting the first input to the input device: displaying the home menu user interface via the one or more display generation components (Solomon, paragraph 80: “One of the ‘direct access’ switch is a top menu switch 30 which can be pressed to trigger (jump to) the display of the main menu of applications”).
However, Solomon does not disclose in response to detecting the first input to the input device:… replacing display of the first application user interface of the media content playing application with a second application user interface of the media content playing application, wherein the second application user interface of the media content playing application is smaller in size than the first application user interface of the media content playing application.

Darby et al teaches in response to detecting the first input to the input device:… replacing display of the first application user interface of the media content playing application with a second application user interface of the media content playing application, wherein the second application user interface of the media content playing application is smaller in size than the first application user interface of the media content playing application (Darby et al, Fig. 2A-2C: shows the second UI of the media player being created from the first media player; Fig. 4A: shows the second, smaller media player being displayed in front of other content).

It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Darby et al such that a media playing application could include a smaller media player overlay for other interface environments. This would have enabled a user to manipulate the audio while viewing other user interfaces (Darby et al, abstract: “continuing to provide playback of the first media item in the relocated media player while content associated with the requested activity is being presented to the user on the screen of the user device”).
With regards to claim 23, which depends on claim 22, Solomon does not disclose, but Darby et al teaches, wherein: replacing display of the first application user interface of the media content playing application with a second application user interface of the media content playing application comprises displaying a media player (Darby et al, Fig. 2A-2C: shows the second UI of the media player being created from the first media player); and the second application user interface includes one or more of: a representation of media content playing on the media content playing application, and playback controls for the media content playing application (Darby et al, paragraph 58: “The mini player 606 is associated with the media player 612 and can cause media items to play, stop, pause, fast forward, rewind, etc. in the media player 612 in the second UI 611.” Fig. 6: controls are shown on mini player 606).

It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Darby et al such that a media playing application could include a smaller media player overlay with controls for other interface environments. This would have enabled a user to manipulate the audio while viewing other user interfaces (Darby et al, abstract: “continuing to provide playback of the first media item in the relocated media player while content associated with the requested activity is being presented to the user on the screen of the user device”).
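The behavior mapped across claims 22 and 23 amounts to a small piece of UI state logic: invoking the home menu swaps the full player for a smaller mini player while playback continues. A hypothetical sketch, with class and attribute names invented for this illustration rather than drawn from Solomon or Darby:

```python
# Hypothetical sketch of the claims 22-23 behavior: a first input shows
# the home menu and replaces the full player UI with a smaller one; a
# second input dismisses the menu while the mini player (and playback)
# persist. Names are illustrative only.

class MediaUIState:
    def __init__(self) -> None:
        self.menu_visible = False
        self.player = "full"    # "full" or "mini" player UI
        self.playing = True     # media playback continues throughout

    def press_menu_button(self) -> None:
        if not self.menu_visible:
            self.menu_visible = True
            self.player = "mini"       # replace full UI with smaller UI
        else:
            self.menu_visible = False  # dismiss menu; mini player remains

state = MediaUIState()
state.press_menu_button()
print(state.menu_visible, state.player)                 # True mini
state.press_menu_button()
print(state.menu_visible, state.player, state.playing)  # False mini True
```

The second toggle is the point the rejection carries forward to claim 24: dismissing the menu does not restore the full player or stop playback.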
With regards to claim 24, which depends on claim 22, Solomon discloses in response to detecting the second input to the input device while displaying the home menu user interface, dismissing the home menu user interface (Solomon, paragraphs 26-27: “toggle access switches, the method comprising the steps: A—switching, upon a first actuation of one of the toggle access switch, from a first display mode, currently displayed at the time of the first toggle access switch actuation, to a second display mode, B—returning to the first display mode upon a second subsequent actuation of the same toggle access switch;” the return to the first display mode upon toggling the button is interpreted as returning from the home menu (top menu) to the previous application).

However, Solomon does not disclose, but Darby et al teaches, and continuing to display the second application user interface of the media content playing application (Darby et al, abstract: “perform an activity that is enabled by a second application and is independent of the viewing of the first media item or the second media item, and continuing to provide playback of the first media item”).

It would have been obvious to a person of ordinary skill in the art before the effective filing date to have combined Solomon and Darby et al such that a media playing application could include a smaller media player overlay for other interface environments. This would have enabled a user to manipulate the audio while viewing other user interfaces (Darby et al, abstract: “continuing to provide playback of the first media item in the relocated media player while content associated with the requested activity is being presented to the user on the screen of the user device”).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRODERICK C ANDERSON whose telephone number is (313)446-6566.
The examiner can normally be reached Monday-Tuesday, Thursday-Saturday 9-5 PST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephen Hong, can be reached at 571-272-4124. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/B.C.A/
Examiner, Art Unit 2178

/STEPHEN S HONG/
Supervisory Patent Examiner, Art Unit 2178

Prosecution Timeline

Sep 18, 2023
Application Filed
Apr 21, 2025
Response after Non-Final Action
Jan 10, 2026
Non-Final Rejection — §102, §103
Feb 26, 2026
Examiner Interview Summary
Feb 26, 2026
Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12572199
METHOD AND APPARATUS FOR GENERATING GROUP EYE MOVEMENT TRAJECTORY, COMPUTING DEVICE, AND STORAGE MEDIUM
2y 5m to grant · Granted Mar 10, 2026
Patent 12564337
RECURRENT NEURAL NETWORK FOR TUMOR MOVEMENT PREDICTION
2y 5m to grant · Granted Mar 03, 2026
Patent 12566821
GENERATIVE SYSTEM FOR WRITING ENTITY RECOMMENDATIONS
2y 5m to grant · Granted Mar 03, 2026
Patent 12561863
CREATING AND MODIFYING CIRCULAR ARCS WHILE MAINTAINING ARC QUALITIES WITHIN A DIGITAL DESIGN DOCUMENT
2y 5m to grant · Granted Feb 24, 2026
Patent 12547888
METHOD, APPARATUS, DEVICE, AND STORAGE MEDIUM FOR TRAINING IMAGE SEMANTIC SEGMENTATION NETWORK
2y 5m to grant · Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
74%
Grant Probability
93%
With Interview (+19.1%)
3y 1m
Median Time to Grant
Low
PTA Risk
Based on 258 resolved cases by this examiner. Grant probability derived from career allow rate.
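The headline projections above can be reproduced from the examiner statistics reported earlier on this page. A minimal sketch of the arithmetic, assuming (as the footnote suggests, though the tool's exact model is not disclosed) that grant probability is simply the career allow rate and the with-interview figure adds the interview lift:

```python
# Reproducing the dashboard's headline figures from the examiner's career
# statistics. Assumption: the tool derives grant probability directly from
# the allow rate and adds the interview lift; its real model is unknown.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage."""
    return 100.0 * granted / resolved

base = allow_rate(190, 258)   # 190 granted of 258 resolved cases
with_interview = base + 19.1  # +19.1% interview lift from the page

print(round(base))            # 74
print(round(with_interview))  # 93
```

These match the 74% and 93% figures shown, which suggests the projections are straight lookups of the career data rather than case-specific modeling.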
