DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on January 15, 2026 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Response to Amendment
This is in response to applicant’s preliminary amendments filed on April 9, 2025, which have been accepted and entered.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claim(s) 1-12, 19-31, 33, 37, and 39-40 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Rudman et al. U.S. Patent Application Publication US 20240119682 A1 (hereinafter Rudman).
Regarding claim 1, Rudman teaches a computer system (System 200, Para. 0072) configured to communicate with a display generation component (Extended Reality Appliance 110, Para. 0064 and 0071) and an input device (Button, Key, Keyboard, Mouse, touchpad, touchscreen, joystick, virtual cursors, gestures, etc., Para. 0065), the
computer system comprising:
one or more processors (Processing Device 360, Para. 0097);
and memory (Memory Interface 310) storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: (Para. 0083)
while displaying, via the display generation component (Extended Reality Appliance 110, Para. 0064 and 0071), extended reality content (Virtual Content and Extended Reality Environment) generated by an application, receiving, via the input device (Button, Key, Keyboard, Mouse, touchpad, touchscreen, joystick, virtual cursor, gestures etc., …, Para. 0065 and 0071), a first input (Key Presses, Tactile Data, Motion Data, Position Data, Gestures Data, Direction Data, etc.…. Para. 0065) that corresponds to a request to invoke an operation of a first type; (A User interacting with the Virtual Content and Extended Reality Environment by pressing a button, performing a gesture, or any input to open a User Interface to modify a setting, Para. 0404)
in response to receiving the first input; (Key Presses, Tactile Data, Motion Data, Position Data, Gestures Data, Direction Data, etc.…. Para. 0065)
applying a modification (Opening a New Layer or Disabling a Layer of Virtual Content Para. 0219, Minimizing or closing a Virtual Application or a User Interface Para. 0139 and 0172, Modifying a Setting of a Layer or Virtual Content Para. 0146 and 0404, Displaying Information Like Warnings/Alerts/Notifications, Para. 0138 and 0140, etc.… ) to the display of the extended reality content (Virtual Content and Extended Reality Environment) of the application;
and displaying, via the display generation component (Extended Reality Appliance 110, Para. 0064 and 0071), a system user interface (GUI, User Interface, Menus, Tabs, etc.… Para. 0266) that includes a first set of one or more user interface objects (Menus, Links, Text Boxes, Objects, Bars, Forms, Images, Icons, etc.….) for performing the operation of the first type (Changing a Setting, Purchasing a Product, Closing/Opening a Virtual Application/Layer/Content, Minimizing Virtual Application/Layer/ Content, etc.…).
Regarding claim 2, Rudman teaches the computer system of claim 1, wherein the modification (Changing a Setting of the Virtual Content Application, Para. 0146) is a visual effect (Changing Brightness, Color, etc.) that is applied to at least a portion of the extended reality content. (Changing just the brightness of an image or changing the color of text, Para. 0146)
Regarding claim 3, Rudman teaches the computer system of claim 2, wherein the modification(Changing the Brightness of the Extended Reality Appliance Display, Para. 0404) is applied to the entirety of the extended reality content that continues to be displayed. (Changing the brightness of the display modifies the entire content being displayed by the extended reality appliance.)
Regarding claim 4, Rudman teaches the computer system of claim 2, wherein a visual characteristic (Brightness levels or Current Color) of the visual effect (Changing Brightness, Color, etc.…) is selected based on a visual characteristic of the extended reality content (Text or Images present in the Virtual Content). (Para. 0146)
Regarding claim 5, Rudman teaches the computer system of claim 1, wherein the modification (Closing or Opening a Layer that is Transparent or Semi-Transparent to be viewed, where each layer can be a different content type, Para. 0219) includes displaying a representation of a physical environment (Content Behind the Layer) along with the extended reality content (Content of a Semi-Transparent Layer) of the application. A layer can contain different content, can be transparent or semi-transparent, and can block other content (Para. 0219). Thus, the layer opened can be semi-transparent, which would allow the user to view the extended reality content and the physical environment behind the content.
Regarding claim 6, Rudman teaches the computer system of claim 1, wherein the operation of the first type is a secure operation (Modifying Secure Information using a Menu or other User Interfaces, Para. 0138 and 0143). Modifying Secure Information can involve Secure Privacy Settings, Encryption, Signatures, and Watermarks.
Regarding claim 7, Rudman teaches the computer system of claim 6, wherein the secure operation (Modifying Secure Information using a Menu or other User Interfaces) includes a payment operation (Exchanging Payment or Purchasing Products). (Para. 0241 and 0249) Payment operations are secure operations as they involve interactions with the user’s bank, credit cards, and other sensitive information.
Regarding claim 8, Rudman teaches the computer system of claim 6, wherein the secure operation (Modifying Secure Information using a Menu or other User Interfaces) includes an operation to configure a setting (Modify Privacy Settings) of the computer system. (Para. 0138 and 0143)
Regarding claim 9, Rudman teaches the computer system of claim 1, wherein the first input (Key Presses, Tactile Data, Motion Data, Position Data, Gestures Data, Direction Data, etc., Para. 0065) is an input directed to the application (Inputs to a User Interface to Invoke or Terminate an Application, Edit Information in the Application, Organize one or more Windows presenting Application Information, etc., Para. 0138).
Regarding claim 10, Rudman teaches the computer system of claim 1, wherein the first input (Key Presses, Tactile Data, Motion Data, Position Data, Gestures Data, Direction Data, etc.…. Para. 0065) is an input processed by an operating system of the computer system. (Para. 0068 and 0138) The operating system can be configured to receive inputs to modify display content, Para. 0068.
Regarding claim 11, Rudman teaches the computer system of claim 1, wherein the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) is displayed over at least a portion of the extended reality content (Virtual Content and Extended Reality Environment) of the application. (Para. 0138, Fig. 10 Floating Menu or Fig. 6A)
Regarding claim 12, Rudman teaches the computer system of claim 1, wherein:
the extended reality content (Virtual Content and Extended Reality Environment) of the application, while displayed without the modification, has a first level of brightness (Default Settings for the Application or Display Region, Para. 0140 and 0142); the extended reality content would have a default brightness at which it displays its content.
and the modification (Changing a Setting of the Virtual Content Application, Para. 0146) includes displaying the extended reality content of the application with a second level of brightness (Changing the Brightness of the Application to be lower than the default brightness) that is lower than the first level of brightness (Default Settings for the Application or Display Region, Para. 0140 and 0142).
Regarding claim 19, Rudman teaches the computer system of claim 1, the one or more programs further including instructions for:
while the modification (Opening a New Layer or Disabling a Layer of virtual content Para. 0219, Minimizing or closing a virtual application or a User Interface Para. 0139 and 0172, Modifying a Setting of a Layer or Virtual Content Para. 0146 and 0404, Displaying Information Like Warnings/Alerts/Notifications, Para. 0138 and 0140, etc.… ) is applied to the extended reality content (Virtual Content and Extended Reality Environment) of the application, detecting dismissal of the system user interface (Turning off a Layer of Content Para. 0219 and 0220 or Minimizing/Closing User Interface Para. 0172-0173 and 0138) and/or completion of the operation of the first type;
and in response to detecting dismissal (Closing/Minimizing Layer/User Interface/Menu/Settings, etc..) of the system user interface (GUI, User Interface, Menus, Tabs, etc.… Para. 0266) and/or completion of the operation of the first type, ceasing to apply the modification(Opening a New Layer or Disabling a Layer of virtual content Para. 0219, Minimizing or closing a virtual application or a User Interface Para. 0139 and 0172, Modifying a Setting of a Layer or Virtual Content Para. 0146 and 0404, Displaying Information Like Warnings/Alerts/Notifications, Para. 0138 and 0140, etc.… ) to the extended reality content (Virtual Content and Extended Reality Environment) of the application. Closing/Minimizing a Layer/User Interface/Menu/Settings stops displaying said Layer/User Interface/Menu/Settings, which is a modification to the current extended reality content.
Regarding claim 20, Rudman teaches the computer system of claim 19, wherein detecting dismissal (Closing/Minimizing a Layer/User Interface/Menu/Settings, etc.) of the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) includes detecting an input (Key Presses, Tactile Data, Motion Data, Position Data, Gestures Data, Direction Data, etc., Para. 0065) corresponding to a selectable user interface object (Elements of the User Interface that close/minimize, Para. 0138) that causes the system user interface to be closed.
Regarding claim 21, Rudman teaches the computer system of claim 19, wherein detecting dismissal (Closing/Minimizing a Layer/User Interface/Menu/Settings, etc.) of the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) includes detecting a gesture (Gesture Enabled User Interfaces, Para. 0241) that causes the system user interface to be closed. Gestures can be used to control the User Interface or to interact with elements of a User Interface. (Para. 0138, 0165, and 0241)
Regarding claim 22, Rudman teaches the computer system of claim 19, wherein detecting dismissal of the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) includes detecting a physical object (Hand/Body Gestures, Para. 0094, 0182, and 0267 or Any Obstructing Physical Object, Para. 0283) impinging on a display area (Interference Region, Para. 0514 or FOV of the User, Para. 0151 or Display Regions, Para. 0135) of the system user interface. A user can interact with the user interface to dismiss the user interface in many different ways, such as performing a gesture for the user interface in the display area (Para. 0138), using their hands to click the user interface elements, which would impinge on the display area (Para. 0157), or the user interface being located in the interference region, which is a region with frequent physical object obstructions (Para. 0514).
Regarding claim 23, Rudman teaches the computer system of claim 22, wherein the physical object(Hand/Body Gestures Para. 0094, 0182, and 0267 or Any Obstructing Physical Object, Para. 0283) is a body part of the user.
Regarding claim 24, Rudman teaches the computer system of claim 22, wherein the physical object(Hand/Body Gestures Para. 0094, 0182, and 0267 or Any Obstructing Physical Object, Para. 0283) is an object within a physical environment of the user.
Regarding claim 25, Rudman teaches the computer system of claim 22, wherein the physical object (Hand/Body Gestures, Para. 0094, 0182, and 0267 or Any Obstructing Physical Object, Para. 0283) impinges on the display area (Interference Region, Para. 0514 or FOV of the User, Para. 0151 or Display Regions, Para. 0135) of the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) due to movement of the user within a physical environment of the user. Extended Reality Appliance 110 (Para. 0064 and 0071) tracks the user’s movement and can utilize the movement as input (Para. 0073); this movement includes hand/body gestures. In addition, the interference region can move or change based on changes in the physical environment, and the user moving is a change in the physical environment (Para. 0526).
Regarding claim 26, Rudman teaches the computer system of claim 1, the one or more programs further including instructions for:
while displaying the extended reality content (Virtual Content and Extended Reality Environment) generated by the application (Application in the First Display Region, Para. 0135), receiving a third input (Key Presses, Tactile Data, Motion Data, Position Data, Gestures Data, Direction Data, etc., Para. 0065) that corresponds to a request to perform an operation of a second type (Opening an Application in a Second Display Region, Para. 0147), different from the first type (Purchasing a Product, Para. 0233). The first type and second type operations can be any of the following operations and are not limited to those listed (Changing a Setting, Purchasing a Product, Closing/Opening a Virtual Application/Layer/Content, Minimizing a Virtual Application/Layer/Content, etc.); the Extended Reality Application can handle a variety of operations.
and displaying a third user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) corresponding to the operation of the second type (Opening an Application in a Second Display Region, Para. 0147) without applying the modification to the extended reality content (Virtual Content and Extended Reality Environment) of the application (Application in the First Display Region, Para. 0135). The modification is not applied to the first display region’s extended reality content, but to the second display region’s extended reality content, as a second application is being opened in the second display region.
Regarding claim 27, Rudman teaches the computer system of claim 26, wherein the third input(Key Presses, Tactile Data, Motion Data, Position Data, Gestures Data, Direction Data, etc.…. Para. 0065) is an input processed by an operating system of the computer system. (Para. 0068 and 0138) The operating system can be configured to receive inputs to modify display content, Para. 0068.
Regarding claim 28, Rudman teaches the computer system of claim 26, wherein the extended reality content (Virtual Content and Extended Reality Environment) is actively displayed content (Real-time Content, Para. 0068) and the third user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) is displayed over at least a portion of the actively displayed extended reality content (Overlay Virtual/Physical Content, Para. 0068, 0155, and 0156) of the application (Application in the First Display Region or Second Display Region). (Fig. 6A)
Regarding claim 29, Rudman teaches the computer system of claim 1, wherein at least a portion of the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) is reactive (Change in Reaction to Actions of Users, Para. 0068 and How the User Interface is Graphically Presented in Response to the User’s Interactions, Para. 0139) to one or more interactive inputs (Eye Gestures or Eye Motion, Para. 0241 and 0267).
Regarding claim 30, Rudman teaches the computer system of claim 29, wherein the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) is reactive by displaying a first graphical effect (Modifying User Interface Elements, Settings, or Characteristics, Para. 0139) in response to receiving the one or more interactive inputs (Eye Gestures or Eye Motion, Para. 0241 and 0267). Modifying User Interface Elements, Settings, or Characteristics includes graphical effects.
Regarding claim 31, Rudman teaches the computer system of claim 29, wherein the one or more interactive inputs (Hand Gestures, Para. 0071 and 0086) includes a hand-based interactive input.
Regarding claim 33, Rudman teaches the computer system of claim 1, wherein the operation of the first type(Changing a Setting, purchasing a Product, Closing/Opening a Virtual Application/Layer/Content, Minimizing Virtual Application/Layer/ Content, etc.…) is an operation that requires providing biometric authentication (Biometric Sensor or Biometric Token, Para. 0103 and 0281).
Regarding claim 37, Rudman teaches the computer system of claim 1, wherein the operation of the first type(Changing a Setting, purchasing a Product, Closing/Opening a Virtual Application/Layer/Content, Minimizing Virtual Application/Layer/ Content, etc.…) is an operation that requires providing a hardware input (Button, Key, Computer Mouse, Touchpad, Joystick, etc.…., Para. 0065).
Regarding claim 39, Rudman teaches a non-transitory computer-readable storage medium (Para. 0083) storing one or more programs configured to be executed by one or more processors (Processing Device 360, Para. 0097) of a computer system that is in communication with a display generation component (Extended Reality Appliance 110, Para. 0064 and 0071) and an input device (Button, Key, Keyboard, Mouse, touchpad, touchscreen, joystick, virtual cursors, gestures, etc., Para. 0065), the one or more programs including instructions for performing the functions of the computer system of claim 1; therefore, it is rejected under the same rationale as claim 1.
Regarding claim 40, it recites similar limitations as claims 1 and 39; therefore, it is rejected under the same rationale as claims 1 and 39.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 13-18 are rejected under 35 U.S.C. 103 as being unpatentable over Rudman et al. U.S. Patent Application Publication US 20240119682 A1 (hereinafter Rudman) in view of Fein et al. U.S. Patent Application Publication US 20140267410 A1(hereinafter Fein).
Regarding claim 13, Rudman teaches the computer system of claim 1, wherein:
the extended reality content (Virtual Content and Extended Reality Environment) is actively displayed content; (Para. 0061) The extended reality content can contain a live view of the physical environment and virtual content displayed in real time.
However, Rudman fails to teach wherein the modification includes suspending display of the extended reality content of the application.
Rudman and Fein are analogous to the claimed invention because both of them are in the same field of displaying AR/VR content.
Fein teaches the computer system of claim 1, wherein:
the extended reality content (AR Scene or Live Scene, Para. 0110 and 0138) is actively displayed content; (Augmented Reality content is content that has incorporated virtual content into a physical environment. Displaying the physical environment with virtual content is displaying a live scene.)
and the modification includes suspending (Pausing Scene, Para. 0112-0113) display of the extended reality content(AR Scene or Live Scene, Para. 0110 and 0138) of the application. (Para. 0106, 0109, and 0112)
Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rudman’s XR Content to incorporate Fein’s Ability to Pause Content. Since doing so would provide the benefit of pausing content to allow users to interact with the content that would otherwise be difficult to interact with. (Fein et al. , Para. 0112)
Regarding claim 14, Rudman fails to teach the computer system of claim 13, wherein suspending display of the extended reality content of the application includes ceasing to update the displayed extended reality content of the application. However, Fein teaches that suspending (Pausing Scene, Para. 0112-0113) display of the extended reality content (AR Scene or Live Scene, Para. 0110 and 0138) of the application includes ceasing to update the displayed extended reality content of the application; pausing the AR Scene prevents the scene from updating while it is paused. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rudman’s XR Content to incorporate Fein’s ability to pause content, since doing so would provide the benefit of pausing content to allow users to interact with content that would otherwise be difficult to interact with. (Fein, Para. 0112)
Regarding claim 15, Rudman fails to teach the computer system of claim 13, wherein suspending display of the extended reality content of the application includes suspending the ability of the application to update other currently displayed content. However, Fein teaches that suspending (Pausing Scene, Para. 0112-0113) display of the extended reality content (AR Scene or Live Scene, Para. 0110 and 0138) of the application includes suspending the ability of the application to update other currently displayed content (Other Scenes or Other Aspects of the Scene, Para. 0107). The scene or portions of the scene, such as items or specific elements, can be paused (Para. 0107); if the whole scene is paused, then all currently displayed content is paused. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rudman’s XR Content to incorporate Fein’s ability to pause content, since doing so would provide the benefit of pausing content to allow users to interact with content that would otherwise be difficult to interact with. (Fein, Para. 0112)
Regarding claim 16, Rudman teaches the computer system of claim 13, the one or more programs further including instructions for:
while actively displaying, via the display generation component (Extended Reality Appliance 110, Para. 0064 and 0071), second extended reality content (Content in the Second Display Region, Para. 0135 or Different Frames/Windows in Same Display Region, Para. 0142) generated by a second application (Different Content, Para. 0147), receiving, via the input device (Button, Key, Keyboard, Mouse, touchpad, touchscreen, joystick, virtual cursers, gestures, etc.…., Para. 0065), a second input corresponding to a request to invoke a second operation of the first type(Changing a Setting, purchasing a Product, Closing/Opening a Virtual Application/Layer/Content, Minimizing Virtual Application/Layer/ Content, etc.…);
in response to receiving the second input(Key Presses, Tactile Data, Motion Data, Position Data, Gestures Data, Direction Data, etc.…. Para. 0065):
displaying a second system user interface (GUI, User Interface, Menus, Tabs, etc.… Para. 0266) that includes a second set of one or more user interface objects (Menus, Links, Text Boxes, Objects, Bars, Forms, Images, Icons, etc.….) for performing the second operation of the first type(Changing a Setting, purchasing a Product, Closing/Opening a Virtual Application/Layer/Content, Minimizing Virtual Application/Layer/ Content, etc.…);
in accordance with a determination that the second extended reality content (Content in the Second Display Region, Para. 0135 or Different Frames/Windows in Same Display Region, Para. 0142) generated by the second application meets a set of one or more immersive content criteria (Criterion Associating Virtual Content with a Location, Para. 0203 or Guidelines/Criteria for Displaying Content, Para. 0210); Rudman either moves content to another region, minimizes it, or blocks display of certain content (Para. 0210);
and in accordance with a determination that the second extended reality content(Content in the Second Display Region, Para. 0135 or Different Frames/Windows in Same Display Region, Para. 0142) generated by the second application does not meet the set of one or more immersive content criteria (Criterion Associating Virtual Content with a Location Para. 0203 or Guidelines/Criteria for Displaying Content, Para. 0210),
However, Rudman fails to teach:
in accordance with a determination that the second extended reality content generated by the second application meets a set of one or more immersive content criteria,
and in accordance with a determination that the second extended reality content generated by the second application does not meet the set of one or more immersive content criteria,
Fein teaches:
in accordance with a determination that the second extended reality content(AR Scene or Live Scene, Para. 0110 and 0138) generated by the second application meets a set of one or more immersive content criteria (Detecting an Item/Aspect/Element has left the scene or about to leave, or the activity to be inaccessible or difficult to access, Para. 0107), suspending display(Pausing Scene, Para. 0112-0113) of the second extended reality content of the second application; If an object of interest or an activity would be difficult to interact with, the application will pause the Scene, Para. 0106 and 0109.
and in accordance with a determination that the second extended reality content(AR Scene or Live Scene, Para. 0110 and 0138) generated by the second application does not meet the set of one or more immersive content criteria (Detecting an Item/Aspect/Element has left the scene or about to leave, or the activity to be inaccessible or difficult to access, Para. 0107), continuing to actively display the second extended reality content of the second application. If no object is detected or an activity does not become difficult to access the scene will not be paused and will be displayed as normal.
Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rudman’s XR Content to incorporate Fein’s ability to pause content based on criteria, since doing so would provide the benefit of pausing content to allow users to interact with content that would otherwise be difficult to interact with. (Fein, Para. 0112)
Regarding claim 17, Rudman teaches the computer system of claim 16, wherein the set of one or more immersive content criteria (Criterion Associating Virtual Content with a Location, Para. 0203 or Guidelines/Criteria for Displaying Content, Para. 0210) includes a criterion that is met when the second application is configured to present content (Displaying View-Through Content, Para. 0463) over a threshold amount (Threshold Distance, Para. 0354 and 0463) of an extended reality environment that is currently visible from a viewpoint of a user of the computer system. Rudman teaches displaying the view-through of a physical screen near a user based on a threshold, viewpoint, and distance (Para. 0463).
Regarding claim 18, Rudman fails to teach the computer system of claim 13, the one or more programs further including instructions for:
while the extended reality content of the application is suspended detecting dismissal of the system user interface and/or completion of the operation of the first type;
and in response to detecting dismissal of the system user interface and/or completion of the operation of the first type, unsuspending display of the extended reality content of the application.
However, Fein teaches the computer system of claim 13, the one or more programs further including instructions for:
while the extended reality content (AR Scene or Live Scene, Para. 0110 and 0138) of the application is suspended (Pausing Scene, Para. 0112-0113), detecting dismissal (User Input, Para. 0108) of the system user interface and/or completion of the operation of the first type (Task/Activity, Para. 0110 and 0112);
and in response to detecting dismissal of the system user interface and/or completion of the operation of the first type (Task/Activity, Para. 0110 and 0112), unsuspending display of the extended reality content (AR Scene or Live Scene, Para. 0110 and 0138) of the application. (Restoring presentation of the Scene, Para. 0108)
Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rudman’s XR Content to incorporate Fein’s ability to pause/unpause content based on criteria, since doing so would provide the benefit of pausing content to allow users to interact with content that would otherwise be difficult to interact with. (Fein, Para. 0112)
Claim 32 is rejected under 35 U.S.C. 103 as being unpatentable over Rudman et al. U.S. Patent Application Publication US 20240119682 A1 (hereinafter Rudman) in view of Huang et al. U.S. Patent Application Publication US 20200320794 A1 (hereinafter Huang).
Regarding claim 32, Rudman teaches the computer system of claim 1, wherein the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266) exhibits follow behavior in response to the system user interface being moved and/or in response to a viewpoint of the user shifting. A Virtual Display follows the movement of input unit 202 (Para. 0101), where input unit 202 receives input from the user (Para. 0073), such as a user dragging a user interface using an electronic mouse (Para. 0185) or a virtual screen following the gaze of the user (Para. 0151).
However, Rudman fails to explicitly teach that the system user interface exhibits lazy follow behavior.
Huang teaches the computer system of claim 1, wherein the system user interface (Virtual Applications, Para. 0147) exhibits lazy follow behavior (follow/lazy headlock or follow based on external sensor, Para. 0176) in response to the system user interface (Virtual Applications, Para. 0147) being moved and/or in response to a viewpoint of the user shifting. Application Content follows the user and remains positionally consistent with the user (Para. 0159). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rudman’s follow behavior to incorporate Huang’s lazy follow behavior, since doing so would provide the benefit of utilizing a common feature in developing XR/MR/AR/VR environments, as lazy follow behavior enhances user interactions with XR applications.
Claims 34-36 and 38 are rejected under 35 U.S.C. 103 as being unpatentable over Rudman et al., U.S. Patent Application Publication US 20240119682 A1 (hereinafter Rudman).
Regarding claim 34, Rudman teaches the computer system of claim 33, the one or more programs further including instructions for:
while displaying the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266), receiving biometric data (Biometric Data from Biometric Sensor, Para. 0103);
in response to receiving the biometric data (Biometric Data from Biometric Sensor, Para. 0103):
in accordance with a determination (Identifying User 100, Para. 0103) that the biometric data corresponds to an authorized biometric profile (User Profile, Para. 0209), performing the operation of the first type (Presenting Virtual Content that contains Private Information, Para. 0103);
Rudman fails to explicitly teach: in accordance with a determination that the biometric data does not correspond to an authorized biometric profile, forgoing performing the operation of the first type.
However, Rudman teaches identifying a user with biometric sensor data (Para. 0103), utilizing a user profile associated uniquely with the user for lookups (Para. 0209), and presenting private information to the user only after identifying the user with biometric sensor data (Para. 0103). Because the private information is presented only when the user is identified, the private information would not be shown if the user is not identified. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rudman's XR Environment to incorporate Rudman's own teaching of not performing an operation when the user fails to be identified through biometrics, because doing so would provide the benefit of preventing private information from being shown to individuals for whom it is not intended (Rudman et al., Para. 0293).
Regarding claim 35, Rudman teaches the computer system of claim 34, the one or more programs further including instructions for:
in response to receiving the biometric data (Biometric Data from Biometric Sensor, Para. 0103) and in accordance with a determination (Identifying User 100, Para. 0103) that the biometric data corresponds to an authorized biometric profile (User Profile, Para. 0209), displaying, via the display generation component (Extended Reality Appliance 110, Para. 0064 and 0071), an indication that authentication was successful (Displaying the Private Information, Para. 0103). Displaying the private information would be an indication that authentication was successful, as it is shown only when the user is identified with biometric sensor data.
Regarding claim 36, Rudman fails to explicitly teach the computer system of claim 34, the one or more programs further including instructions for: in accordance with a determination that the biometric data does not correspond to an authorized biometric profile, displaying an indication that authentication was unsuccessful.
However, Rudman teaches identifying a user with biometric sensor data (Para. 0103), utilizing a user profile associated uniquely with the user for lookups (Para. 0209), and presenting private information to the user only after identifying the user with biometric sensor data (Para. 0103). Because the private information is presented only when the user is identified, it would not be shown if the user is not identified, and not showing the private information a user requests would be an indication that the identification failed. Utilizing a user interface to display an indication of the failure of some operation is a common design choice in user interface design and is widespread across technology. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rudman's XR Environment to incorporate Rudman's own teaching of not performing an operation when the user fails to be identified through biometrics, because doing so would provide the benefit of preventing private information from being shown to individuals for whom it is not intended (Rudman et al., Para. 0293).
Regarding claim 38, Rudman teaches the computer system of claim 37, the one or more programs further including instructions for:
while displaying the system user interface (GUI, User Interface, Menus, Tabs, etc., Para. 0266), receiving a respective input (Key Presses, Tactile Data, Motion Data, Position Data, Gesture Data, Direction Data, etc., Para. 0065);
in response to receiving the respective input (Key Presses, Tactile Data, Motion Data, Position Data, Gesture Data, Direction Data, etc., Para. 0065):
in accordance with a determination that the respective input corresponds to a hardware input (Button, Key, Keyboard, Touchpad, Touchscreen, Joystick, etc., Para. 0065) of a first type, performing the operation of the first type (Changing a Setting, Purchasing a Product, Closing/Opening a Virtual Application/Layer/Content, Minimizing a Virtual Application/Layer/Content, etc.);
Rudman fails to explicitly teach: in accordance with a determination that the respective input does not correspond to a hardware input of the first type, forgoing performing the operation of the first type.
However, Rudman teaches an input determination module 312 (Para. 0085 and 0086). The input determination module 312 can evaluate the input signals and determine the type of input (hardware or software), and the virtual content can be modified based on these input signals (Para. 0067), such as by requiring a specific signal to be received before performing a certain operation. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rudman's XR Environment to incorporate Rudman's own teaching of an input determination module to forgo operations when the input is not from hardware, because doing so would provide the benefit of utilizing different forms of input for specific interactions and ensuring that certain operations require specific input devices.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRIANNA R COCHRAN whose telephone number is (571)272-4671. The examiner can normally be reached Mon-Fri. 7:30am - 5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington can be reached at (571) 272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BRIANNA RENAE COCHRAN/ Examiner, Art Unit 2615
/ALICIA M HARRINGTON/ Supervisory Patent Examiner, Art Unit 2615