Prosecution Insights
Last updated: April 19, 2026
Application No. 19/206,298

Devices, Methods, And Graphical User Interfaces For Interacting With System User Interfaces Within Three-Dimensional Environments

Non-Final OA: §101, §103
Filed: May 13, 2025
Examiner: ONYEKABA, AMY
Art Unit: 2628
Tech Center: 2600 — Communications
Assignee: Apple Inc.
OA Round: 1 (Non-Final)
Grant Probability: 84% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 1m
With Interview: 90%

Examiner Intelligence

Career Allow Rate: 84%, above average (405 granted / 482 resolved; +22.0% vs TC avg)
Interview Lift: +6.3% among resolved cases with interview (moderate lift)
Avg Prosecution: 2y 1m (fast prosecutor); 11 applications currently pending
Career history: 493 total applications across all art units
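The headline rate above can be reproduced from the raw counts with simple arithmetic. A minimal sketch, assuming the dashboard simply divides grants by resolved cases (variable names are illustrative, not the tool's actual code):

```python
# Reproduce the examiner's career allow rate from the counts shown above.
granted = 405    # applications allowed among resolved cases
resolved = 482   # all resolved cases

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")           # 84.0%

# "+22.0% vs TC avg" implies a Tech Center average near:
print(f"Implied TC average: {allow_rate - 0.220:.1%}")  # 62.0%
```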

Statute-Specific Performance

§101: 0.9% (-39.1% vs TC avg)
§103: 53.9% (+13.9% vs TC avg)
§102: 23.2% (-16.8% vs TC avg)
§112: 15.5% (-24.5% vs TC avg)
Tech Center averages are estimates • Based on career data from 482 resolved cases

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA. In the response to this Office action, the Office respectfully requests that support be shown for language added to any original claims on amendment and for any new claims. That is, indicate support for newly added claim language by specifically pointing to the page(s) and line numbers in the specification and/or drawing figure(s). This will assist the Office in prosecuting this application.

The Office has cited particular figures, elements, paragraphs, and/or columns and line numbers in the references as applied to the claims for the applicant's convenience. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. In preparing responses, the applicant is respectfully requested to fully consider each cited reference in its entirety as potentially teaching all or part of the claimed invention, as well as the context of the passages cited by the Office.

Disposition of the Claims

2. The instant application was effectively filed on May 17, 2024. Claims 1-126 are cancelled, and claims 127-142 are currently pending.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 142 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent-eligible subject matter: claim 142 recites "a computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with one or more display generation components and one or more input devices," and the specification does not explicitly state that the computer-readable storage medium excludes signals per se. Moreover, the specification recites "Executable instructions for performing these functions are, optionally, included in a transitory and/or non-transitory computer readable storage medium or other computer program product configured for execution by one or more processors" (see para. [0006]). Because the broadest reasonable interpretation (BRI) thus encompasses transitory forms of signal transmission, claim 142 is considered non-statutory. The Examiner suggests adding the term "non-transitory" to the claim.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of pre-AIA 35 U.S.C. 103(a), which forms the basis for all obviousness rejections set forth in this Office action: (a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains.
Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under pre-AIA 35 U.S.C. 103(a) are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 127-128, 134-135, 137, 141, and 142 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Lindmeier et al., US PG-PUB 20220084279 A1 (hereinafter Lind).

Regarding claim 127, Lind teaches a method comprising: at a computer system that is in communication with one or more display generation components and one or more input devices: while a view of an environment is visible via the one or more display generation components, and while the view of the environment includes a respective object that moves as a hand of a user moves (Figs. 10C and 11C, gesture G; Para. [0325]: Gesture G is predetermined to correspond to a request to bring cylinder 1106 to hand 1110; Para. [0329]: Gesture G is a double pinch gesture performed by hand 1110 or a double tap gesture on a stylus held by hand 1110. In some embodiments, cylinder 1106 is moved to within a threshold distance (e.g., 1 inch, 3 inches, 6 inches, 1 foot, etc.) of the representation of hand 1110 such that a pinch gesture by hand 1110 is interpreted as a selection of cylinder 1106 (e.g., without requiring hand 1110 to move towards cylinder 1106 to select cylinder 1106). For example, a user is able to perform a direct manipulation operation on cylinder 1106 without moving the position of hand 1110, e.g., a movement, rotation, or resizing operation, optionally in a manner similar to those described herein with respect to methods 1000 and 1400); detecting, via the one or more input devices, a respective input; and in response to detecting the respective input: in accordance with a determination that first criteria are met, wherein the first criteria include a requirement that the hand of the user is holding a controller (Para. [0354]: the user input from the hand includes interaction with an input device, separate from the electronic device, such as a stylus held by hand 1110 (e.g., an input device, external to the electronic device, that is in communication with the electronic device, such as a wireless mouse, wireless keyboard, remote control device, another mobile device, a handheld device, a stylus, a pointing tool, a controller, etc.)), displaying, via the one or more display generation components, a first user interface object at a first location relative to the respective object (Para. [0354]: the electronic device recognizes the input device being held by the user and/or the hand of the user and displays the contextual menu at a location that is at or near the input device).

Although Lind does not anticipate "in accordance with a determination that the first criteria are not met, forgoing displaying the first user interface object at the first location," based on its teachings (Para. [0354]: the electronic device recognizes the input device being held by the user and/or the hand of the user and displays the contextual menu), Lind also accomplishes forgoing displaying the first user interface object at the first location in accordance with a determination that the first criteria are not met (Para. [0345]; i.e., in the event the input device (stylus) is not detected to be held by the user, the contextual menu is not displayed). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to forgo displaying the first user interface object at the first location based on a determination that the first criteria are not met, in order to enhance the user experience by providing a more efficient and effective way to communicate with the user interface.

Regarding claim 128, Lind teaches the method of claim 127 and further teaches: in response to detecting the respective input: in accordance with a determination that second criteria are met, wherein the second criteria include a requirement that the hand of the user is not holding a controller, displaying the first user interface object at a second location relative to the respective object, wherein the second location is different from the first location (Para. [0343]: in response to detecting the user input (1212), in accordance with a determination that the user input satisfies one or more second criteria, different from the one or more criteria, wherein the one or more second criteria include a criterion that is satisfied when the user input from the hand is a single tap gesture (e.g., a single short tap that is not followed by a second short tap within the threshold amount of time, optionally detected via the hand tracking device), the electronic device displays (1214), via the display generation component, a contextual menu associated with the first object at the first location, such as the display of contextual menu 1112 in response to hand 1110 performing Gesture F in FIG. 11B (e.g., without moving the first object from the first location to the respective location in the three-dimensional environment)).
Regarding claim 134, Lind teaches the method of claim 127 and further teaches wherein the one or more input devices include the controller, and the method includes: while displaying the first user interface object at the first location relative to the respective object, detecting, via the controller, an input; and in response to detecting the input, performing an operation associated with the first user interface object (Para. [0326]-[0328]: contextual menu 1112 includes one or more selectable options (e.g., selectable options 1114-1 to 1114-3) that are selectable to perform one or more operations associated with cylinder 1106. For example, contextual menu 1112 includes one or more options for replacing cylinder 1106 with a three-dimensional object, such as described above with respect to method 800. In some embodiments, contextual menu 1112 includes an option to copy cylinder 1106 into a clipboard, delete cylinder 1106, and/or duplicate cylinder 1106).

Regarding claim 135, Lind teaches the method of claim 134 and further teaches wherein the controller includes a button, and detecting the input via the controller includes detecting a press of the button (Para. [0484]-[0485] and Fig. 15A: a selection input (e.g., the press of a button, a respective gesture, etc.) while the pointing device is pointed at a portion of the first physical object).

Regarding claim 137, Lind teaches the method of claim 127 and further teaches wherein detecting the respective input includes detecting, via the one or more input devices, that attention of the user is directed toward a location of the hand of the user (Figs. 7A, 7D and Para. [0141]: in some embodiments, while displaying the one or more selectable options, the electronic device detects (808) selection of a respective selectable option of the one or more selectable options, such as detecting a selection input performed by hand 716 (e.g., "Gesture A") while gaze 710-2 is directed to a selectable option in FIG. 7A (e.g., a selection input on a respective selectable option). In some embodiments, the selection input is received via an input device. In some embodiments, the selection input includes a focus and an actuation. For example, selecting the respective selectable option includes detecting that a gaze of the user is directed at (e.g., looking at) the respective selectable option when the actuation is received (e.g., a click of a button, a tap on a touch-sensitive surface, etc.)).

Regarding claim 141, Lind teaches a computer system that is in communication with one or more display generation components and one or more input devices (Para. [0006]), the computer system comprising: one or more processors; and memory storing one or more programs configured to be executed by the one or more processors (Para. [0006]: the computer system has a graphical user interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing multiple functions), the one or more programs including instructions for performing the limitations recited in claim 127. Lind is applied to those limitations in the same manner, and with the same citations, as set forth above for claim 127 (Figs. 10C and 11C; Paras. [0325], [0329], [0354], and [0345]), and the same rationale for forgoing display of the first user interface object when the first criteria are not met applies.

Regarding claim 142, Lind teaches a computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with one or more display generation components and one or more input devices (Para. [0006]: a non-transitory computer readable storage medium or other computer program product configured for execution by one or more processors), the one or more programs including instructions for performing the limitations recited in claim 127. Lind is again applied in the same manner, and with the same citations, as set forth above for claim 127, and the same obviousness rationale applies.

Claims 130-132 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Lindmeier in view of Broughton et al., US PG-PUB 20240152245 A1 (hereinafter Broughton).

Regarding claim 130, Lind teaches the method of claim 127. Although Lind teaches wherein the first user interface object includes an affordance that provides access to a "contextual menu" user interface, Lind fails to explicitly disclose that the menu user interface is a home menu user interface. However, in the same field of endeavor, Broughton teaches a user interface object that includes an affordance providing access to a home menu user interface (Paras. [0399], [0422] and FIGS. 7BS-7BT: illustrating placement of the home menu user interface in the three-dimensional environment, where the home menu user interface is displayed in accordance with closing a respective user interface (e.g., in response to a user input directed to a close affordance associated with the respective user interface), based on respective locations of the respective user interface and one or more other user interfaces, all of which were closed within the first time threshold of the closing of the respective user interface). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Lind with those of Broughton in order to enable the user to continue interacting with one or more other application user interfaces without displaying additional controls (Broughton, Para. [0687]).
Regarding claim 131, Lind teaches the method of claim 127. Although Lind teaches wherein the first user interface object includes an affordance that provides access to a "contextual menu," Lind fails to explicitly disclose that the first user interface object provides access to status information. However, in the same field of endeavor, Broughton teaches a user interface wherein the first user interface object includes an affordance that provides access to status information (Paras. [0084], [0577]: status information). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Lind with those of Broughton in order to view information about a current state of the computer system (Broughton, Para. [0577]).

Regarding claim 132, Lind as modified by Broughton teaches the method of claim 127; Broughton further teaches wherein the first user interface object includes a status user interface that includes status information for the computer system (Para. [0577]). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Lind with those of Broughton in order to view information about a current state of the computer system (Broughton, Para. [0577]).

Claim 133 is rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Lindmeier as modified by Broughton, and further in view of Lemay et al., US PG-PUB 20210191600 A1 (hereinafter Lemay).

Regarding claim 133, Lind as modified by Broughton teaches the method of claim 132; Broughton further teaches related hand inputs (Paras. [0190], [0355]: hand facing away from the user and hand moved toward a region in front of the user), but Lind as modified by Broughton fails to further disclose wherein detecting the respective input includes detecting a change in orientation of the hand from a first orientation with a palm of the hand facing toward a viewpoint of the user to a second orientation with the palm of the hand facing away from the viewpoint of the user. However, in the same field of endeavor, Lemay teaches this limitation (Para. [0233]: in response to detecting the movement of the palm relative to the viewpoint corresponding to the view of the three-dimensional environment: in accordance with a determination that the movement of the palm includes a rotation (e.g., a change in orientation, such as rotating around an axis that points toward the viewpoint corresponding to the three-dimensional environment, or rotating toward or away from that viewpoint) of the palm relative to the viewpoint corresponding to the view of the three-dimensional environment). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Lind as modified by Broughton with those of Lemay in order to simplify the user interface by providing a more efficient and user-friendly way to interact with the system.

Claim 136 is rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Lindmeier in view of Lemay et al., US PG-PUB 20210191600 A1 (hereinafter Lemay).
Regarding claim 136, Lind teaches the method of claim 135 and further teaches wherein: detecting the input via the controller includes detecting movement of the controller while detecting the press of the button (Para. [0484]-[0485] and Fig. 15A: a user input directed to the first physical object, such as hand 1508 performing Gesture I while directed to a portion of table 1506 in FIG. 15A (e.g., the user input is received from a pointing device such as a stylus, pointer stick, or a hand of the user (e.g., a user pointing at the first physical object), etc.); the user input directed to the first physical object includes a selection input (e.g., the press of a button, a respective gesture, etc.) while the pointing device is pointed at a portion of the first physical object); and performing the operation associated with the first user interface object in response to detecting the input in accordance with the movement of the controller (Para. [0326]-[0328]: contextual menu 1112 includes one or more selectable options (e.g., selectable options 1114-1 to 1114-3) that are selectable to perform one or more operations associated with cylinder 1106, such as options for replacing cylinder 1106 with a three-dimensional object as described above with respect to method 800, or options to copy cylinder 1106 into a clipboard, delete cylinder 1106, and/or duplicate cylinder 1106).

Lind fails to explicitly disclose changing a respective volume level of the computer system. However, in the same field of endeavor, Lemay teaches that the operation associated with the first user interface object includes changing a respective volume level of the computer system in accordance with the movement (Paras. [0267], [0119]: the user interface objects in the first view 7036 of the user interface of the first application respond to gesture inputs (e.g., tap input, swipe input, press input, etc.) by another hand (e.g., the hand 7022) directed to locations on the palm of the hand 7020 that correspond to display positions of the different user interface objects (e.g., the user interface objects 7038 and 7040). In some embodiments, a magnitude of the operation that is performed in response to activation of the user interface object in the first view 7036 of the user interface (e.g., a magnitude for dragging a slider control, increasing volume, or drawing a line, moving an object, etc.) is based on a magnitude of the gesture input performed by the other hand (e.g., the hand 7022)). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Lind with those of Lemay in order to simplify the user interface by providing a more efficient and user-friendly way to interact with the system.

Allowable Subject Matter

Claims 129 and 138-140 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

3. Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMY ONYEKABA, whose telephone number is (571) 270-7633. The examiner can normally be reached 9-5. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, the applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, NITIN K PATEL, can be reached at 571-272-7677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AMY ONYEKABA/
Primary Examiner, Art Unit 2628

Prosecution Timeline

May 13, 2025
Application Filed
Feb 18, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596459: TOUCH CONTROL SCREEN AND TOUCH CONTROL DEVICE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12598890: DISPLAY DEVICE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12596461: ELECTRONIC DEVICE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12591335: SYSTEM AND METHOD FOR DIFFERENTIAL PARALLEL TOUCH SENSING (granted Mar 31, 2026; 2y 5m to grant)
Patent 12585362: IDENTIFYING AN OBJECT USING CAPACITIVE SENSING OF PREDETERMINED SPATIAL PATTERNS OF DETECTABLE ELEMENTS (granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 84%
With Interview: 90% (+6.3%)
Median Time to Grant: 2y 1m
PTA Risk: Low
Based on 482 resolved cases by this examiner. Grant probability derived from career allow rate.
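The note above says the grant probability is derived from the career allow rate. A minimal sketch of the arithmetic, assuming the interview adjustment is a simple additive lift in percentage points (an assumption about this tool's model, not its actual code):

```python
# Illustrative only: one plausible way the "With Interview" projection
# is derived from the figures shown above.
baseline_grant_prob = 0.84   # career allow rate (405 / 482 resolved)
interview_lift = 0.063       # +6.3 points among resolved cases with interview

# Capped so a large lift can never push the probability past 100%.
with_interview = min(baseline_grant_prob + interview_lift, 1.0)
print(f"With interview: {with_interview:.0%}")  # 90%
```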
