DETAILED ACTION
This Office Action is sent in response to Applicant's Response received 03/05/2026 for Application No. 18/587,815. Claims 1, 3-9, 11-17, and 19-27 are pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 03/05/2026 has been entered.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 03/05/2026 was filed before the mailing date of a first action. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the IDS is being considered by the examiner.
Response to Arguments
Applicant’s arguments with respect to claim 1 have been considered but are not persuasive in view of the new and/or updated citations in the current rejection of record under Kin in view of Kaehler, applied in response to the newly amended limitations. These include at least a control indicator moving along with a user hand as disclosed in Kin [Figs. 7A-7C, para 0103] and focus indicators presented when a hand gesture is terminated and inactive as disclosed in Kaehler [Figs. 14-15, para 0105-0106, 0131, 0133-0134].
With respect to Applicant's arguments that Kin does not disclose "one or more user interface elements anchored to a first hand of a user" [pgs. 10:2-11:1], Examiner notes the Office Action maps a displayed control indicator [Kin, Fig. 7A, para 0100, 0103] to the "one or more user interface elements". Therefore, when a displayed indicator moves along with a user's hand movement [Figs. 7A-7C, 8B-8C, para 0103, 0111-0112], Kin discloses "one or more user interface elements anchored to a first hand of a user". Claim 1 remains rejected.
Claims 9 and 17 recite limitations similar to those recited in claim 1 and remain rejected on a similar basis as claim 1, as stated above.
Dependent claims 3-8, 11-16, and 19-27 remain rejected at least based on their dependence from independent claims 1, 9, and 17.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 3-9, 11-17, and 19-27 are rejected under 35 U.S.C. 103 as being unpatentable over Kin et al. (US 20190212827 A1) in view of Kaehler et al. (US 20180157398 A1).
As to claim 1, Kin discloses a method comprising:
at an electronic device in communication with a display [Fig. 1, para 0030, device including display]:
presenting, via the display, one or more user interface elements anchored to a first hand of a user in a three-dimensional environment [Fig. 7B-7C, para 0079, 0100, 0103, display control indicator (read: user interface element) for control in artificial reality of real-world environment (read: three-dimensional environment) after user hand intersects user gaze toward control, where control indicator moves along with (read: anchored) user hand];
while presenting the one or more user interface elements, detecting a first gesture corresponding to a respective user interface element of the one or more user interface elements in the three-dimensional environment [Fig. 7B, para 0063, 0100-0101, determine user input at control including completing gesture (read: first gesture) in artificial reality environment displaying control indicator (read: respective user interface element)];
in accordance with a determination that the first gesture is being maintained for a threshold period of time, presenting a first affordance associated with the respective user interface element in the three-dimensional environment [Fig. 7B, para 0100, 0105-0106, display temperature indicator (read: first affordance) indicating set point of control indicator in artificial reality environment subsequent to user input at control remaining in gesture for duration (read: threshold period of time) and before exiting gesture (read: maintaining first gesture)];
while presenting the first affordance and maintaining the first gesture, detecting a movement gesture directed to the first affordance in the three-dimensional environment [Figs. 7B-7C, para 0101-0103, 0106, 0131-0132, determine user hand movement (read: movement gesture) moving control indicator and changing temperature indicator subsequent to user input at control and before exiting gesture in artificial reality environment, note hand moving control updates temperature indicator and falls under broadest reasonable interpretation of directed to including being controlled or managed]; and
in accordance with a determination that the movement gesture satisfies one or more movement criteria, adjusting a value associated with the respective user interface element [Figs. 7A-7C, para 0104-0105, change set point (read: value) of control indicator after determining movement distance past threshold (read: satisfies movement criteria)].
While Kin does not teach "in accordance with a determination that the first gesture is terminated and not maintained for the threshold period of time, presenting one or more second affordances associated with the respective user interface element in the three-dimensional environment", note that one of ordinary skill in the art would recognize that the broadest reasonable interpretation of a method (or process) claim having contingent limitations requires only those steps that must be performed and does not include steps that are not required to be performed because the condition(s) precedent are not met. Thus, the claim scope does not require the contingent limitation "presenting one or more second affordances associated with the respective user interface element in the three-dimensional environment" to be performed if the precedent condition "in accordance with a determination that the first gesture is terminated and not maintained for the threshold period of time" is not met [see MPEP 2111.04(II)].
Nevertheless, in an effort to advance compact prosecution, Kin teaches a determination that the first gesture is terminated [para 0107, detect user exits pinch gesture] and in accordance with a determination that another gesture is terminated, presenting one or more second affordances associated with the respective user interface element in the three-dimensional environment [Figs. 8A-8C, para 0109, 0111-0113, detect user exits pinch gesture and display contextual menu including options (read: one or more second affordances) in place of control indicator in artificial reality environment].
Kaehler teaches in accordance with a determination that the first gesture is terminated and not maintained for the threshold period of time, presenting one or more second affordances associated with the respective user interface element in the three-dimensional environment [Figs. 14-15, para 0105-0106, 0131, 0133-0134, display focus indicator (read: one or more second affordances) indicating virtual remote (read: respective user interface element) when determining user terminates hand gesture selecting remote and is inactive with respect to virtual remote for threshold period of time].
Kin and Kaehler are analogous art to the claimed invention, being from a similar field of endeavor of computer-generated environments. Thus, it would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify the teachings of Kin, presenting one or more second affordances in accordance with a determination that a gesture is terminated, with the teachings of Kaehler, presenting one or more second affordances in accordance with a determination that a gesture is terminated and not maintained for a threshold period of time, with a reasonable expectation of success, to result in, in accordance with a determination that the first gesture is terminated and not maintained for the threshold period of time, presenting one or more second affordances associated with the respective user interface element in the three-dimensional environment [see MPEP 2143].
One of ordinary skill in the art would be motivated to apply this teaching to Kin to minimize unneeded virtual controls [Kaehler, para 0132].
As to claim 3, Kin discloses the method of claim 1, wherein:
the first affordance is representative of the value being adjusted [Figs. 7B-7C, para 0100, 0106, display temperature indicator indicates updated set point of control indicator]; and
in accordance with the adjusting of the value associated with the respective user interface element, an appearance of the first affordance is modified [Figs. 7B-7C, para 0104-0106, update display of temperature indicator indicating changed control indicator set point].
As to claim 4, Kin discloses the method of claim 1, wherein the movement gesture is detected in the three-dimensional environment at a location different from the first affordance [Figs. 7B-7C, para 0103-0105, determine user hand movement in artificial reality environment including temperature indicator, note Figures 7B and 7C show different locations for determined hand movement and displayed temperature indicator].
As to claim 5, Kin discloses the method of claim 1, wherein the detection of the first gesture is performed in accordance with detecting a pinch and hold gesture performed by a hand in the three-dimensional environment [Fig. 7B, para 0100-0101, 0106, determine input of user hand completing pinch gesture before exiting gesture (read: pinch and hold) in artificial reality environment].
As to claim 6, Kin discloses the method of claim 1, further comprising:
determining a gaze location [Figs. 7A-7B, 11, para 0100-0102, 0129-0130, determine user gaze terminus]; and
wherein detecting the first gesture corresponding to the respective user interface element includes determining the respective user interface element associated with the determined gaze location [Figs. 7B, 11, para 0063, 0100-0101, 0130, determine user input at control with user gaze intersecting completed gesture and terminating at control indicator].
As to claim 7, Kin discloses the method of claim 1, wherein in accordance with detecting the first gesture corresponding to the respective user interface element [Fig. 7B, para 0063, 0100-0101, determine user input at control including completing gesture in artificial reality environment displaying control indicator].
However, Kin does not specifically disclose causing other user interface elements to cease being presented in the three-dimensional environment.
Kaehler discloses wherein in accordance with detecting the first gesture corresponding to the respective user interface element, causing other user interface elements to cease being presented in the three-dimensional environment [Figs. 13-14, para 0123-0124, fade away focus indicators (read: other user interface elements) presented in environment during user selection interaction (read: first gesture) with virtual remote (read: respective user interface element)].
Kin and Kaehler are analogous art to the claimed invention, being from a similar field of endeavor of computer-generated environments. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the presentation of other user interface elements with respect to a selected user interface element as disclosed by Kin with the ceasing of presentation of other user interface elements in accordance with determining selection of the user interface element as disclosed by Kaehler, with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Kin as described above to remove unneeded virtual indicators [Kaehler, para 0123].
As to claim 8, Kin discloses the method of claim 1, wherein:
the first gesture is a pinch and hold gesture [Fig. 7B, para 0100-0101, 0106, determine input of user hand completing pinch gesture before exiting gesture]; and
detecting the movement gesture is performed in accordance with detecting movements of the first hand of the user in the three-dimensional environment [Figs. 7B-7C, para 0101-0103, 0106, determine user hand movement in artificial reality environment].
As to claim 9, Kin discloses an electronic device [Fig. 1, para 0030, device], comprising: one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions [para 0135-0136, computing device includes processor executing program code stored in storage medium] for performing limitations substantially similar to those recited in claim 1, and claim 9 is rejected under similar rationale.
As to claims 11-16 and 26, Kin discloses the electronic device of claim 9 performing limitations substantially similar to those recited in claims 3-8 and 25, respectively, and these claims are rejected under similar rationale.
As to claim 17, Kin discloses a non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device [para 0135-0136, storage medium stores program code executed by computing device processor] to perform limitations substantially similar to those recited in claim 1, and claim 17 is rejected under similar rationale.
As to claims 19-24 and 27, Kin discloses the non-transitory computer readable storage medium of claim 17 performing limitations substantially similar to those recited in claims 3-8 and 25, respectively, and these claims are rejected under similar rationale.
As to claim 25, Kin discloses the method of claim 1, wherein one of the one or more second affordances corresponds to the first affordance [Figs. 6A, 7C, para 0092, 0099-0100, display switch indicator (read: one or more second affordances) with control indicator], the method further comprising:
while presenting the one or more second affordances, detecting a second gesture in the three-dimensional environment, including movement [Figs. 7A-7C, para 0099-0100, 0104-0105, 0108, display switch indicator with control indicator and determine user hand movement (read: second gesture) in artificial reality environment]; and
in accordance with a determination that the movement of the second gesture satisfies the one or more movement criteria, adjusting the value associated with the respective user interface element [Figs. 7A-7C, para 0104-0105, 0108, change set point of control indicator after determining movement distance past threshold, note user movements include upward or downward direction to change set point].
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Canberk et al. (US 20220206588 A1), Ravasz et al. (US 20200387287 A1), and Wang et al. (US 20180120944 A1) generally disclose determining a pinch and pull gesture of a user interface element presented in a computer-generated environment.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LINDA HUYNH whose telephone number is (571)272-5240 and email is linda.huynh@uspto.gov. The examiner can normally be reached M-F, 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Queler, can be reached at (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LINDA HUYNH/Primary Examiner, Art Unit 2172