Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This communication is in response to: Amendments filed on October 16, 2025.
This action is made Final.
Claims 1-20 are pending.
Applicants amended claims 1, 4, 15, 16, 19, and 20.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Lacey, US PG PUB# 2019/0362557 A1 (hereinafter Lacey) in view of Hansen et al., US PG PUB# 2016/0231812 A1 (hereinafter Hansen).
As for independent claim 1:
Lacey discloses a method, comprising:
displaying, on a first portion of a display of an electronic device, an interface of an application, wherein a second portion of the display displays content unrelated to the application (Lacey discloses a user interface of an application and multimodal user inputs including augmented reality, virtual reality, and mixed reality; 0186: existing benefits of direct user inputs while improving accuracy of interacting with objects in the 3D space and reducing user fatigue);
detecting, by a system process of the electronic device, a first indirect engagement indicator performed by a user of the electronic device and that the first indirect engagement indicator is directed toward the interface of the application (0187, Lacey shows a plurality of user inputs including head/eye pose, gesture, and voice. 0187: In multimodal user input techniques, one or more of the direct inputs may be used to identify a target virtual object (also referred to as a subject) which a user will interact with and to determine a user interface operation that will be performed on the target virtual object. For example, the user interface operation may include a command operation, such as select, move, zoom, pause, play, and a parameter of the command operation (such as, e.g., how to carry out the operation, where or when the operation will occur, with which object will the target object interact, etc.). As an example of identifying a target virtual object and determining an interaction to be performed on the target virtual object, a user may look at a virtual sticky note (a head or eye pose mode of input), point at a table (a gesture mode of input), and say “move that there” (a voice mode of input). The wearable system can identify that the target virtual object in the phrase “move that there” is the virtual sticky note (“that”) and can determine the user interface operation involves moving (the executable command) the virtual sticky note to the table (“there”);
Lacey does not disclose: while withholding from the application a first input to the application based on the first indirect engagement indicator, detecting, by the system process, a second indicator confirming the first indirect engagement indicator; and based on the second indicator, providing, by the system process, the first input to the application based on the first indirect engagement indicator. Hansen discloses while withholding from the application a first input to the application based on the first indirect engagement indicator, detecting, by the system process, a second indicator confirming the first indirect engagement indicator; and based on the second indicator, providing, by the system process, the first input to the application based on the first indirect engagement indicator, in 0046-0047, 0055, and 0058. Hansen discloses, upon confirmation, transmitting the corresponding input. Hansen also discloses a gesture sequence and processing of the indicator. Accordingly, it would have been obvious before the effective filing date of the claimed invention to a skilled artisan to modify the method of Lacey to incorporate the teaching of Hansen, thus allowing the system to transmit an input upon confirmation (Hansen, 0046-0047).
As for dependent claim 2:
Lacey – Hansen discloses the method of claim 1, wherein the interface is a user interface of the application appearing at a location separated from the display of the electronic device (Lacey, 0186-0187, discloses a wearable system with head/eye pose, gesture, and voice; Hansen, 0025, eye gaze tracking system).
As for dependent claim 3:
Lacey – Hansen discloses the method of claim 1, wherein the first indirect engagement indicator corresponds to an eye gaze input, a hand gesture input, or a voice input (Hansen, 0025, eye gaze as primary engagement indicator; Lacey, 0186-0187).
As for dependent claim 4:
Lacey – Hansen discloses the method of claim 3, wherein the second indicator corresponds to another eye gaze input, another hand gesture input, or another voice input, wherein the second indicator is different than the first indirect engagement indicator (Lacey, 0187; Hansen, 0046, confirmation mechanism that can use a different modality than the initial engagement).
As for dependent claim 5:
Lacey – Hansen discloses the method of claim 4, wherein the first indirect engagement indicator or the second indicator corresponds to the hand gesture input, the hand gesture input being performed at a location separated from an apparent location of the interface (Hansen, 0025; Lacey, 0187 and 0407).
As for dependent claim 6:
Lacey – Hansen discloses the method of claim 4, wherein the first indirect engagement indicator or the second indicator corresponds to the eye gaze input, wherein a ray corresponding to the eye gaze input intersects an apparent location of the interface (Lacey, 0325, discloses a ray in the gaze direction; Lacey, 0407, discloses a hand gesture wherein the user points at the virtual object, which corresponds to a ray from the user's hand).
As for dependent claim 7:
Lacey – Hansen discloses the method of claim 1, further comprising: providing, by the system process, a second input to the application, the second input corresponding to the second indicator (Hansen, 0046, see the second input corresponding to the second indicator).
As for dependent claim 8:
Lacey – Hansen discloses the method of claim 1, wherein a ray extending from the first indirect engagement indicator or the second indicator intersects with the interface (Lacey, 0325 and 0407; Hansen, 0024, 0035, see gaze ray intersecting with the interface).
As for dependent claim 9:
Lacey – Hansen discloses the method of claim 1, wherein the second indicator corresponds to a direct engagement with the interface (Hansen, 0025, 0030).
As for dependent claim 10:
Lacey – Hansen discloses the method of claim 1, further comprising: detecting, by the system process, a direct engagement input based on a gesture performed by the user that intersects with an apparent location of the interface; and providing the direct engagement input to the application (Hansen, 0030, 0035, and 0037. In 0037 Hansen discloses a direct engagement input based on a gesture and the location of the interface).
As for dependent claim 11:
Lacey – Hansen discloses the method of claim 1, wherein detecting the first indirect engagement indicator comprises detecting the first indirect engagement indicator concurrently with detecting the second indicator (Hansen, 0034 and 0036).
As for dependent claim 12:
Lacey – Hansen discloses the method of claim 1, further comprising: detecting, by the system process, a third indirect engagement indicator performed by the user and that the third indirect engagement indicator is directed toward the application; while withholding from the application a third input to the application based on the third indirect engagement indicator, detecting, by the system process, a fourth indicator performed by the user and that the fourth indicator is not directed toward the interface of the application; and continuing to withhold the third input and an input corresponding to the fourth indicator from the application (Lacey, 0006, 0409, 0430; Hansen, 0032, 0041, 0056-0058, see withholding input until confirmation and handling and processing of multiple input modalities).
As for dependent claim 13:
Lacey – Hansen discloses the method of claim 12, wherein the fourth indicator is directed toward an interface of a second application (Hansen, 0032, 0041, 0056-0058, see withholding input until confirmation and handling and processing of multiple input modalities).
As for dependent claim 14:
Lacey – Hansen discloses the method of claim 13, wherein the fourth indicator corresponds to a direct engagement with the interface of the second application, further comprising providing, to the second application, an input corresponding to the fourth indicator (Hansen, 0032, 0041, 0056-0058, see withholding input until confirmation and handling and processing of multiple input modalities).
As for independent claims 15 and 19:
Claims 15 and 19 contain substantial subject matter as claimed in claim 1 and are respectfully rejected along the same rationale.
As for dependent claim 16:
Claim 16 contains substantial subject matter as claimed in claims 3-6 and is respectfully rejected along the same rationale.
As for dependent claim 17:
Lacey – Hansen discloses the electronic device of claim 15, wherein the second indicator corresponds to a direct engagement with the interface (Lacey, 0187; Hansen, 0046, confirmation mechanism that can use a different modality than the initial engagement).
As for dependent claim 18:
Claim 18 contains substantial subject matter as claimed in claim 11 and is respectfully rejected along the same rationale.
As for dependent claim 20:
Claim 20 contains substantial subject matter as claimed in claims 3-6 and is respectfully rejected along the same rationale.
It is noted that any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)).
The Examiner notes MPEP § 2144.01, which quotes In re Preda, 401 F.2d 825, 159 USPQ 342, 344 (CCPA 1968) as stating “in considering the disclosure of a reference, it is proper to take into account not only specific teachings of the reference but also the inferences which one skilled in the art would reasonably be expected to draw therefrom.” Further, MPEP § 2123 states that “a reference may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art, including nonpreferred embodiments.” Merck & Co. v. Biocraft Laboratories, 874 F.2d 804, 10 USPQ2d 1843 (Fed. Cir.), cert. denied, 493 U.S. 975 (1989).
Response to Arguments
Applicant's arguments filed have been fully considered but they are not persuasive. The Office refers Applicant to MPEP 2123, which states that the entire reference is cited and that the specific cited sections of the reference are not limiting in any way. Any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)).
During patent examination, the pending claims must be 'given the broadest reasonable interpretation consistent with the specification.' Applicant always has the opportunity to amend the claims during prosecution, and broad interpretation by the examiner reduces the possibility that the claim, once issued, will be interpreted more broadly than is justified. In re Prater, 162 USPQ 541, 550-51 (CCPA 1969).
Reference is made to MPEP 2144.01 - Implicit Disclosure
"[I]n considering the disclosure of a reference, it is proper to take into account not only specific teachings of the reference but also the inferences which one skilled in the art would reasonably be expected to draw therefrom." In re Preda, 401 F.2d 825, 826, 159 USPQ 342, 344 (CCPA 1968).
As for 35 USC 103 rejection:
As for independent claims 1, 15, and 19:
Applicants assert the cited portions of the asserted references cannot be seen as disclosing or suggesting all of the features of claim 1, particularly with respect to at least the features of "while withholding from the application a first input to the application based on the first indirect engagement indicator, detecting, by the system process, a second indicator confirming the first indirect engagement indicator" and "based on the second indicator, providing, by the system process, the first input to the application based on the first indirect engagement indicator" (Applicant’s Remarks, Pg. 8).
The Office respectfully disagrees.
As shown above and in the last Office Action mailed on July 16, 2025, Lacey and Hansen disclose while withholding from the application a first input to the application based on the first indirect engagement indicator, detecting, by the system process, a second indicator confirming the first indirect engagement indicator; and based on the second indicator, providing, by the system process, the first input to the application based on the first indirect engagement indicator, in Hansen 0049, 0053-0055. Hansen discloses processing the first input upon a second indicator engagement. The system does not process the first input that is based on the first indirect engagement, thus withholding any first input from the application. The cited paragraphs clearly disclose the withholding and confirming features. From the evidence shown above, Lacey and Hansen show and suggest all the limitations in claim 1. At this time claim 1 is not in condition for allowance.
Applicants assert the same argument for independent claims 15 and 19. As disclosed above, the same rationale applies to independent claims 15 and 19.
As for dependent claim 10:
Applicants assert Claim 10 recites "detecting, by the system process, a direct engagement input based on a gesture performed by the user that intersects with an apparent location of the interface" and "providing the direct engagement input to the application analyzing the first received signals." The Office Action asserts that Hansen discloses or suggests these features at 0030 and 0035 (Applicant’s Remarks, Pg. 9).
The Office respectfully disagrees.
As shown above and in the last Office Action mailed on July 16, 2025, Lacey and Hansen disclose "detecting, by the system process, a direct engagement input based on a gesture performed by the user that intersects with an apparent location of the interface" and "providing the direct engagement input to the application analyzing the first received signals" in Hansen 0037. Hansen clearly discloses a direct engagement input based on a gesture and the location of the interface in 0037. In the cited section Hansen discloses detecting the user input based on the gesture and the location of the interface. From the evidence shown above, Lacey and Hansen show the limitations in claim 10. At this time claim 10 is not in condition for allowance.
As for dependent claims 12-14:
Applicants assert Claim 12 recites "detecting, by the system process, a third indirect engagement indicator performed by the user and that the third indirect engagement indicator is directed toward the application," "while withholding from the application a third input to the application based on the third indirect engagement indicator, detecting, by the system process, a fourth indicator performed by the user and that the fourth indicator is not directed toward the interface of the application," and "continuing to withhold the third input and an input corresponding to the fourth indicator from the application." The Office Action asserts that the combination of Lacey and Hansen disclose or suggest these features. Applicant respectfully disagrees (Applicant’s Remarks, Pg. 10).
The Office respectfully disagrees.
As shown above and in the last Office Action mailed on July 16, 2025, Lacey and Hansen disclose detecting, by the system process, a third indirect engagement indicator performed by the user and that the third indirect engagement indicator is directed toward the application; while withholding from the application a third input to the application based on the third indirect engagement indicator, detecting, by the system process, a fourth indicator performed by the user and that the fourth indicator is not directed toward the interface of the application; and continuing to withhold the third input and an input corresponding to the fourth indicator from the application, in Hansen, 0032, 0041, 0056-0058 (see withholding input until confirmation and handling and processing of multiple input modalities). Hansen clearly discloses the third and fourth indirect engagement indicators in the cited sections; also see Figures 13 and 14. From the evidence shown above, Lacey and Hansen show the limitations in claim 12. At this time claim 12 is not in condition for allowance.
As for dependent claims 13 and 14:
Applicants assert neither Lacey nor Hansen discloses wherein the fourth indicator is directed toward an interface of a second application and wherein the fourth indicator corresponds to a direct engagement with the interface of the second application, further comprising providing to the second application, an input corresponding to the fourth indicator (Applicant’s Remarks, Pgs. 10 and 11).
The Office respectfully disagrees.
As shown above and in the last Office Action mailed on July 16, 2025, Lacey and Hansen disclose wherein the fourth indicator is directed toward an interface of a second application and wherein the fourth indicator corresponds to a direct engagement with the interface of the second application, further comprising providing, to the second application, an input corresponding to the fourth indicator, in Hansen, 0032, 0041, 0056-0058 (see withholding input until confirmation and handling and processing of multiple input modalities). Hansen clearly discloses the third and fourth indicators in the cited sections; additionally see Figures 13 and 14. From the evidence shown above, Lacey and Hansen show the limitations in claims 13 and 14. At this time claims 13 and 14 are not in condition for allowance.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID PHANTANA ANGKOOL whose telephone number is (571)272-2673. The examiner can normally be reached M-F, 7:00-3:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Bashore, can be reached at 571-272-4088. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/David Phantana-angkool/Primary Examiner, Art Unit 2172