DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 20 is rejected under 35 U.S.C. 101 because it describes a computer program per se.
Computer programs claimed as computer listings per se, i.e., the descriptions or expressions of the programs, are not physical "things." They are neither computer components nor statutory processes, as they are not "acts" being performed. Such claimed computer programs do not define any structural and functional interrelationships between the computer program and other claimed elements of a computer which permit the computer program's functionality to be realized. In contrast, a claimed non-transitory computer-readable medium encoded with a computer program is a computer element which defines structural and functional interrelationships between the computer program and the rest of the computer which permit the computer program's functionality to be realized, and is thus statutory. See Lowry, 32 F.3d at 1583-84, 32 USPQ2d at 1035.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 2, 15, and 17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 2 recites “in response to the hand of the user moving under the menu-triggering gesture and the movement trajectory meeting the preset condition”. The meaning of “the hand of the user moving under the menu-triggering gesture” is unclear and is not adequately described in the specification. For the purposes of this Office action, “under” has been interpreted as “according to”, i.e., “the hand of the user moving according to the menu-triggering gesture”.
Claims 15 and 17 are rejected for depending from claim 2.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3 are rejected under 35 U.S.C. 103 as being unpatentable over Selig et al. (US 20250068297 A1).
Regarding claim 1, Selig discloses a method for triggering a menu (Selig [0070], “process 500A (a method) can identify any suitable gesture that can be associated with or indicative of an intention to open a virtual menu (triggering a menu)”), comprising:
in response to a menu-triggering gesture presented by a hand of a user, detecting a movement trajectory of the hand of the user (Selig [0038], “the HMD 200 or 250 can also include additional subsystems, such as an eye tracking unit … etc., to monitor indications of user interactions and intentions … monitor the positions and poses of the user's hands to determine gestures … determining a gaze direction (gaze/head posture relative to the hand)”; [0070], “process 500A can identify any suitable gesture that can be associated with or indicative of an intention to open a virtual menu … process 500A can identify … a circling gesture (detecting a movement trajectory of the hand of the user), a movement in a particular direction, etc.”);
in response to the movement trajectory meeting a preset condition, displaying a menu panel to the user (Selig [0087], “In response to detecting rotation of hand 708, the XR device can render example view 700C in which particular selectable elements 706A, 706E-H are displayed.”).
Selig does not expressly disclose:
a posture of a head relative to the hand of the user that meets a preset triggering condition.
However, Selig suggests
a posture of a head relative to the hand of the user that meets a preset triggering condition (Selig [0038], “the HMD 200 or 250 can also include additional subsystems, such as an eye tracking unit … etc., to monitor indications of user interactions and intentions … monitor the positions and poses of the user's hands to determine gestures and other hand and body motions … determining a gaze direction (gaze/head posture relative to the hand)”; [0070], “process 500A can identify any suitable gesture that can be associated with or indicative of an intention to open a virtual menu … process 500A can identify … a circling gesture (detecting a movement trajectory of the hand of the user), a movement in a particular direction, etc.”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to determine gesture input based on the hand, body, and gaze/head pose data. This would have been done to accurately determine the user's intent to open a menu and to avoid erroneous menu triggering.
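For illustration only (the rejection does not rely on this sketch), the claim 1 flow as mapped above can be summarized in Python; all function names, vector conventions, and numeric thresholds are hypothetical and are not drawn from Selig or the application:

```python
import math

# Hypothetical sketch of the claim 1 flow: a menu-triggering gesture is
# effective only while the head posture relative to the hand meets a preset
# triggering condition, and the menu panel is displayed once the hand's
# movement trajectory meets a preset condition (here: the trajectory closes).

def head_faces_hand(gaze_dir, head_pos, hand_pos, max_angle_deg=20.0):
    """Preset triggering condition: the gaze direction points roughly at the hand."""
    to_hand = tuple(h - p for h, p in zip(hand_pos, head_pos))
    dot = sum(g * t for g, t in zip(gaze_dir, to_hand))
    norm = math.hypot(*gaze_dir) * math.hypot(*to_hand)
    if norm == 0:
        return False
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg

def trajectory_is_closed(points, tol=0.1):
    """Preset condition on the movement trajectory: start and end roughly coincide."""
    return len(points) >= 3 and math.dist(points[0], points[-1]) <= tol

def should_display_menu(gaze_dir, head_pos, hand_pos, gesture_ok, trajectory):
    # All three gates of the mapped claim 1 logic must hold.
    return (gesture_ok
            and head_faces_hand(gaze_dir, head_pos, hand_pos)
            and trajectory_is_closed(trajectory))
```

Here `head_faces_hand` stands in for the claimed "posture of a head relative to the hand" condition, and `trajectory_is_closed` for the claimed preset condition on the movement trajectory (e.g., Selig's circling gesture).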
Regarding claim 2, Selig discloses the method according to claim 1, wherein in response to the movement trajectory meeting the preset condition, displaying the menu panel to the user comprises:
in response to the hand of the user moving under the menu-triggering gesture and the movement trajectory meeting the preset condition, displaying the menu panel to the user (Selig [0088], “In response to detecting downward motion of hand 708 (user’s hand is moving under/according to a menu-triggering gesture), the XR device can render example view 700D in which particular selectable elements 706C-G are displayed (displaying a menu panel to the user).”).
Regarding claim 3, Selig discloses the method according to claim 1, wherein the preset condition comprises any one of the following:
the movement trajectory is a random closed shape;
the movement trajectory is a specific closed shape (Selig [0070], “a circling gesture”); and
the movement trajectory is a closed shape, and a dimension of the closed shape is within a preset size range.
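For illustration only (the rejection does not rely on this sketch), the three alternative preset conditions recited in claim 3 can be sketched as follows; the tolerance and size thresholds are hypothetical, not from the record:

```python
import math

# Hypothetical sketch of claim 3's alternatives: the preset condition is met
# when the trajectory is (a) any closed shape, (b) a specific closed shape
# (not modeled here), or (c) a closed shape whose dimension falls within a
# preset size range. This sketch models alternatives (a) and (c).

def is_closed(points, tol=0.1):
    """Alternative (a): the trajectory forms a closed shape."""
    return len(points) >= 3 and math.dist(points[0], points[-1]) <= tol

def bounding_size(points):
    """Width and height of the trajectory's axis-aligned bounding box."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return max(xs) - min(xs), max(ys) - min(ys)

def meets_preset_condition(points, min_dim=0.05, max_dim=0.5):
    """Alternative (c): a closed shape whose dimension is within a preset size range."""
    if not is_closed(points):
        return False
    w, h = bounding_size(points)
    return min_dim <= max(w, h) <= max_dim
```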
Regarding claim 14, Selig discloses the method according to claim 1, wherein the menu panel is configured to provide at least one functional option; and the method further comprises:
after the menu panel is displayed to the user, obtaining a functional option selected from the menu panel through a menu selection gesture by the user, and performing the corresponding operation according to the functional option selected by the user (Selig [0071], “process 500A can render a virtual menu on the XR device in the XR environment. The virtual menu can include one or multiple selectable elements (e.g., virtual buttons or icons) corresponding to actions that can be taken on the XR device (obtaining a functional option selected from the menu panel through a menu selection gesture by the user) … the actions can include system-level actions controlling system-level functions on the XR device, such as volume controls, display controls, activation or deactivation of functions (e.g., audio capture, image capture, video capture, etc.), display of time or battery level, etc. (performing the corresponding operation according to the functional option selected by the user)”); and
wherein the menu-triggering gesture is the same as the menu selection gesture (Selig figs. 6B-C (both gestures are the same); Selig [0012] “FIG. 6B is a conceptual diagram illustrating an example view on an artificial reality device of a selectable element being highlighted with a gesture on a virtual menu displayed in an artificial reality environment.”; [0013] “FIG. 6C is a conceptual diagram illustrating an example view on an artificial reality device of a further selectable element being displayed in a virtual menu, corresponding to a sub-action of a highlighted selectable element, based on movement of the gesture off of the highlighted selectable element.”).
Regarding claim 15, Selig discloses the method according to claim 2, wherein the preset condition comprises any one of the following:
the movement trajectory is a random closed shape;
the movement trajectory is a specific closed shape (Selig [0070], “a circling gesture”); and
the movement trajectory is a closed shape, and a dimension of the closed shape is within a preset size range.
Regarding claim 17, Selig discloses the method according to claim 2, wherein the menu panel is configured to provide at least one functional option; and the method further comprises:
after the menu panel is displayed to the user, obtaining a functional option selected from the menu panel through a menu selection gesture by the user, and performing the corresponding operation according to the functional option selected by the user (Selig [0071], “process 500A can render a virtual menu on the XR device in the XR environment. The virtual menu can include one or multiple selectable elements (e.g., virtual buttons or icons) corresponding to actions that can be taken on the XR device (obtaining a functional option selected from the menu panel through a menu selection gesture by the user) … the actions can include system-level actions controlling system-level functions on the XR device, such as volume controls, display controls, activation or deactivation of functions (e.g., audio capture, image capture, video capture, etc.), display of time or battery level, etc. (performing the corresponding operation according to the functional option selected by the user)”); and
wherein the menu-triggering gesture is the same as the menu selection gesture (Selig figs. 6B-C (both gestures are the same); Selig [0012] “FIG. 6B is a conceptual diagram illustrating an example view on an artificial reality device of a selectable element being highlighted with a gesture on a virtual menu displayed in an artificial reality environment.”; [0013] “FIG. 6C is a conceptual diagram illustrating an example view on an artificial reality device of a further selectable element being displayed in a virtual menu, corresponding to a sub-action of a highlighted selectable element, based on movement of the gesture off of the highlighted selectable element.”).
Claim 18 recites an electronic device which corresponds to the function performed by the method of claim 1. As such, the mapping and rejection of claim 1 above is considered applicable to the electronic device of claim 18.
Additionally, Selig discloses
An electronic device (Selig [0009]), comprising:
a storage and at least one processor (Selig [0029]);
wherein the storage stores a computer-executed instruction (Selig [0029]); and
the at least one processor executes the computer-executed instruction stored in the storage, so that the at least one processor implements a method (Selig [0029]).
Claim 19 recites a non-transitory computer-readable storage medium which corresponds to the function performed by the method of claim 1. As such, the mapping and rejection of claim 1 above is considered applicable to the non-transitory computer-readable storage medium of claim 19.
Additionally, Selig discloses
A non-transitory computer-readable storage medium with a computer-executed instruction stored thereon, wherein the computer-executed instruction, when being executed by a processor, implements a method (Selig [0029]).
Claim 20 recites a computer program product which corresponds to the function performed by the method of claim 1. As such, the mapping and rejection of claim 1 above is considered applicable to the computer program product of claim 20.
Additionally, Selig discloses
A computer program product comprising a computer program that, when being executed by a processor, implements a method (Selig [0029]).
Claims 4-5 are rejected under 35 U.S.C. 103 as being unpatentable over Selig in view of Hicks et al. (US 20140218343 A1), and, as to claim 5, further in view of Rainisto.
Regarding claim 4, Selig discloses the method according to claim 1, but does not disclose that, in response to the movement trajectory meeting the preset condition, displaying the menu panel to the user comprises:
in response to a movement of the hand of the user, providing corresponding visual feedback based on progress of the movement until the preset condition is met, displaying the menu panel to the user.
However, Hicks discloses
in response to a movement of the hand of the user, providing corresponding visual feedback based on progress of the movement until the preset condition is met, displaying the menu panel to the user (Hicks fig. 4b (providing corresponding visual feedback based on progress of the movement); [0053], “when the circular stylus gesture is performed, the tools menu is opened and displayed to the user (displaying the menu panel to the user when the gesture condition is met)”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Selig with Hicks to display gesture progression during a menu selection process. This would have been done to enable users to see the progress of their gesture while accessing menus, improving accuracy.
Regarding claim 5, Selig in view of Hicks discloses the method according to claim 4, wherein in response to the movement of the hand of the user, providing corresponding visual feedback based on the progress of the movement comprises:
in response to the movement of the hand of the user, displaying a trailing or a movement trajectory corresponding to a preset part of the hand of the user, until the preset condition is met, displaying the menu panel in the final state (Hicks fig. 4b (displaying a movement trajectory corresponding to a preset part of the hand of the user); [0031], “The screen display can be layered above input sensors, such as a capacitive sensor grid for passive touch-based input, such as with a finger (movement of the hand)”; [0053], “when the circular stylus gesture is performed, the tools menu is opened and displayed to the user (displaying the menu panel to the user when the gesture condition is met)”).
Selig in view of Hicks does not disclose
displaying a gradient menu panel based on the progress of the movement
However, Rainisto discloses
displaying a gradient menu panel based on the progress of the movement (Rainisto [0051], “in a menu application, the `scroll up` function is performed on detecting each first partial gesture. A `scroll-up` function displays a menu item that precedes the current menu item (displaying a gradient menu panel based on a gesture).”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Selig further with Rainisto to display a gradient menu panel during a menu gesturing action. This would have been done to accurately display user actions and thereby provide accurate feedback for proper menu selection.
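For illustration only (the rejection does not rely on this sketch), the progress-based gradient feedback addressed above can be modeled as a linear interpolation of panel attributes between an initial and a final state; the attribute pair (opacity, scale) and the numeric values are hypothetical:

```python
# Hypothetical sketch: while the menu gesture is in progress, the panel is
# rendered with attributes interpolated from an initial state toward a final
# state, reaching the final state when the gesture condition is fully met.

def panel_attributes(progress, initial=(0.2, 0.5), final=(1.0, 1.0)):
    """Linearly interpolate (opacity, scale) as gesture progress goes 0.0 -> 1.0.

    Progress outside [0, 1] is clamped, so the panel never overshoots the
    final state and never renders before the gesture begins.
    """
    p = max(0.0, min(1.0, progress))
    return tuple(a + p * (b - a) for a, b in zip(initial, final))
```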
Allowable Subject Matter
Claims 6-13 and 16 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter:
Regarding claim 6, none of the prior art of record, alone or in combination, discloses the claim as recited. In particular, none of the prior art of record, alone or in combination, discloses
“in response to the progress of movement meeting a progress condition, displaying the menu panel in an initial state, wherein attribute information of the menu panel in the initial state is different from that in the final state;
the attribute information comprises: size and/or opacity”
Claims 7-10 and 16 are objected to, and would likewise be allowable, for depending from claim 6.
Regarding claims 11-13, none of the prior art of record, alone or in combination, discloses the claims as recited. In particular, none of the prior art of record, alone or in combination, discloses
“calculating a confidence level that the user has an intention to trigger the menu according to the intersection position which is determined;
in response to the confidence level meeting a preset requirement and the hand of the user presenting the menu-triggering gesture, detecting the movement trajectory of the hand of the user”
Conclusion
See the notice of references cited (PTO-892) for prior art made of record, including art that is not relied upon but considered pertinent to applicant's disclosure.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JITESH PATEL whose telephone number is (571)270-3313. The examiner can normally be reached 8am - 5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Said A. Broome can be reached at (571) 272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JITESH PATEL/Primary Examiner, Art Unit 2612