Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 5-10, 12-17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Schwarz et al. (US 2022/0171469), hereinafter Schwarz, in view of Sun (US 2023/0195236), hereinafter Sun.
Regarding claim 1, Schwarz teaches a method (abstract) comprising:
obtaining hand tracking data for a plurality of camera frames (fig. 3 (305); [0020-0022]);
detecting a touch of a contact event between a first finger and a second finger based on the hand tracking data for a first camera frame of the plurality of camera frames ([0073-0076]; figs. 8a and 8b (812, 814, 816, and 830); fig. 3 (330); [0085-0087, 0101]; figs. 8a and 8b, thumb tip); and
in response to a determination that a contact location on the first finger is associated with a first gesture, enabling a gesture-based input event corresponding to the first gesture (fig. 3 (325-350); figs. 8a-8b (812-816)).
Schwarz fails to teach detecting a release of the touch of the contact event between the first finger and the second finger based on hand tracking data for a second camera frame of the plurality of camera frames; and triggering an input action corresponding to the first gesture in response to detecting the release of the touch.
However, Sun teaches detecting a release of the touch of the contact event between the first finger and the second finger based on hand tracking data for a second camera frame of the plurality of camera frames (fig. 2, S210-S220, pinch; [0027], image captures are the frames); and
triggering an input action corresponding to the first gesture in response to detecting the release of the touch (fig. 9b, pinch and release to trigger option 915).
[0097] In one embodiment, in response to determining that the indicator R1 indicates the content area 914 in a case of the pinch gesture is determined to be the pinch-and-release gesture, the processor 104 may trigger the content area 914. For example, if the processor 104 determines that the pinch-and-release gesture is detected while the indicator R1 indicating (e.g., pointing to) an option 915 in the content area 914, the processor 104 may accordingly trigger the option 915 and perform the operations corresponding to the triggered option 915, but the disclosure is not limited thereto.
[0098] In one embodiment, in response to determining that the indicator R1 indicates the content area 914 in a case of the pinch gesture is determined to be the pinch-and-hold gesture, the processor 104 may scroll the content area 914 according to a movement of the hand gesture during the hand gesture maintaining the pinch-and-hold gesture. For example, if the processor 104 determines that the pinch-and-hold gesture is detected while the indicator R1 indicating (e.g., pointing to) the content area 914, the processor 104 may accordingly scroll the content area 914 upward/downward when the user's hand moves upward/downward while maintaining the pinch gesture, but the disclosure is not limited thereto.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Schwarz to further include detecting a release of the touch of the contact event between the first finger and the second finger based on hand tracking data for a second camera frame of the plurality of camera frames, and triggering an input action corresponding to the first gesture in response to detecting the release of the touch, as taught by Sun, in order to allow the developer to design various mechanisms for the user to interact with the VR world with various hand gestures (Sun [0003]).
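The combined teaching amounts to a simple per-frame state machine: a touch of the contact event is detected in one camera frame, a release in a later frame, and the input action is triggered on the release. A minimal sketch of that logic follows (illustration only, not code from Schwarz or Sun; the function name and the distance threshold are hypothetical):

```python
# Illustrative sketch only; names and the 1 cm threshold are hypothetical.

def detect_pinch_events(frames, touch_threshold=0.01):
    """Reduce per-frame fingertip positions to touch/release events.

    Each frame is a (thumb_tip, index_tip) pair of 3-D points in meters.
    A touch is reported in the first frame where the fingertips come
    within touch_threshold; a release is reported in a later frame where
    they separate again, which is where the input action would trigger.
    """
    events = []
    touching = False
    for i, (thumb, index) in enumerate(frames):
        dist = sum((a - b) ** 2 for a, b in zip(thumb, index)) ** 0.5
        if not touching and dist < touch_threshold:
            touching = True
            events.append(("touch", i))
        elif touching and dist >= touch_threshold:
            touching = False
            events.append(("release", i))  # trigger the input action here
    return events

# Example: fingers apart, pinched, then apart again across three frames.
frames = [
    ((0.0, 0.0, 0.0), (0.05, 0.0, 0.0)),
    ((0.0, 0.0, 0.0), (0.005, 0.0, 0.0)),
    ((0.0, 0.0, 0.0), (0.05, 0.0, 0.0)),
]
print(detect_pinch_events(frames))  # [('touch', 1), ('release', 2)]
```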
Regarding claim 8, Schwarz teaches a non-transitory computer readable medium comprising computer readable code executable by one or more processors to: obtain hand tracking data for a plurality of camera frames (fig. 3 (305); [0020-0022]; [0073-0076]; figs. 8a and 8b (830 and 812)); detect a touch of a contact event between a first finger and a second finger based on the hand tracking data for a first camera frame of the plurality of camera frames (figs. 8a and 8b (812, 814, and 816); fig. 3 (330); [0085-0087, 0101]; thumb tip); and in response to a determination that a contact location on the first finger is associated with a first gesture, enable a gesture-based input event corresponding to the first gesture (fig. 3 (325-350); figs. 8a-8b (812-816)).
Schwarz fails to teach detect a release of the touch of the contact event between the first finger and the second finger based on hand tracking data for a second camera frame of the plurality of camera frames; and trigger an input action corresponding to the first gesture in response to detecting the release of the touch.
However, Sun teaches detect a release of the touch of the contact event between the first finger and the second finger based on hand tracking data for a second camera frame of the plurality of camera frames, and trigger an input action corresponding to the first gesture in response to detecting the release of the touch (fig. 2, S210-S220, pinch; [0027], image captures are the frames; fig. 9b, pinch and release to trigger option 915).
It would have been obvious to one of ordinary skill in the art to modify the teachings of Schwarz to further include detecting a release of the touch of the contact event between the first finger and the second finger based on hand tracking data for a second camera frame of the plurality of camera frames, and triggering an input action corresponding to the first gesture in response to detecting the release of the touch, as taught by Sun, in order to allow the developer to design various mechanisms for the user to interact with the VR world with various hand gestures (Sun [0003]).
Regarding claim 15, Schwarz teaches a system comprising: one or more processors; and one or more computer readable media comprising computer readable code executable by the one or more processors to: obtain hand tracking data for a plurality of camera frames (fig. 3 (305); [0020-0022]; [0073-0076]; figs. 8a and 8b (830 and 812)); detect a touch of a contact event between a first finger and a second finger based on the hand tracking data for a first camera frame of the plurality of camera frames (figs. 8a and 8b (812, 814, and 816); fig. 3 (330); [0085-0087, 0101]; thumb tip); and
in response to a determination that a contact location on the first finger is associated with a first gesture, enable a gesture-based input event corresponding to the first gesture (fig. 3 (325-350); figs. 8a-8b (812-816)).
Schwarz fails to teach detect a release of the touch of the contact event between the first finger and the second finger based on hand tracking data for a second camera frame of the plurality of camera frames; and trigger an input action corresponding to the first gesture in response to detecting the release of the touch.
However, Sun teaches detect a release of the touch of the contact event between the first finger and the second finger based on hand tracking data for a second camera frame of the plurality of camera frames, and trigger an input action corresponding to the first gesture in response to detecting the release of the touch (fig. 2, S210-S220, pinch; [0027], image captures are the frames; fig. 9b, pinch and release to trigger option 915).
It would have been obvious to one of ordinary skill in the art to modify the teachings of Schwarz to further include detecting a release of the touch of the contact event between the first finger and the second finger based on hand tracking data for a second camera frame of the plurality of camera frames, and triggering an input action corresponding to the first gesture in response to detecting the release of the touch, as taught by Sun, in order to allow the developer to design various mechanisms for the user to interact with the VR world with various hand gestures (Sun [0003]).
Regarding claim 2, Schwarz in view of Sun teaches the method of claim 1, wherein a contact event comprises a touch and a release of the touch, wherein the touch corresponds to the first finger making contact with the second finger, and wherein the gesture-based input event comprises a visual feedback (fig. 3 (325-350); [0065-0067]; fig. 10 (1075/1085)) Schwarz; (figs. 3a/b and fig. 9b, initiation of the release) Sun.
Regarding claim 3, Schwarz in view of Sun teaches the method of claim 2, wherein the gesture-based input event comprises the visual indication of the gesture-based input event (fig. 3 (325-350); [0065-0067]; fig. 10 (1075/1085); fig. 9a, moving from 812 to 814; fig. 10 (1082), for example; [0083-0085]) Schwarz; (figs. 3a/b and fig. 9b, initiation of the release) Sun.
Regarding claim 5, Schwarz in view of Sun teaches the method of claim 1, wherein a determination is made as to whether a contact event is detected in each of the one or more camera frames ([0072] Schwarz; [0027] Sun).
Regarding claim 6, Schwarz in view of Sun teaches the method of claim 1, wherein detecting the contact event comprises: obtaining relational features from the hand tracking data; and obtaining features related to action recognition ([0029, 0048-0053] Schwarz; [0091-0098] Sun).
Regarding claim 7, Schwarz in view of Sun teaches the method of claim 1, wherein the first gesture comprises a pinch (fig. 9b, pinch and release, Sun).
Regarding claim 9, Schwarz in view of Sun teaches the non-transitory computer readable medium of claim 8, wherein a contact event comprises a touch and a release of the touch (fig. 9b, release, Sun; [0070]), wherein the touch corresponds to the first finger making contact with the second finger, and wherein the gesture-based input event comprises a visual feedback (fig. 3 (325-350); [0065-0067, 0070]; fig. 10 (1075/1085)) Schwarz; (figs. 3a/b and fig. 9b, initiation of the release) Sun.
Regarding claim 10, Schwarz in view of Sun teaches the non-transitory computer readable medium of claim 9, wherein the gesture-based input event comprises the visual indication of the gesture-based input event (fig. 9b, executing 915) Sun; (fig. 3 (325-350); [0065-0067, 0070]; fig. 10 (1075/1085)) Schwarz.
Regarding claim 12, Schwarz in view of Sun teaches the non-transitory computer readable medium of claim 8, wherein a determination is made as to whether a contact event is detected in each of the one or more camera frames ([0072] Schwarz; [0027] Sun).
Regarding claim 13, Schwarz in view of Sun teaches the non-transitory computer readable medium of claim 8, wherein the computer readable code to detect the contact event comprises computer readable code to: obtain relational features from the hand tracking data; and obtain features related to action recognition ([0029, 0048-0053] Schwarz; [0046-0056] Sun).
Regarding claim 14, Schwarz in view of Sun teaches the non-transitory computer readable medium of claim 8, wherein the first gesture comprises a pinch (fig. 9b, pinch and release, Sun).
Regarding claim 16, Schwarz in view of Sun teaches the system of claim 15, wherein a contact event comprises a touch and a release of the touch, wherein the touch corresponds to the first finger making contact with the second finger, and wherein the gesture-based input event comprises a visual feedback (fig. 3 (325-350); [0065-0067, 0070]; fig. 10 (1075/1085)) Schwarz; (figs. 3a/b and fig. 9b, initiation of the release) Sun.
Regarding claim 17, Schwarz in view of Sun teaches the system of claim 16, wherein the gesture-based event comprises a visual indication of the gesture-based input event (fig. 3 (325-350); [0065-0067, 0070]; fig. 10 (1075/1085)) Schwarz; (fig. 9b, executing 915) Sun.
Regarding claim 19, Schwarz in view of Sun teaches the system of claim 15, wherein a determination is made as to whether a contact event is detected in each of the one or more camera frames ([0072] Schwarz; [0027] Sun).
Regarding claim 20, Schwarz in view of Sun teaches the system of claim 15, wherein the first gesture comprises a pinch (fig. 9b, pinch gesture, Sun).
Response to Arguments
Applicant’s arguments with respect to claims 1-3, 5-10, 12-17, and 19-20 have been considered but are moot because the new ground of rejection does not rely on any combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. The Examiner notes that Sun has been incorporated to better address the release of the touch triggering the input action.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GRANT SITTA whose telephone number is (571)270-1542. The examiner can normally be reached M-F 7:30-4:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Patrick Edouard can be reached at 571-272-6084. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GRANT SITTA/Primary Examiner, Art Unit 2622