Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 4, 9, 12, 16, and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Lacey (2021/0263593), hereinafter Lacey, in view of Doganis (2021/0200322), hereinafter Doganis, further in view of Kojima (2026/0037071), hereinafter Kojima.
In regards to claim 1, Lacey teaches a method comprising (abstract):
obtaining sensor data of a hand performing a pinch [0098, 0113-0115, 0117] (fig. 3, 302-314), wherein the hand performs a pinch in the hand tracking data [0010, 0029, 0034, 0041-0044, 0075-0088, 0100-0108] (figs. 13-15, hand and pinch).
Lacey fails to teach determining location information from one or more hand joints and one or more finger joints.
However, Doganis teaches determining location information from one or more hand joints and one or more finger joints.
[0092] Using the palm and/or the back of the hand to allow the users to provide an orientation to the system makes the method more reliable. Indeed the relative planarity of these areas is slightly determined by the gestures of the hand. Furthermore, these areas are the largest surfaces of the hand, which increases the reliability of the determination of their orientation. The position of the palm and/or the back of the hand may be for example located at the wrist joint 160 or the centroid of the triangle formed by the wrist joint 160, the position of the metacarpophalangeal joint 170 of index finger and the position of the metacarpophalangeal joint 180 of little finger.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey to further include determining location information from one or more hand joints and one or more finger joints as taught by Doganis in order to use the palm and/or the back of the hand to allow the users to provide an orientation to the system, making the method more reliable [0092].
Lacey and Doganis fail to teach applying an offset comprising a predefined distance from a hand location.
However, Kojima teaches applying an offset comprising a predefined distance from a hand location (figs. 9c and 9d, midpoint; fig. 11, fingertip position) [0077-0078, 0087].
[0077] The target tip detector 611 searches the image of the hand or finger acquired from the camera 10 in a search axis direction, detects a fingertip corresponding to an uppermost stream side in the search axis when a plurality of fingertips is present on the image, and outputs a detected fingertip end position (coordinates on the image) to the fingertip region both-ends detector 613. In an example of FIG. 11, there are two fingertips of the thumb and the index finger, and the fingertip of the index finger, among the thumb and the index finger, on the uppermost stream side in the search axis is detected as the fingertip end.
[0078] The offset calculator 612 calculates an offset vector (a vector indicated as “offset” of FIG. 11) by multiplying a unit vector in the search axis direction by a predetermined coefficient based on information in the search axis direction used by the target tip detector 611.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey and Doganis to further include applying an offset comprising a predefined distance from a hand location as taught by Kojima in order to make selection easier when it is placed between the thumb and finger [0048] and to offset the position to provide a more natural feeling to the user [0087].
Therefore, Lacey in view of Doganis and Kojima teaches:
determining a hand position and a hand orientation for the hand from the sensor data [0098-0099] Lacey based on a location of one or more hand joints;
applying an offset comprising a predefined distance from a hand location corresponding to the hand position and in a direction away from the hand location (figs. 9c and 9d, midpoint; fig. 11, fingertip position) [0077-0078, 0087] Kojima; [0176] Lacey, based on the hand orientation to determine a pinch centroid [0175-0176] Lacey; and
Lacey [0176] If both the index finger tip keypoint and the thumb tip keypoint are unavailable, interaction point 1202 is moved to the midpoint between the index finger PIP keypoint and the thumb IP keypoint (e.g., the β location described above in reference to FIG. 6C). If the index finger PIP keypoint is unavailable (e.g., occluded or below a critical confidence level), interaction point 1202 is moved to the midpoint between the index finger MCP keypoint and the thumb IP keypoint. If the thumb finger IP keypoint is unavailable (e.g., occluded or below a critical confidence level), interaction point 1202 is moved to the midpoint between the index finger PIP keypoint and the thumb MCP keypoint. If both the index finger PIP keypoint and the thumb IP keypoint are unavailable, interaction point 1202 is moved to the midpoint between the index finger MCP keypoint and the thumb MCP keypoint (e.g., the γ location described above in reference to FIG. 6C).
causing movement of a user interface component based on a location of the pinch centroid [0171-0178] (fig. 7c, ray and pinch; fig. 12, 1202) [0097-0098, 0105] Lacey.
In regards to claim 9, Lacey teaches a non-transitory computer readable medium comprising computer readable code executable by one or more processors to (abstract; fig. 2, 250): obtain sensor data of a hand performing a pinch [0098, 0113-0115, 0117] (fig. 3, 302-314), [0010, 0029, 0034, 0041-0044, 0075-0088, 0100-0108] (figs. 13-15, hand and pinch);
Lacey fails to teach determining location information from one or more hand joints and one or more finger joints.
However, Doganis teaches determining location information from one or more hand joints and one or more finger joints (fig. 19, hand joints, and [0092]) Doganis.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey to further include determining location information from one or more hand joints and one or more finger joints as taught by Doganis in order to use the palm and/or the back of the hand to allow the users to provide an orientation to the system, making the method more reliable [0092].
Lacey and Doganis fail to teach applying an offset comprising a predefined distance from a hand location.
However, Kojima teaches applying an offset comprising a predefined distance from a hand location (figs. 9c and 9d, midpoint; fig. 11, fingertip position) [0077-0078, 0087].
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey and Doganis to further include applying an offset comprising a predefined distance from a hand location as taught by Kojima in order to make selection easier when it is placed between the thumb and finger [0048] and to offset the position to provide a more natural feeling to the user [0087].
Therefore, Lacey in view of Doganis and Kojima teaches: determine a hand position and a hand orientation for the hand from the sensor data [0098-0099] Lacey based on a location of one or more hand joints; [0176] Lacey (fig. 19, hand joints, and [0092]) Doganis;
apply an offset comprising a predefined distance from a hand location corresponding to the hand position and in a direction away from the hand location [0176] Lacey, based on the hand orientation to determine a pinch centroid [0175-0176] Lacey, and (figs. 9c and 9d, midpoint; fig. 11, fingertip position) [0077-0078, 0087] Kojima; and
cause movement of a user interface component based on a location of the pinch centroid [0171-0178] (fig. 7c, ray and pinch; fig. 12, 1202; fig. 3, 314) [0097-0098, 0105] Lacey.
In regards to claim 16, Lacey teaches a system comprising: one or more processors (abstract; fig. 2, 250); and one or more computer readable media comprising computer readable code executable by the one or more processors to:
obtain sensor data of a hand performing a pinch [0098, 0113-0115, 0117] (fig. 3, 302-314), [0010, 0029, 0034, 0041-0044, 0075-0088, 0100-0108] (figs. 13-15, hand and pinch);
Lacey fails to teach determining location information from one or more hand joints and one or more finger joints.
However, Doganis teaches determining location information from one or more hand joints and one or more finger joints (fig. 19, hand joints, and [0092]) Doganis.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey to further include determining location information from one or more hand joints and one or more finger joints as taught by Doganis in order to use the palm and/or the back of the hand to allow the users to provide an orientation to the system, making the method more reliable [0092].
Lacey and Doganis fail to teach applying an offset comprising a predefined distance from a hand location.
However, Kojima teaches applying an offset comprising a predefined distance from a hand location (figs. 9c and 9d, midpoint; fig. 11, fingertip position) [0077-0078, 0087].
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey and Doganis to further include applying an offset comprising a predefined distance from a hand location as taught by Kojima in order to make selection easier when it is placed between the thumb and finger [0048] and to offset the position to provide a more natural feeling to the user [0087].
Therefore, Lacey in view of Doganis and Kojima teaches:
determine a hand position and a hand orientation for the hand from the sensor data based on a location of one or more hand joints [0098-0099] Lacey (fig. 19, hand joints, and [0092]) Doganis;
apply an offset comprising a predefined distance from a hand location corresponding to the hand position and in a direction away from the hand location [0176] Lacey, based on the hand orientation to determine a pinch centroid [0175-0176] Lacey, and (figs. 9c and 9d, midpoint; fig. 11, fingertip position) [0077-0078, 0087] Kojima; and
cause movement of a user interface component based on a location of the pinch centroid [0171-0178] (fig. 7c, ray and pinch; fig. 12, 1202; fig. 3, 314) [0097-0098, 0105] Lacey.
In regards to claim 4, Lacey in view of Doganis and Kojima teaches the method of claim 1, wherein the hand joints comprise an index knuckle, the hand position is based on an index knuckle location, and wherein the offset is applied to the index knuckle location [0152-0160] (fig. 6c, IM) Lacey, and "the position of the metacarpophalangeal joint 170 of index finger" [0092] Doganis.
In regards to claim 12, Lacey in view of Doganis and Kojima teaches the non-transitory computer readable medium of claim 9, wherein the hand position is based on an index knuckle location, and wherein the offset is applied to the index knuckle location [0152-0160] (fig. 6c, IM) Lacey.
In regards to claim 19, Lacey in view of Doganis and Kojima teaches the system of claim 16, wherein the hand position is based on an index knuckle location, and wherein the offset is applied to the index knuckle location [0152-0160] (fig. 6c, IM) Lacey.
Claim(s) 2, 10, and 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Lacey, Doganis, and Kojima in view of Calabrese et al. (2018/0143693), hereinafter Calabrese.
In regards to claim 2, Lacey, Doganis, and Kojima fail to teach the method of claim 1, wherein the movement of the user interface component is further determined based on a deadband applied about the pinch centroid.
However, Calabrese teaches wherein the movement of the user interface component is further determined based on a deadband applied about the pinch centroid [0045-0048] (fig. 4, 404).
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey, Doganis, and Kojima to further include wherein the movement of the user interface component is further determined based on a deadband applied about the pinch centroid as taught by Calabrese in order to further disambiguate a user's intended gestures from unintended or negligible movements of the input objects [0048].
In regards to claim 10, Lacey, Doganis, and Kojima in view of Calabrese (see rationale of claim 2) teaches the non-transitory computer readable medium of claim 9, wherein the movement of the user interface component is further determined based on a deadband applied about the pinch centroid [0045-0048] (fig. 4, 404) Calabrese.
In regards to claim 17, Lacey, Doganis, and Kojima in view of Calabrese (see rationale of claim 2) teaches the system of claim 16, wherein the movement of the user interface component is further determined based on a deadband applied about the pinch centroid [0045-0048] (fig. 4, 404) Calabrese.
Claim(s) 3, 11, and 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Lacey, Doganis, and Kojima in view of Nelson et al. (2019/0126140), hereinafter Nelson.
In regards to claim 3, Lacey, Doganis, and Kojima teach the method of claim 1, wherein the sensor data is associated with a user motion, and wherein causing movement of a user interface component comprises [0010, 0029, 0034, 0041-0044, 0075-0088, 0100-0108] (figs. 13-15, hand and pinch) Lacey;
Lacey, Doganis, and Kojima fail to teach determining an input motion based on the sensor data; applying a sensitivity scaling to the input motion based on one or more characteristics of the input motion; and adapting a gain for an output based on the input motion and the sensitivity scaling.
However, Nelson teaches determining an input motion based on the sensor data; applying a sensitivity scaling to the input motion based on one or more characteristics of the input motion; and adapting a gain for an output based on the input motion and the sensitivity scaling [0067-0087] Nelson.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey, Doganis, and Kojima to further include determining an input motion based on the sensor data; applying a sensitivity scaling to the input motion based on one or more characteristics of the input motion; and adapting a gain for an output based on the input motion and the sensitivity scaling as taught by Nelson in order to adjust the sensitivity of an input device [0003].
In regards to claim 11, Lacey, Doganis, and Kojima in view of Nelson (see rationale of claim 3) teaches the non-transitory computer readable medium of claim 9, wherein the sensor data is associated with a user motion, and wherein causing movement of a user interface component [0010, 0029, 0034, 0041-0044, 0075-0088, 0100-0108] (figs. 13-15, hand and pinch) Lacey comprises computer readable code to: determine an input motion based on the sensor data; apply a sensitivity scaling to the input motion based on one or more characteristics of the input motion; and adapt a gain for an output based on the input motion and the sensitivity scaling [0067-0087] Nelson.
In regards to claim 18, Lacey, Doganis, and Kojima in view of Nelson (see rationale of claim 3) teaches the system of claim 16, wherein the sensor data is associated with a user motion, and wherein causing movement of a user interface component [0010, 0029, 0034, 0041-0044, 0075-0088, 0100-0108] (figs. 13-15, hand and pinch) Lacey comprises: determine an input motion based on the sensor data; apply a sensitivity scaling to the input motion based on one or more characteristics of the input motion; and adapt a gain for an output based on the input motion and the sensitivity scaling [0067-0087] Nelson.
Claim(s) 5-8, 13-15, and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Lacey, Doganis, and Kojima in view of Calabrese et al. (2018/0143693), hereinafter Calabrese, further in view of Dao et al. (2023/0031200), hereinafter Dao.
In regards to claim 5, Lacey, Doganis, Kojima, and Calabrese fail to teach the method of claim 1, further comprising: in response to detecting that a location of the pinch centroid remains within a predefined dead zone during a duration of the pinch, processing the pinch as a tap gesture.
However, Dao teaches further comprising: in response to detecting that a location of the pinch centroid remains within a predefined dead zone during a duration of the pinch, processing the pinch as a tap gesture [0017, 0035, 0039] (figs. 11-12, pinch). Examiner notes Dao's drag discussion to emulate a mouse click [0035] within a distance and speed.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey, Doganis, Kojima, and Calabrese to further include, in response to detecting that a location of the pinch centroid remains within a predefined dead zone during a duration of the pinch, processing the pinch as a tap gesture as taught by Dao in order to implement movements similar to a mouse that the user is familiar with (abstract; [0004-0006]).
In regards to claim 6, Lacey, Doganis, Kojima, and Calabrese fail to teach the method of claim 1, further comprising: in response to detecting that a location of the pinch centroid exits a predefined dead zone during a duration of the pinch, processing the pinch as a drag gesture.
However, Dao teaches further comprising: in response to detecting that a location of a pinch centroid exits a predefined dead zone during a duration of the pinch, processing the pinch as a drag gesture [0017, 0039] (figs. 11-12, pinch and drag outside of threshold distance).
It would have been obvious to one of ordinary skill in the art to modify the teachings of Lacey, Doganis, Kojima, and Calabrese to further include, in response to detecting that a location of the pinch centroid exits a predefined dead zone during a duration of the pinch, processing the pinch as a drag gesture as taught by Dao in order to implement movements similar to a mouse that the user is familiar with (abstract; [0004-0006]).
In regards to claim 7, Lacey, Doganis, Kojima, and Calabrese in view of Dao teaches the method of claim 6, wherein the hand tracking data is associated with a user motion, and wherein a size of the dead zone is adapted in accordance with the user motion. Dao [0017]: "FIG. 7 depicts an example of a drag gesture detection, consisting of a pinch followed by a change in hand location, followed by a release (release of pinched fingers). The change in distance between the thumb and index finger, meeting a set of criteria (factors such as change in distance and speed)." Examiner notes that if both the speed and distance criteria are not met, the zone will be adaptive until then.
In regards to claim 8, Lacey, Doganis, Kojima, and Calabrese in view of Dao teaches the method of claim 7, further comprising: in accordance with a determination that the user motion has ceased, reducing a size of the dead zone [0017] Dao. Examiner notes that once motion has ceased, the distance and speed criteria will still be the same and the zone/threshold will de facto be reset.
In regards to claim 13, Lacey, Doganis, Kojima, and Calabrese in view of Dao (see rationale of claim 5) teaches the non-transitory computer readable medium of claim 9, further comprising computer readable code to: in response to detecting that a location of the pinch centroid remains within a predefined dead zone during a duration of the pinch, process the pinch as a tap gesture; and in response to detecting that the location of the pinch centroid exits the predefined dead zone during the duration of the pinch, process the pinch as a drag gesture [0017, 0035, 0039] (figs. 11-12, pinch). Examiner notes Dao's drag discussion to emulate a mouse click [0035] within a distance and speed.
In regards to claim 14, Lacey, Doganis, Kojima, and Calabrese in view of Dao teaches the non-transitory computer readable medium of claim 13, wherein the hand tracking data is associated with a user motion, and wherein a size of the predefined dead zone is adapted in accordance with the user motion. Dao [0017]: "FIG. 7 depicts an example of a drag gesture detection, consisting of a pinch followed by a change in hand location, followed by a release (release of pinched fingers). The change in distance between the thumb and index finger, meeting a set of criteria (factors such as change in distance and speed)." Examiner notes that if both the speed and distance criteria are not met, the zone will be adaptive until then.
In regards to claim 15, Lacey, Doganis, Kojima, and Calabrese in view of Dao teaches the non-transitory computer readable medium of claim 14, further comprising computer readable code to: in accordance with a determination that the user motion has ceased, reduce a size of the predefined dead zone [0017, 0035, 0039] (figs. 11-12, pinch). Examiner notes Dao's drag discussion to emulate a mouse click [0035] within a distance and speed.
In regards to claim 20, Lacey, Doganis, Kojima, and Calabrese in view of Dao (see rationale of claim 5) teaches the system of claim 19, further comprising computer readable code to: in response to detecting that a location of the pinch centroid remains within a predefined dead zone during a duration of the pinch, process the pinch as a tap gesture; and in response to detecting that the location of the pinch centroid exits the predefined dead zone during the duration of the pinch, process the pinch as a drag gesture [0017, 0035, 0039] (figs. 11-12, pinch). Examiner notes Dao's drag discussion to emulate a mouse click [0035] within a distance and speed.
Response to Arguments
Applicant’s arguments, see Remarks, filed 12/19/2025, with respect to the rejection(s) of claim(s) 1, 4, 9, 12, 16, and 19 under Lacey and Doganis have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Lacey and Doganis and Kojima.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GRANT SITTA whose telephone number is (571)270-1542. The examiner can normally be reached M-F 7:30-4:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Patrick Edouard can be reached at 571-272-6084. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GRANT SITTA/Primary Examiner, Art Unit 2622