DETAILED ACTION
This action is responsive to the application filed 3/21/2024.
Claims 1-20 are pending. Claims 1, 3-11, 13-16 and 18-20 have been amended by preliminary amendment, and Claims 21-40 are canceled.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
Claims 19 and 20 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.
Each of Claims 19 and 20 recites the limitation of determining a distance via “the one or more sensors.” There is insufficient antecedent basis for this limitation, as neither claim previously recites one or more sensors. Therefore, these claims are indefinite, and appropriate correction is required.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 2, 4, 5, 10, 16-18 and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kin, et al., U.S. PGPUB No. 2019/0212827 (“Kin”).
With regard to Claim 1, Kin teaches a method comprising:
at a device comprising one or more processors, non-transitory memory, and one or more sensors ([0136] describes storage on non-transitory media; [0057] describes an application executed by a processor. [0058] describes tracking the device and user hands using information from sensors):
detecting a gesture being performed via a first object in association with a second object in a graphical environment ([0069] describes that an electronic display in a near eye device can include a virtual recreation of a user’s hands, as well as other objects. [0074] describes that the system can recognize gestures of the user’s hand as well as a recognized object to which the gesture was related);
determining, via the one or more sensors, a distance between a representation of the first object and the second object ([0092] describes that the distance between a gaze terminating at a particular object and the user’s hand can be determined);
on a condition that the distance is greater than a threshold, displaying a change in the graphical environment according to the gesture and a determined gaze ([0118]-[0119] describe an example in which a pinch gesture is detected when an object is between the user’s fingers; upon detecting that the pinching hand has moved past a threshold distance, an informational display is shown relating to the object which is the terminus of the user’s gaze); and
on a condition that the distance is not greater than the threshold, displaying the change in the graphical environment according to the gesture and a projection of the representation of the first object on the second object ([0094]-[0095] describe that when the hand is within the threshold distance, such as when an object is identified between the user’s fingers, and a gesture is input, a change to the object in the environment is carried out, such as changing a switch indicator and allowing a user to modify the switch position in the graphical environment).
Claim 20 recites a non-transitory memory corresponding to the method of Claim 1, and the claim is similarly rejected.
With regard to Claim 2, Kin teaches that the first object comprises an extremity. [0069] describes a representation of a user’s hand, and [0074] describes recognizing gestures of a user’s hand.
With regard to Claim 4, Kin teaches that the representation of the first object comprises an image of the first object. [0069] describes that the virtual environment includes a virtual recreation using computer graphics of the user’s hands.
With regard to Claim 5, Kin teaches that the first object is a physical object and the second object is a virtual object. [0069] describes that the representation of the user’s hands is in the environment along with virtual objects.
With regard to Claim 10, Kin teaches that the change in the graphical environment comprises a manipulation of the second object. [0094]-[0095] describe an example input through which a user gesture is used to manipulate a switch.
With regard to Claim 16, Kin teaches, on a condition that the distance is greater than the threshold, displaying the change in the graphical environment according to a gaze vector based on a gaze and an offset determined based on a position of the first object. [0118]-[0119] describe an example in which a pinch gesture is detected when an object is between the user’s fingers; upon detecting that the pinching hand has moved past a threshold distance, an informational display is shown relating to the object which is the terminus of the user’s gaze.
With regard to Claim 17, Kin teaches displaying the change in the graphical environment at a location corresponding to an end portion of the first object. Fig. 9B shows that the change to the environment is displayed at a location where the fingers of the user’s hand are located.
With regard to Claim 18, Kin teaches that the device comprises a head-mountable device (HMD). [0027] describes that the artificial reality system may be implemented on a head-mounted device.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 3, 6, 15 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Kin, in view of Poulos, et al., U.S. PGPUB No. 2017/0363867 (“Poulos”).
With regard to Claim 3, Poulos teaches that the first object comprises an input device. [0025] describes a head mounted device which models a physical space. [0028] describes that the location of a control device is also tracked. [0039] describes that a control device such as a baton can be represented in the display, altered based on proximity to various objects and used to interact therewith.
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Poulos with Kin. One of skill in the art would have sought the combination, to improve user experience by providing additional interactive features in a virtual environment, and by allowing dynamic response to user input in the environment.
With regard to Claim 6, Poulos teaches on a condition that the distance is not greater than the threshold, displaying a virtual writing instrument. [0047] describes that the system can determine that the control device is within a predetermined proximity to a holographic object, and modify the representation of the control device. [0061] describes that the device can be changed to writing instruments such as a paintbrush, pencil, or pen.
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Poulos with Kin. One of skill in the art would have sought the combination, to improve user experience by providing additional interactive features in a virtual environment, and by allowing dynamic response to user input in the environment.
With regard to Claim 15, Poulos teaches selecting a brush stroke type based on the distance between the representation of the first object and the second object. [0065] describes that a user can have a handheld baton, which is represented graphically in a virtual space. As a user varies the distance between the baton and a virtual painting surface object, the brush stroke can be changed in width, shade, density, or other visual characteristic.
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Poulos with Kin. One of skill in the art would have sought the combination, to improve user experience by providing additional interactive features in a virtual environment, and by allowing dynamic response to user input in the environment.
With regard to Claim 19, Kin teaches a device comprising:
one or more processors; a non-transitory memory; a display; an input device; and one or more programs stored in the non-transitory memory, which, when executed by the one or more processors ([0136] describes storage on non-transitory media; [0055] describes input devices, and [0057] describes an application executed by a processor. [0058] describes tracking the device and user hands using information from sensors. Fig. 3 shows a near-eye display), cause the device to:
detect a gesture being performed via a first object in association with a second object in a graphical environment ([0069] describes that an electronic display in a near eye device can include a virtual recreation of a user’s hands, as well as other objects. [0074] describes that the system can recognize gestures of the user’s hand as well as a recognized object to which the gesture was related);
determine, via the one or more sensors, a distance between a representation of the first object and the second object ([0092] describes that the distance between a gaze terminating at a particular object and the user’s hand can be determined);
on a condition that the distance is greater than a threshold, display a change in the graphical environment according to the gesture and a determined gaze ([0118]-[0119] describe an example in which a pinch gesture is detected when an object is between the user’s fingers; upon detecting that the pinching hand has moved past a threshold distance, an informational display is shown relating to the object which is the terminus of the user’s gaze); and
on a condition that the distance is not greater than the threshold, display the change in the graphical environment according to the gesture and a projection of the representation of the first object on the second object ([0094]-[0095] describe that when the hand is within the threshold distance, such as when an object is identified between the user’s fingers, and a gesture is input, a change to the object in the environment is carried out, such as changing a switch indicator and allowing a user to modify the switch position in the graphical environment).
Kin does not teach an audio sensor. Poulos teaches at [0021] that the head-mounted device has sensors which include a microphone. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Poulos with Kin. One of skill in the art would have sought the combination, to improve user experience by providing additional interactive features in a virtual environment, and by allowing dynamic response to user input in the environment.
Claims 7-9 are rejected under 35 U.S.C. 103 as being unpatentable over Kin, in view of Gebbie, et al., U.S. PGPUB No. 2019/0130656 (“Gebbie”).
With regard to Claim 7, Gebbie teaches that the change in the graphical environment comprises a creation of an annotation associated with the second object. [0027] describes that a user can annotate an object by moving a tool within a threshold of the object to initiate the annotation. [0023] describes that the tool can be a finger, handheld controller, gaze, etc. [0004] describes interaction with virtual objects in a virtual environment, and [0006] describes that the tool is a virtual tool.
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Gebbie with Kin. One of skill in the art would have sought the combination, to improve user experience and system utility by enabling additional collaboration tools such as annotations, thereby providing additional use cases and functionality to the virtual environment.
With regard to Claim 8, Gebbie teaches that the change in the graphical environment comprises a modification of an annotation associated with the second object. [0052] describes that a user can type content for an annotation, thereby modifying the annotation text as the content is written.
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Gebbie with Kin. One of skill in the art would have sought the combination, to improve user experience and system utility by enabling additional collaboration tools such as annotations, thereby providing additional use cases and functionality to the virtual environment.
With regard to Claim 9, Gebbie teaches that the change in the graphical environment comprises a removal of an annotation associated with the second object. [0028] describes that users are able to undo any attachment or drawing on a virtual object.
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Gebbie with Kin. One of skill in the art would have sought the combination, to improve user experience and system utility by enabling additional collaboration tools such as annotations, thereby providing additional use cases and functionality to the virtual environment.
Claims 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Kin, in view of Duheille, U.S. PGPUB No. 2014/0055385 (“Duheille”).
With regard to Claim 11, Duheille teaches applying a scale factor to the gesture. [0098] describes that gestures can scale with dimensions of objects displayed to a user.
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Duheille with Kin. [0004] of Duheille describes that gesture-based interfaces are not necessarily intuitive to users. Therefore, one of skill in the art would have sought to combine the gesture scaling described in Duheille with Kin, to improve user experience by providing a more intuitive interface experience.
With regard to Claim 12, Kin, in view of Duheille, teaches selecting the scale factor based on the distance between the representation of the first object and the second object. Duheille teaches at [0099] that a gesture can be scaled with the distance of the user’s hand from the object being manipulated using the gesture. Kin teaches the use of a representation of a hand and a second object at [0069].
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Duheille with Kin. [0004] of Duheille describes that gesture-based interfaces are not necessarily intuitive to users. Therefore, one of skill in the art would have sought to combine the gesture scaling described in Duheille with Kin, to improve user experience by providing a more intuitive interface experience.
With regard to Claim 13, Duheille teaches selecting the scale factor based on a size of the second object. [0098] describes that gestures can scale with the dimensions of a displayed object.
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Duheille with Kin. [0004] of Duheille describes that gesture-based interfaces are not necessarily intuitive to users. Therefore, one of skill in the art would have sought to combine the gesture scaling described in Duheille with Kin, to improve user experience by providing a more intuitive interface experience.
With regard to Claim 14, Duheille teaches selecting the scale factor based on an input. [0098] describes that the gesture scale can be determined for specific objects, thereby selecting a scale factor based on which object a user provides input to.
It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Duheille with Kin. [0004] of Duheille describes that gesture-based interfaces are not necessarily intuitive to users. Therefore, one of skill in the art would have sought to combine the gesture scaling described in Duheille with Kin, to improve user experience by providing a more intuitive interface experience.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KEITH D BLOOMQUIST whose telephone number is (571)270-7718. The examiner can normally be reached M-F, 8:30 AM-5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kieu Vu can be reached at 571-272-4057. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KEITH D BLOOMQUIST/Primary Examiner, Art Unit 2171
1/22/2026