DETAILED ACTION
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2 are rejected under 35 U.S.C. 103 as being unpatentable over Sapozhnik et al., US 2022/0066565, in view of the docs.manus-meta.com webpage "Manus Prime X Series - Glove Attachments Explained," which includes the webpage and an embedded YouTube video published on August 24, 2021 (hereinafter MANUS), and Chen et al., US 10,551,916.
In Reference to Claim 1
Sapozhnik et al. teaches a tracking system (Abstract), comprising a tracking system hardware module essentially having a signal emitter (Fig. 2, 4 and Par. 86-87 "first magnetic field generator 241" and "second magnetic field generator 242" and Par. 63-65), multiple sensor modules and a system controller (Fig. 2, 4 and Par. 86-87 "plurality of first sensors 251 to 255" and "plurality of second sensors 256 to 260," first electronic device 401 and second electronic device 403; Par. 66-68, which teaches the sensors; Par. 33, which teaches that the electronic device is a processor), the sensor modules being mounted on a user's fingertips and a dorsal side of the user's hand (Fig. 5 and Par. 101-102 "Sensors 351 to 355 among the plurality of sensors 351 to 356 may be placed on the user's finger tips, and the sensor 356 may be placed on the user's back of hand."), wherein the signal emitter and the system controller are mounted on the dorsal side of the user's hand (Fig. 4-5 and Par. 86 "included in, or mounted on, a first glove worn on the user's left hand" and "included in, or mounted on, a second glove worn on the user's left hand") such that the signal emitter receives control signals from the system controller and generates known electromagnetic signals (Par. 64 and Par. 89-92), and, when the user's fingers move relative to the signal emitter, the sensor modules mounted on the fingertips measure the electromagnetic signals generated by the signal emitter and output measurement signals thus obtained (Par. 93-97 "may obtain the posture (e.g., at least one of the position or direction) for each of the plurality of first sensors 251 to 255, based on the second-first signal" and "may obtain the posture (e.g., at least one of the position or direction) for each of the plurality of second sensors 256 to 260, based on the second-fourth signal"); and a hand gesture estimation module, wherein the signal processing module receives the measurement signals from the sensor modules, performs signal processing on the measurement signals (Par. 93-97), and sends the processed measurement signals to the posture estimation module for calculating a posture of each of the sensor modules and posture data of each of the user's fingers relative to the signal emitter such that the posture data is used by the hand gesture estimation module to estimate a flexion state of each joint of each of the user's fingers and the fingers' posture relative to the hand (Par. 7 "a method for providing three-dimensional (3D) input, which may detect the movement of an object (e.g., the user's finger) in a 3D space and provide information about the detected object movement as input to another electronic device"; Par. 93-97 and Par. 139-140 "the processor 220 may obtain information having six degrees of freedom, such as (x, y, z, ψ, θ, φ). For example, the processor 220 may obtain the positions and directions between the magnetic field generator 230 and the sensor 240."; Par. 178, which teaches using the sensor position as input to an AR device; and Fig. 4-5 and Par. 7, which teach that the sensors are mounted on the fingertips of a glove and may be used to determine the position of the user's fingers).
However, although Sapozhnik et al. teaches a system integrated into a glove (Par. 86) and using the finger position information as input for another device (Par. 7 and 178), they do not explicitly teach a device integrating an existing positional tracking system and a docking mechanism, the docking mechanism being mounted on the dorsal side of the user's hand and adapted to dock with the existing positional tracking system; and a positional tracking software module forming the tracking system together with the tracking system hardware module and having a signal processing module connected to the sensor modules, a posture estimation module connected to the signal processing module, and a hand gesture estimation module connected to the posture estimation module, with the docking mechanism operating in conjunction with an existing third-party positional tracking system, a posture relation between the existing third-party positional tracking system and the tracking system is established with a known geometric design parameter of the docking mechanism, and an absolute posture of the user's fingers in actual space is calculated according to the posture relation.
MANUS teaches a device integrating an existing positional tracking system and a docking mechanism, the docking mechanism being mounted on the dorsal side of the user's hand and adapted to dock with the existing positional tracking system, with the docking mechanism operating in conjunction with an existing third-party positional tracking system (Page 1 "The Prime II and Prime X series gloves can be used with a large variety of different tracking system. To add hand and finger tracking to your personal tracking setup MANUS offers a set of adapters to connect trackers to the top of the gloves." Page 3, which shows an "HTC Vive tracker" docked to the top of a sensing glove and states "The HTC Vive tracker is now ready to be calibrated and used." See also pages 4-5, which show an "Oculus" controller being mounted to dock on the back of the sensing glove).
It would be desirable to modify the system of Sapozhnik et al. to include the docking mechanism and attachments to attach third-party tracking devices as taught by MANUS in order to allow a user to add hand and finger tracking to their personal body tracking setup as described by MANUS, such as for the AR application discussed in Par. 178 of Sapozhnik et al.
Chen et al. teaches a positional tracking software module forming the tracking system together with the tracking system hardware module and having a signal processing module connected to the sensor modules, a posture estimation module connected to the signal processing module, and a hand gesture estimation module connected to the posture estimation module, with the hand tracking mechanism operating in conjunction with a positional tracking system, a posture relation between the positional tracking system and the tracking system is established with a known geometric design parameter of the docking mechanism, and an absolute posture of the user's fingers in actual space is calculated according to the posture relation (Col. 12 lines 14-35 "In one or more embodiments, additional signals from one or more additional sensors, such as those described above in conjunction with FIG. 4, are obtained 540. For example, one or more additional sensors are inertial sensors (such as accelerometers and gyroscopes). An orientation of the wearable device (e.g., of a hand wearing the glove) is determined 550 using the additional signals in various embodiments. Locations of portions of the wearable device (e.g., locations of the fingers on a glove) are determined 560 by combining the position vectors and the orientation information." And "In one or more embodiments, the disclosed systems and methods for position sensing (e.g., sensing of fingertip positions) are used in conjunction with a virtual reality (VR) system. For example, the disclosed methods for detecting positions of fingers or other body parts are used to provide information about or to render a state of a body part (e.g., a hand) of a user contacting portions of the wearable device in a VR environment or VR world. For example, states of a hand (e.g., open, closed, pointing, gesturing, etc.) contacting the wearable device are determined based on the detected positions or locations of fingers or finger tips of the hand.").
It would be desirable to modify the system of Sapozhnik et al. to include the measurement of a user's hand or body position and to combine that measurement with the measurement of finger tracking as taught by Chen et al. in order to allow a user to add detailed finger tracking to a body tracking setup as described above by MANUS, such as for the AR application discussed in Par. 178 of Sapozhnik et al., thus allowing more detailed hand and body manipulation of a program such as the AR application of Sapozhnik et al.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Sapozhnik et al. to include the docking mechanism and attachments to attach third-party tracking devices as taught by MANUS, and to further modify the system of Sapozhnik et al. to include the measurement of a user's hand or body position and to combine that measurement with the measurement of finger tracking as taught by Chen et al.
In Reference to Claim 2
Sapozhnik et al., as modified by MANUS and Chen et al., teach wherein the docking mechanism is either a tracker docking mechanism or a handheld controller docking mechanism, depending on the existing positional tracking system (MANUS pages 3-5 "HTC Vive tracker" and "Oculus" controller).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CARL V LARSEN whose telephone number is (571)270-3219. The examiner can normally be reached Monday through Friday; 10:00 am - 6:30 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Dmitry Suhol, can be reached at (571) 272-4430. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CARL V LARSEN/ Examiner, Art Unit 3715