DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claim 28 is objected to because of the following informalities: there is an unnecessary comma between “forearm” and “of” in line 2. Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 16-30 are rejected under 35 U.S.C. 103 as being unpatentable over Chen et al., US Patent Publication 2022/0342489, in view of Islam, US Patent Publication 2024/0130621.
Regarding independent claim 16, Chen et al. teaches a sensor assembly to be attached to a forearm of a user (paragraph 0047 explains that the wearable device of detectors is worn on a forearm), comprising:
a first laser feedback interferometry sensor (paragraph 0036 explains that the sensor modalities can be laser interferometers);
a carrier structure (band 502 of figure 5 as given in paragraph 0047); and
wherein the carrier structure is configured to attach the first laser feedback interferometry sensor on the forearm of the user (paragraph 0047 describes the carrier structure as a wristband 502 of figure 5 that can be worn on any other part of the body including the forearm) such that the first laser feedback interferometry sensor is configured to detect a pose of a first tendon of a back of a hand or of the forearm of the user (paragraph 0047 explains the situation of figure 5 where the back of the hand is the focus as shown and the light is emitted in a way to detect the given part of the hand), wherein the unit is configured to determine a hand gesture of the user depending on the detected pose of the first tendon of the back of the hand or of the forearm of the user (paragraph 0049 explains how the hand pose is translated into a corresponding gesture),
wherein the first laser feedback interferometry sensor is configured to emit a light beam toward the first tendon, and configured to detect light reflected from the first tendon to determine the pose of the first tendon (paragraphs 0031-0036: paragraph 0032 explains that light is emitted and then reflected toward the user’s tissue, which paragraph 0035 explains includes the tendon, such that the pose of the tendon is detected to yield the user motion of paragraph 0031 using the laser interferometers of paragraph 0036).
Chen et al. does not specify the use of the output of the gesture detection using the first computing unit. Islam teaches the use of the output of the gesture detection using the first computing unit (paragraphs 0236 and 0270-0271 explain how the output of the measurements is interpreted with a computer or computing device to use the output). It would have been obvious to one of ordinary skill in the art before the effective filing date to include the computing device that uses the output of the sensors, as taught by Islam, in the system of Chen et al. The rationale to combine is that body language often communicates more information than verbal communication (paragraph 0602 of Islam).
Regarding claim 17, Chen et al. teaches the sensor assembly according to claim 16, wherein the first laser feedback interferometry sensor is configured to detect a first pose of the first tendon of the back of the hand or of the forearm of the user at a first point in time, and a second pose of the first tendon of the back of the hand or of the forearm of the user at a second point in time following the first point in time, wherein the first computing unit is configured to determine the hand gesture of the user depending on the detected first and second poses (paragraphs 0050-0053 explain how different poses from the different sensors are interpreted to determine a gesture made by the user involving movements).
Regarding claim 18, Chen et al. teaches the sensor assembly according to claim 16, wherein the carrier structure is configured to attach the first laser feedback interferometry sensor on the forearm of the user such that a first light beam emitted by the first laser feedback interferometry sensor is emitted toward the back of the hand or the forearm of the user (paragraph 0047 explains the situation of figure 5 where the back of the hand is the focus as shown and the light is emitted in a way to detect the given part of the hand).
Regarding claim 19, Chen et al. teaches the sensor assembly according to claim 16, wherein the carrier structure is configured as an armband (paragraph 0047 describes the carrier structure as a wristband 502 of figure 5 that can be worn on any other part of the body including the forearm, which would make it an armband).
Regarding claim 20, Chen et al. teaches the sensor assembly according to claim 19, wherein the first laser feedback interferometry sensor is integrated on an outer side of the armband (to allow interactions depicted in figures 2 and 8A and 8B and described in paragraphs 0032 and 0053).
Regarding claim 21, Chen et al. teaches the sensor assembly according to claim 19, wherein the sensor assembly further comprises a static deflection unit integrated into the armband and configured to deflect the first light beam emitted by the laser feedback interferometry sensor toward the back of the hand or the forearm of the user (paragraph 0032 describes the optical paths and elements depicted in figure 2 that allow the light to be directed towards the given portions of the user).
Regarding claim 22, Chen et al. teaches the sensor assembly according to claim 19, wherein the armband includes an optical window for outcoupling a first light beam generated by the first laser feedback interferometry sensor (paragraph 0032 describes the optical paths and elements depicted in figure 2 that allow the light to be directed towards the given portions of the user).
Regarding claim 23, Chen et al. teaches the sensor assembly according to claim 19, wherein the sensor assembly comprises a plurality of laser feedback interferometry sensors (paragraph 0036 explains the use of multiple laser interferometers), wherein: (i) at least one laser feedback interferometry sensor of the plurality of laser feedback interferometry sensors is assigned to each tendon of the back of the hand (paragraph 0032 describes the optical paths and elements depicted in figure 2 that allow the light to be directed towards the given portions of the user) and/or (ii) a further laser feedback interferometry sensor is assigned to at least one tendon of the forearm, wherein the carrier structure is configured to attach each of the laser feedback interferometry sensors on the forearm of the user (paragraph 0047 describes the carrier structure as a wristband 502 of figure 5 that can be worn on any other part of the body including the forearm) such that each laser feedback interferometry sensor is configured to detect a pose of a respectively assigned tendon of the back of the hand or of the forearm (paragraph 0035 explains how the tendon is measured during a user motion and paragraph 0045 explains how that is converted to determine a gesture).
Regarding claim 24, Islam further teaches the sensor assembly according to claim 16, wherein the sensor assembly further comprises a transmission unit (paragraphs 0236-0238 describe how data is transmitted from the measuring devices), wherein the transmission unit is configured to transmit the hand gesture of the user that was determined by using the first computing unit, to an optical system (paragraphs 0488 and 0539 explain how the measurements are taken to determine a hand gesture), which is configured to display image contents of a virtual or augmented reality to the user of the sensor assembly (paragraph 0539 explains the use in an augmented, virtual, or mixed reality device).
Regarding independent claim 25, Chen et al. teaches an optical system, the system comprising:
a receiving unit configured to receive a hand gesture of the user that was determined by a sensor assembly arranged on a forearm of the user (paragraph 0047 describes the carrier structure as a wristband 502 of figure 5 that can be worn on any other part of the body including the forearm), the sensor assembly including:
a first laser feedback interferometry sensor (paragraph 0036 explains that the sensor modalities can be laser interferometers),
a carrier structure (band 502 of figure 5 as given in paragraph 0047), and
wherein the carrier structure is configured to attach the first laser feedback interferometry sensor on the forearm of the user (paragraph 0047 describes the carrier structure as a wristband 502 of figure 5 that can be worn on any other part of the body including the forearm) such that the first laser feedback interferometry sensor is configured to detect a pose of a first tendon of a back of a hand or of the forearm of the user (paragraph 0047 explains the situation of figure 5 where the back of the hand is the focus as shown and the light is emitted in a way to detect the given part of the hand), wherein the unit is configured to determine a hand gesture of the user depending on the detected pose of the first tendon of the back of the hand or of the forearm of the user (paragraph 0049 explains how the hand pose is translated into a corresponding gesture),
wherein the first laser feedback interferometry sensor is configured to emit a light beam toward the first tendon, and configured to detect light reflected from the first tendon to determine the pose of the first tendon (paragraphs 0031-0036: paragraph 0032 explains that light is emitted and then reflected toward the user’s tissue, which paragraph 0035 explains includes the tendon, such that the pose of the tendon is detected to yield the user motion of paragraph 0031 using the laser interferometers of paragraph 0036).
Chen et al. does not specify the use of the output of the gesture detection as a system for displaying a virtual or augmented reality to a user of the system using the first computing unit; and a second computing unit configured to control an image content of the virtual or augmented reality depending on the received hand gesture of the user. Islam teaches the use of the output of the gesture detection as a system for displaying a virtual or augmented reality to a user of the system (paragraph 0539 explains the use in an augmented, virtual, or mixed reality device) using the first computing unit (paragraphs 0236 and 0270-0271 explain how the output of the measurements is interpreted with a computer or computing device to use the output); and a second computing unit configured to control an image content of the virtual or augmented reality depending on the received hand gesture of the user (paragraph 0476 explains that the purpose of the device is to control the avatars to be more realistic). It would have been obvious to one of ordinary skill in the art before the effective filing date to include the computing device that uses the output of the sensors, as taught by Islam, in the system of Chen et al. The rationale to combine is that body language often communicates more information than verbal communication (paragraph 0602 of Islam).
Regarding claim 26, Islam further teaches the optical system according to claim 25, wherein the second computing unit is configured to control a virtual avatar depending on the received hand gesture of the user (paragraph 0476 explains that the purpose of the device is to control the avatars to be more realistic).
Regarding claim 27, Islam further teaches the optical system according to claim 25, wherein the optical system is a pair of glasses (paragraphs 0475 and 0489 describe the use of smart glasses as the device).
Regarding independent claim 28, Chen et al. teaches a method for determining a hand gesture of a user using a sensor assembly arranged on a forearm of the user, the method comprises the following method steps:
detecting a pose of a first tendon of a back of a hand or of the forearm, of the user by a first laser feedback interferometry sensor of the sensor assembly (paragraph 0047 explains the situation of figure 5 where the back of the hand is the focus as shown and the light is emitted in a way to detect the given part of the hand); and determining the hand gesture of the user depending on the detected pose of the first tendon of the back of the hand or of the forearm of the user using a unit of the sensor assembly (paragraphs 0050-0053 explain how different poses from the different sensors are interpreted to determine a gesture made by the user involving movements and paragraph 0049 explains how the hand pose is translated into a corresponding gesture),
wherein the first laser feedback interferometry sensor is configured to emit a light beam toward the first tendon, and configured to detect light reflected from the first tendon to determine the pose of the first tendon (paragraphs 0031-0036: paragraph 0032 explains that light is emitted and then reflected toward the user’s tissue, which paragraph 0035 explains includes the tendon, such that the pose of the tendon is detected to yield the user motion of paragraph 0031 using the laser interferometers of paragraph 0036).
Chen et al. does not specify the use of the output of the gesture detection using the first computing unit. Islam teaches the use of the output of the gesture detection using the first computing unit (paragraphs 0236 and 0270-0271 explain how the output of the measurements is interpreted with a computer or computing device to use the output). It would have been obvious to one of ordinary skill in the art before the effective filing date to include the computing device that uses the output of the sensors, as taught by Islam, in the system of Chen et al. The rationale to combine is that body language often communicates more information than verbal communication (paragraph 0602 of Islam).
Regarding claim 29, Chen et al. teaches the method according to claim 28, further comprising the following steps:
detecting a first pose of a first tendon of the back of the hand or of the forearm of the user using the first laser feedback interferometry sensor of the sensor assembly at a first point in time; detecting a second pose of the first tendon of the back of the hand or of the forearm of the user using the first laser feedback interferometry sensor of the sensor assembly at a second point in time following the first point in time; and determining the hand gesture of the user depending on the detected first and second poses of the back of the hand or of the forearm of the user using the first computing unit of the sensor assembly (paragraphs 0050-0053 explain how different poses from the different sensors are interpreted to determine a gesture made by the user involving movements).
Regarding claim 30, Islam further teaches the method according to claim 28, further comprising the following step: transmitting the determined hand gesture (paragraphs 0488 and 0539 explain how the measurements are taken to determine a hand gesture) using a transmission unit (paragraphs 0236-0238 describe how data is transmitted from the measuring devices) of the sensor assembly to an optical system in order to display a virtual or augmented reality to the user of the optical system (paragraph 0539 explains the use in an augmented, virtual, or mixed reality device).
Response to Arguments
Applicant's arguments filed 2/18/26 have been fully considered but they are not persuasive. Applicant contends that the prior art does not teach the amended features, rendering the claims allowable. The examiner disagrees. These features were not previously presented and are now addressed in the rejections above. Chen teaches in paragraphs 0031-0036 that light is emitted and then reflected (paragraph 0032) toward the user’s tissue, which includes the tendon (paragraph 0035), such that the pose of the tendon is detected to yield the user motion (paragraph 0031) using laser interferometers (paragraph 0036).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. The closest prior art is made of record in the attached notice of references cited.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PARUL H GUPTA whose telephone number is (571)272-5260. The examiner can normally be reached Monday through Friday, from 10 AM to 7 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ke Xiao can be reached at 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PARUL H GUPTA/Primary Examiner, Art Unit 2627