DETAILED ACTION
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
2. Claims 1-15 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected species, there being no allowable generic or linking claim. Election of Group III, claims 16-20, was made without traverse in the reply filed on 6/03/2025.
Specification
3. The specification is objected to for the following:
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
4. Claim 16 is rejected under 35 U.S.C. 102(a)(1) as being anticipated by Nocon et al. (US 2022/0100280 A1).
As in Claim 16, Nocon teaches a gesture detection system, comprising (at least pars. 14, 66, 91-92, a gesture recognition system):
an inertial measurement unit (IMU) (Fig. 1, pars. 68-69, an inertial measurement unit (IMU) 108);
a processor coupled to the IMU, the processor including a gesture detection module coupled to the IMU (at least pars. 68-70, 73, a processor (e.g., 130) (or controller 106) is coupled to the IMU and includes modules or functionality to detect gestures (e.g., patterns of movements) based on motion data from the IMU; further see pars. 11-12, 14, 92);
a memory coupled to the gesture detection module, the memory configured to store a sequence of detected gesture steps and a plurality of gesture step templates (at least pars. 70-71, 73, 98, 116-122, a memory or storage device is coupled to gesture detection modules or sensors and stores gesture patterns or identifiers in a library or database for recognition; further see pars. 11-12); and
wherein the gesture detection module is configured to determine a first proximity difference between the sequence of detected gesture steps as compared to at least a first portion of the plurality of gesture step templates (pars. 112, 116-117, 120, and 101, the system or device can determine if the detected gesture matches a stored gesture (or a pattern of movements) by first classifying the received gesture signature as a gesture type or as unclassifiable. The processor then compares the classified gesture type against entries in the gesture library. If the gesture type corresponds to an entry in the library, it is considered a match; otherwise, it is considered not a match).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Nocon et al. (US 2022/0100280 A1) in view of Bulzacki, Adrian (US 2014/0198954 A1).
As in Claim 17, Nocon teaches all the limitations of Claim 16. Nocon does not teach that the sequence of detected gesture steps that minimizes the first proximity difference is registered as a gesture.
However, in the same field of the invention, Bulzacki teaches that the sequence of detected gesture steps that minimizes the first proximity difference is registered as a gesture (par. 49, when a newly detected gesture matches an existing gesture in the database, the system may update the existing gesture with new gesture data).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify the system for identifying the detected gestures by comparing them with the gesture library, as taught by Nocon, and to update the existing gesture in the database when the new gesture matches with it, as taught by Bulzacki. The motivation is to improve the quality and precision of the gesture data in the database over time, which helps the system better recognize the movement in the future.
6. Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Nocon et al. (US 2022/0100280 A1) in view of Bulzacki, Adrian (US 2014/0198954 A1) and further in view of Aubauer et al. (US 2012/0313882 A1).
As in Claim 18, Nocon-Bulzacki teaches all the limitations of Claim 17. Nocon-Bulzacki does not teach that a second portion of the plurality of gesture step templates are not selected for comparison depending upon an operational state of the gesture detection system.
However, in the same field of the invention, Aubauer teaches that a second portion of the plurality of gesture step templates are not selected for comparison depending upon an operational state of the gesture detection system (pars. 102, 159, the system speeds up gesture recognition by progressively narrowing down possible reference gestures after each segment comparison. If a segment doesn't match, related gestures are excluded, allowing the process to skip unnecessary comparisons).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify the system for identifying the detected gestures by comparing them with the gesture library, as taught by Nocon, in view of Bulzacki’s teachings, and to provide the way to skip unnecessary comparison, as taught by Aubauer. The motivation is to quickly recognize gestures, reducing unnecessary checks and comparison effort.
7. Claims 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Nocon et al. (US 2022/0100280 A1) in view of Lianides et al. (US 2022/0105389 A1).
As in Claim 19, Nocon teaches all the limitations of Claim 16. Nocon does not teach that the processor is further configured to determine a second proximity difference between a preselected gesture step template and a sequence of gesture steps implemented by a user of the gesture detection system.
However, in the same field of the invention, Lianides teaches that the processor is further configured to determine a second proximity difference between a preselected gesture step template and a sequence of gesture steps implemented by a user of the gesture detection system (at least pars. 44-47, the system can determine similarity scores between detected gestures and reference gestures in the gesture library, with lower scores indicating closer matches to the target poses; further see pars. 20, 30, 32-43).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify the system for identifying the detected gestures by comparing them with the gesture library, as taught by Nocon, and to provide the way to determine similarity scores, as taught by Lianides. The motivation is to enable precise analysis of gestures and to deliver effective feedback to users.
As in Claim 20, Nocon-Lianides teaches all the limitations of Claim 19. Nocon-Lianides further teaches that the processor is further configured to guide the user through feedback to minimize the second proximity difference (Lianides, at least pars. 29, 53, the system provides real-time feedback to guide the user in matching their poses to the target).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Rinna Yi whose telephone number is (571) 270-7752 and fax number is (571) 270-8752. The examiner can normally be reached on M-F 8:30am-5:00pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Fred Ehichioya can be reached on (571) 272-4034.
Information regarding the status of an application may be obtained from Patent Center and the Private Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from Patent Center or Private PAIR. Status information for unpublished applications is available through Patent Center or Private PAIR to authorized users only. Should you have questions about access to Patent Center or the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.
/RINNA YI/
Primary Examiner, Art Unit 2179