DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on June 6, 2025 and August 9, 2025 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Claim Objections
Claims 12, 18, and 19 are objected to because of the following informalities:
Claims 12, 18, and 19 recite “ML instructions” and “AI model”. “ML” should be written out as “Machine Learning” and “AI” as “Artificial Intelligence”.
Claim 12 recites “MR headset”; “MR” should be written out as “Mixed Reality”.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-4, 6-7, 12-13, and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over Berger et al. U.S. Patent Application Publication 20230070631 A1 (hereinafter Berger) in view of Maizels et al. U.S. Patent Application Publication 20250279100 A1 (hereinafter Maizels).
Regarding claim 1, Berger teaches A device (Electronic Device, Para. 0016), comprising:
a (AR/VR System/Application, Para. 0063) including:
a processor (Processors 1102, Para. 0144) configured to execute machine-learning (ML) instructions (Machine Learning Technique Module 512 Para. 0117 or Trained Machine Learning techniques 307 Para. 0092);
a memory (Memory 1104, Para. 0145) configured to store a first set of data (Data Stored on Electronic Device Para. 0016 or Client Device 102, Para. 0019);
and a communications module (Cloud Computing) configured to access a cloud storage (Network/Server/Database, Para. 0169 and 0172) including a second set of data (Data Stored on Network/Server/Database, Para. 0169 and 0172),
wherein the ML instructions (Machine Learning Technique Module 512, Para. 0117, or Trained Machine Learning Techniques 307, Para. 0092) are configured to train an artificial-intelligence (AI) model to infer facial expressions (Para. 0036 and 0094) based on at least one of the first set of data (Data Stored on Electronic Device, Para. 0016, or Client Device 102, Para. 0019) or the second set of data (Data Stored on Network/Server/Database, Para. 0169 and 0172). Segmented Training Images 309, which are used to train the Machine Learning Techniques 307, are stored in databases (Para. 0110), which can be local storage (first set of data) or remote storage (second set of data).
However, Berger fails to teach a mixed-reality (MR) headset.
Berger and Maizels are analogous to the claimed invention because both of them are in the same field of utilizing AR/VR to track a user’s facial expression and body gestures during an activity using machine learning.
Maizels teaches a mixed-reality (MR) headset (AR/VR/MR Headsets, Para. 0161) that tracks facial expressions (Para. 0247) and body movements (Para. 0521) using machine learning (Para. 0147). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s AR/VR application to utilize Maizels’ MR headset, since doing so would provide the benefit of utilizing the sensors available to extended-reality appliances to increase the efficiency and quality of the data collected (Maizels et al., Para. 0161; Berger et al., Para. 0014).
Regarding claim 2, Berger teaches the device of claim 1, wherein the first set of data (Data Stored on Electronic Device, Para. 0016, or Client Device 102, Para. 0019) and the second set of data (Data Stored on Network/Server/Database, Para. 0169 and 0172) comprise images (Images of the Body of the User, Para. 0016 and 0063) or video clips (Videos of the Body of the User, Para. 0016) of body poses (Position of the Body of the User in the Images/Videos).
Regarding claim 3, Berger teaches the device of claim 2, wherein the body poses (Position of the Body of the User in the Images/Videos) are provided by AI-powered body scanning (3D Body Tracking Module 513, Para. 0110, or Skeletal Key-Points Module 511, Para. 0014).
Regarding claim 4, Berger teaches the device of claim 2, wherein the body poses (Position of the Body of the User in the Images/Videos) comprise body motions in at least one of a social activity (Group of Friends Taking Photos/Videos for Social Media, Para. 0028, 0053, and 0087) or a physical activity including a sports activity or a fitness activity.
Regarding claim 6, Berger teaches the device of claim 1, wherein the first set of data (Data Stored on Electronic Device, Para. 0016, or Client Device 102, Para. 0019) or the second set of data (Data Stored on Network/Server/Database, Para. 0169 and 0172) further comprise audio (Para. 0046 and 0048) including environment sounds (Background Noise, Para. 0148), music, or voice.
Regarding claim 7, Berger teaches the device of claim 1, wherein the first set of data (Data Stored on Electronic Device, Para. 0016, or Client Device 102, Para. 0019) or the second set of data (Data Stored on Network/Server/Database, Para. 0169 and 0172) further comprise a measured user’s biometric data (Biosignals, Para. 0147) including a heart rate or a blood pressure.
However, Berger fails to teach using the biometric data to indicate an intensity of a physical activity.
Maizels teaches using biometric data to indicate an intensity of a physical activity (Para. 0278 and 0413). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s biometric data to incorporate Maizels’ indication of the intensity of physical activity, since doing so would provide the benefit of monitoring the user’s level of activity and adjusting the system based on the intensity (Maizels et al., Para. 0413), which increases the efficiency and accuracy of the system.
Regarding claim 12, Berger teaches an apparatus (Electronic Device, Para. 0016), comprising:
an (AR/VR System/Application, Para. 0063)
a processor (Processors 1102, Para. 0144) configured to execute ML instructions (Machine Learning Technique Module 512 Para. 0117 or Trained Machine Learning techniques 307 Para. 0092);
a memory (Memory 1104, Para. 0145) configured to store a first set of data (Data Stored on Electronic Device Para. 0016 or Client Device 102, Para. 0019);
and a communications module (Cloud Computing) configured to access a cloud storage (Network/Server/Database, Para. 0169 and 0172) including a second set of data (Data Stored on Network/Server/Database, Para. 0169 and 0172), wherein:
at least one of the first set of data (Data Stored on Electronic Device, Para. 0016, or Client Device 102, Para. 0019) or the second set of data (Data Stored on Network/Server/Database, Para. 0169 and 0172) includes a plurality of facial expressions (Images/Videos with Facial Expressions, Para. 0036),
and the ML instructions (Machine Learning Technique Module 512, Para. 0117, or Trained Machine Learning Techniques 307, Para. 0092) are configured to train an AI model (Para. 0092-0093) to infer at least one body pose (Position of the Body of the User in the Images/Videos) based on at least one of the first set of data (Data Stored on Electronic Device, Para. 0016, or Client Device 102, Para. 0019) or the second set of data (Data Stored on Network/Server/Database, Para. 0169 and 0172).
However, Berger fails to teach a MR headset.
Maizels teaches an MR headset (AR/VR/MR Headsets, Para. 0161) that tracks facial expressions (Para. 0247) and body movements (Para. 0521) using machine learning (Para. 0147). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s AR/VR application to utilize Maizels’ MR headset, since doing so would provide the benefit of utilizing the sensors available to extended-reality appliances to increase the efficiency and quality of the data collected (Maizels et al., Para. 0161; Berger et al., Para. 0014).
Regarding claim 13, Berger teaches the apparatus of claim 12, wherein the plurality of facial expressions comprises elated, thrilled, delighted, excited, happy, friendly, agreeable, worried, anxious, upset, nervous, anger, rage, aggression expressions, nostril flaring, chest and neck being animated or changing of a skin color. (Para. 0036)
Regarding claim 16, it recites similar limitations to those of claim 7; therefore, it is rejected under the same rationale as claim 7.
Regarding claim 17, it recites similar limitations to those of claim 6; therefore, it is rejected under the same rationale as claim 6.
Regarding claim 18, it recites similar limitations to those of claim 12; therefore, it is rejected under the same rationale as claim 12.
Regarding claim 19, Berger teaches the method of claim 18, wherein the ML instructions (Machine Learning Technique Module 512, Para. 0117, or Trained Machine Learning Techniques 307, Para. 0092) are configured to train an AI model to infer at least one facial expression (Para. 0036 and 0094) based on at least one of the first set of data (Data Stored on Electronic Device, Para. 0016, or Client Device 102, Para. 0019) or the second set of data (Data Stored on Network/Server/Database, Para. 0169 and 0172).
Regarding claim 20, it recites similar limitations to those of claims 6 and 7; therefore, it is rejected under the same rationale as claims 6 and 7.
Claim(s) 5, 8-11, and 14-15 are rejected under 35 U.S.C. 103 as being unpatentable over Berger et al. U.S. Patent Application Publication 20230070631 A1 (hereinafter Berger) in view of Maizels et al. U.S. Patent Application Publication 20250279100 A1 (hereinafter Maizels) in further view of IDS Reference NPL “Survey on Emotional Body Gesture Recognition” by Fatemeh Noroozi, Ciprian Adrian Corneanu, Dorota Kaminska, Tomasz Sapinski, Sergio Escalera, and Gholamreza Anbarjafari (hereinafter Noroozi).
Regarding claim 5, Berger and Maizels fail to teach the device of claim 2, wherein the body poses are indicative of emotional states in one of a plurality of contexts.
Berger, Maizels, and Noroozi are analogous to the claimed invention because all of them are in the same field of tracking body poses/gestures and emotional states.
Noroozi teaches the device of claim 2, wherein the body poses are indicative of emotional states in one of a plurality of contexts (Section 2, Expressing Emotion Through Body Language, Page 2). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s and Maizels’ body poses/gestures to incorporate Noroozi’s body poses/gestures that indicate emotional states, since doing so would provide the benefit of indicating emotional states through body language and facial expressions, as body language can be used to determine the emotional state of a person (Noroozi et al., Section 2, Expressing Emotion Through Body Language, Page 2).
Regarding claim 8, Berger and Maizels fail to teach the device of claim 1, wherein the facial expressions include elated, thrilled, delighted or excited expressions inferred from a hand-in-the-air body gesture.
However, Noroozi teaches the device of claim 1, wherein the facial expressions include elated, thrilled, delighted, or excited expressions inferred from a hand-in-the-air body gesture (Table 1, Arms Open (Happiness Row) or Both Hands over the Head (Surprise Row)). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s and Maizels’ body poses/gestures to incorporate Noroozi’s body poses/gestures that indicate emotional states, since doing so would provide the benefit of indicating emotional states through body language and facial expressions, as body language can be used to determine the emotional state of a person (Noroozi et al., Section 2, Expressing Emotion Through Body Language, Page 2).
Regarding claim 9, Berger and Maizels fail to teach the device of claim 1, wherein the facial expressions include worried, anxious, upset, or nervous expressions inferred from a form of a stop body gesture.
However, Noroozi teaches the device of claim 1, wherein the facial expressions include worried, anxious, upset, or nervous expressions inferred from a form of a stop body gesture (Table 1, One Hand Up (Disgust Row) or Hands Kept Lower than Their Normal Position (Sadness Row)). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s and Maizels’ body poses/gestures to incorporate Noroozi’s body poses/gestures that indicate emotional states, since doing so would provide the benefit of indicating emotional states through body language and facial expressions, as body language can be used to determine the emotional state of a person (Noroozi et al., Section 2, Expressing Emotion Through Body Language, Page 2).
Regarding claim 10, Berger and Maizels fail to teach the device of claim 1, wherein the facial expressions include happy, friendly or agreeable expressions inferred from a form of a peace-sign body gesture.
However, Noroozi teaches the device of claim 1, wherein the facial expressions include happy, friendly, or agreeable expressions (Table 1, Happiness Row).
While Noroozi fails to explicitly teach a peace-sign body gesture, Noroozi teaches tracking of the hands and body (Fig. 3, Fig. 4, and Fig. 8) and assigning emotions to different poses/gestures (Section 4.3.2, Emotion Recognition). Thus, assigning the universal peace sign to its respective cultural emotion is possible. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s and Maizels’ body poses/gestures to incorporate Noroozi’s body poses/gestures that indicate emotional states, since doing so would provide the benefit of indicating emotional states through body language and facial expressions, as body language can be used to determine the emotional state of a person (Noroozi et al., Section 2, Expressing Emotion Through Body Language, Page 2).
Regarding claim 11, Berger and Maizels fail to teach the device of claim 1, wherein the facial expressions include anger, rage, or aggression expressions inferred from a form of a punching body gesture.
However, Noroozi teaches the device of claim 1, wherein the facial expressions include anger, rage, or aggression expressions inferred from a form of a punching body gesture (Table 1, Closed Hands or Clenched Fists (Anger Row)). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s and Maizels’ body poses/gestures to incorporate Noroozi’s body poses/gestures that indicate emotional states, since doing so would provide the benefit of indicating emotional states through body language and facial expressions, as body language can be used to determine the emotional state of a person (Noroozi et al., Section 2, Expressing Emotion Through Body Language, Page 2).
Regarding claim 14, Berger and Maizels fail to teach the apparatus of claim 12, wherein the at least one body pose comprises one or more of a hand-in-the-air body gesture, a stop body gesture, a peace-sign body gesture and a punching body gesture.
However, Noroozi teaches the apparatus of claim 12, wherein the at least one body pose comprises one or more of a hand-in-the-air body gesture (Table 1, Arms Open (Happiness Row) or Both Hands over the Head (Surprise Row)), a stop body gesture (Table 1, One Hand Up (Disgust Row) or Hands Kept Lower than Their Normal Position (Sadness Row)), a peace-sign body gesture, and a punching body gesture (Table 1, Closed Hands or Clenched Fists (Anger Row)). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s and Maizels’ body poses/gestures to incorporate Noroozi’s body poses/gestures that indicate emotional states, since doing so would provide the benefit of indicating emotional states through body language and facial expressions, as body language can be used to determine the emotional state of a person (Noroozi et al., Section 2, Expressing Emotion Through Body Language, Page 2).
Regarding claim 15, Berger teaches the apparatus of claim 12, wherein the at least one body pose (Position of the Body of the User in the Images/Videos) comprises body motions in at least one of a social activity (Group of Friends Taking Photos/Videos for Social Media, Para. 0028, 0053, and 0087) or a physical activity including a sports activity or a fitness activity.
However, Berger and Maizels fail to teach wherein the at least one body pose is indicative of an emotional state in one of a plurality of contexts.
Noroozi teaches wherein the at least one body pose is indicative of an emotional state in one of a plurality of contexts (Section 2, Expressing Emotion Through Body Language, Page 2). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Berger’s and Maizels’ body poses/gestures to incorporate Noroozi’s body poses/gestures that indicate emotional states, since doing so would provide the benefit of indicating emotional states through body language and facial expressions, as body language can be used to determine the emotional state of a person (Noroozi et al., Section 2, Expressing Emotion Through Body Language, Page 2).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRIANNA R COCHRAN whose telephone number is (571)272-4671. The examiner can normally be reached Mon-Fri. 7:30am - 5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington can be reached at (571) 272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BRIANNA RENAE COCHRAN/Examiner, Art Unit 2615
/ALICIA M HARRINGTON/
Supervisory Patent Examiner, Art Unit 2615