Prosecution Insights
Last updated: April 19, 2026
Application No. 18/063,277

VISUALIZING REAL WORLD SENTIMENTS OF OBJECTS AND INTERACTIONS VIA AUGMENTED REALITY

Status: Non-Final OA under §103
Filed: Dec 08, 2022
Examiner: MUSHAMBO, MARTIN
Art Unit: 2615
Tech Center: 2600 (Communications)
Assignee: International Business Machines Corporation
OA Round: 1 (Non-Final)
Grant Probability: 85% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 5m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 85%, above average (690 granted / 816 resolved; +22.6% vs TC avg)
Interview Lift: +14.1% in resolved cases with an interview (moderate lift)
Typical Timeline: 2y 5m average prosecution; 15 applications currently pending
Career History: 831 total applications across all art units
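The figures above are simple ratios, and the arithmetic can be checked directly. A minimal sketch follows; the variable names are illustrative, and treating the interview lift as additive percentage points is an assumption about how the dashboard combines the numbers:

```python
# Career allow rate: granted / resolved, from the card above.
granted = 690
resolved = 816
allow_rate_pct = 100 * granted / resolved  # ~84.6, displayed as 85%

# "+14.1% Interview Lift" read as additive percentage points (assumption):
# 85% base + 14.1 points matches the 99% "with interview" figure shown.
with_interview_pct = round(allow_rate_pct) + 14.1

print(f"career allow rate: {allow_rate_pct:.1f}%")    # ~84.6%
print(f"with interview:   ~{with_interview_pct:.0f}%")  # ~99%
```

This reproduces both displayed values: 690/816 rounds to 85%, and 85 + 14.1 rounds to 99%.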

Statute-Specific Performance

§101: 12.7% (-27.3% vs TC avg)
§103: 48.5% (+8.5% vs TC avg)
§102: 23.7% (-16.3% vs TC avg)
§112: 8.6% (-31.4% vs TC avg)

Based on career data from 816 resolved cases; deltas are measured against the Tech Center average estimate.
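Each row pairs the examiner's statute-specific rate with a delta against the Tech Center average, so subtracting the delta from the rate should recover the same baseline for every statute. A quick consistency check of the table's figures (the dict layout is just for illustration):

```python
# (examiner rate %, delta vs TC average in percentage points),
# taken from the table above.
rates = {
    "§101": (12.7, -27.3),
    "§103": (48.5, +8.5),
    "§102": (23.7, -16.3),
    "§112": (8.6, -31.4),
}

# rate - delta recovers the implied TC-average baseline per statute.
implied_tc_avg = {s: rate - delta for s, (rate, delta) in rates.items()}

# All four statutes imply the same ~40% Tech Center average estimate,
# so the table is internally consistent.
assert all(abs(v - 40.0) < 1e-6 for v in implied_tc_avg.values())
print(implied_tc_avg)
```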

Office Action: Non-Final Rejection under §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 12/08/2022 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-5, 8-12 and 15-18 are rejected under 35 U.S.C. 103 as being unpatentable over Jerauld (US 20170117005 A1), in view of Chaudhary et al. (US 20230059399 A1), hereinafter Chaudhary, further in view of Gordon et al. (US 20170337476 A1), hereinafter Gordon.

Claim 1: Jerauld discloses a computer-implemented method for representing real world sentiments of an inanimate object (Jerauld, Abstract: a see-through, head mounted display and sensing devices cooperating with the display detect audible and visual behaviors of a subject in a field of view of the device, with the subject being the inanimate object) comprising:

receiving, by a computing device (Jerauld, processing environment 4), sensor data from a plurality of sensors associated with the inanimate object, the sensor data tracking one or more conditions of the inanimate object (Jerauld, [0004]: a variety of sensors on the display provide input data which is utilized to compute emotional states of subjects within a field of view; during an interaction, the device recognizes emotional states in subjects by comparing, in real time, detected sensor input against a database of human/primate gestures/expressions, posture, and speech; feedback is provided to the wearer after interpretation of the sensor input);

analyzing, by the computing device (Jerauld, processing environment 4), the sensor data to determine a status of a real world sentiment associated with the inanimate object (Jerauld, Fig. 8 #850a, [0118]: the analysis application 850a weighs any combination of inputs and associated behaviors at a given time and derives a conclusion about an emotional state to report feedback on given the inputs);

generating, by the computing device (Jerauld, processing environment 4), a feedback associated with the inanimate object (Jerauld, Abstract: emotional states are computed based on the behaviors, and feedback is provided to the wearer indicating computed emotional states of the subject; [0116]: the analysis application 850a provides a wearer with feedback concerning interactions with subjects or groups of people within the field of view of the wearer);

transmitting, by the computing device (Jerauld, processing environment 4), the feedback to a wearable device of the user (Jerauld, [0039]: feedback is provided to the wearer after interpretation of the sensor input).

Jerauld does not explicitly disclose: based on the determination, generating, by the computing device, an augmented reality-based visualization associated with the inanimate object, the augmented reality-based visualization including at least one indicator of the status. Chaudhary discloses an augmented reality device presenting one or more second augmented reality visualizations corresponding to a predicted emotional state by utilizing an emotion detection machine learning model (Chaudhary, claim 16 and Fig. 7 #704-705). It would have been obvious to one of ordinary skill in the art before the filing of the claimed invention to combine the teachings of Jerauld with the teachings of Chaudhary, since both are analogous art in the augmented/virtual reality field. One of ordinary skill in the art would have been motivated to make the combination in order to enable providing detected threat indications to the augmented reality device by utilizing the threat detection machine learning model, and to utilize augmented reality visualizations in an efficient manner.

The combination of Jerauld and Chaudhary does not explicitly disclose the feedback including the at least one indicator for a user. Gordon discloses an HMD device including a display for presenting an augmented reality, which may include a visual indicator of another user's emotional or cognitive state (Gordon, [0054]).
It would have been obvious to one of ordinary skill in the art before the filing of the claimed invention to combine the teachings of Jerauld and Chaudhary with the teachings of Gordon, since all are analogous art in the augmented/virtual reality field. One of ordinary skill in the art would have been motivated to make the combination in order to enable a user to be aware of another user's current emotional or cognitive state.

Independent claims 8 (Jerauld, [0158]) and 15 (Jerauld, Fig. 8) recite essentially the same limitations as claim 1; the rejection of claim 1 therefore applies to claims 8 and 15.

Claims 2, 9, 16: The computer-implemented method of claim 1, wherein the indicator is one or more of an injury, a defect, a repair, an upgrade, a downgrade, or a received service associated with the inanimate object (Jerauld discloses determining emotional state, [0039], [0107], emotions that could be positive or negative; Gordon discloses emotional states, [0021], [0022]; mental injuries are the cause of an emotional state). Same rationale as claim 1.

Claims 3, 10, 17: The computer-implemented method of claim 1, wherein receiving sensor data further comprises: communicating, by the computing device (Jerauld, processing environment 4), with a computer vision system to receive computer vision data associated with the inanimate object; and communicating, by the computing device (Jerauld, processing environment 4), with the wearable device in order to receive user-specific data associated with the user; wherein the computer vision data and user-specific data are utilized by the computing device to determine the status (Jerauld, [0111], see Fig. 8, smart glasses). Same rationale as claim 1.

Claims 4, 11, 18:
The computer-implemented method of claim 3, wherein analyzing the sensor data further comprises: determining, by the computing device, a direction of visual focus of the user from the user-specific data; wherein the indicator pertains to a part of the inanimate object that is within the direction of visual focus (Jerauld, [0117]: the emotional state indication is of a subject who may be within the field of view of a wearer of a see-through head mounted display). Same rationale as claim 1.

Claims 5, 12: The computer-implemented method of claim 1, wherein the feedback is a plurality of electrical signals representing a sentiment associated with the indicator; wherein the plurality of electrical signals comprises one or more of electrical impulses, air pressure, mechanical pressure, and vibration (Jerauld, see Fig. 13, vibration feedback). Same rationale as claim 1.

Claims 6, 13, 19 are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Jerauld, Chaudhary and Gordon, in view of Teyssier et al., "Skin-On Interfaces: A Bio-Driven Approach for Artificial Skin Design to Cover Interactive Devices," cited in the IDS filed on 12/08/2022.

Claims 6, 13, 19: The combination of Jerauld, Chaudhary and Gordon does not disclose the computer-implemented method of claim 1, wherein the wearable device comprises an electronic artificial skin configured to apply one or more feedback actions to the user. Teyssier discloses a wearable device comprising an electronic artificial skin configured to apply one or more feedback actions to the user (Teyssier, Fig. 1, pages 308-309, "Skin-on interface" section: Skin-On interfaces augment interactive systems (e.g. smartphones) with an artificial skin).
It would have been obvious to one of ordinary skill in the art before the filing of the claimed invention to combine the teachings of Jerauld, Chaudhary and Gordon with the teachings of Teyssier, since all are analogous art in the augmented/virtual reality field. One of ordinary skill in the art would have been motivated to make the combination in order to enhance interaction.

Claims 7, 14, 20 are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Jerauld, Chaudhary and Gordon, in view of Teyssier, "Skin-On Interfaces: A Bio-Driven Approach for Artificial Skin Design to Cover Interactive Devices," cited in the IDS filed on 12/08/2022, further in view of design choice.

Claims 7, 14, 20: The combination of Jerauld, Chaudhary, Gordon and Teyssier does not disclose the computer-implemented method of claim 6, wherein a receiving location for the electronic artificial skin to receive the feedback is based on an indicator location of the inanimate object associated with the indicator. On page 309, "Human skin properties," Teyssier makes clear that the skin's sensory characteristics differ by body location. As a matter of design choice, one of ordinary skill in the art before the filing of the claimed invention would have been motivated to base the receiving location for the electronic artificial skin on an indicator location of the inanimate object associated with the indicator, in order to enhance tactile feedback.
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

US 20190096135 A1: A system for visual inspection includes: a scanning system configured to capture images of an object and to compute a three-dimensional (3-D) model of the object based on the captured images; an inspection system configured to compute a descriptor of the object based on the 3-D model of the object, retrieve metadata corresponding to the object based on the descriptor, and compute a plurality of inspection results based on the retrieved metadata and the 3-D model of the object; and a display device system including a display, a processor, and a memory storing instructions that, when executed by the processor, cause the processor to generate overlay data from the inspection results and show the overlay data on the display, the overlay data being aligned with a view of the object through the display.

US 20220281617 A1: This aircraft inspection support device (100) is provided with a three-dimensional model generation unit (25a), an inspection position acquisition unit (25b), and a control unit (25) configured to acquire an inspection result (50) of an inspection target (40) and cause the three-dimensional model position (55) on the three-dimensional model (52) corresponding to the inspection position (53), the inspection position (53), and the inspection result (50) to be stored in association with each other.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MARTIN MUSHAMBO, whose telephone number is (571) 270-3390. The examiner can normally be reached Monday-Friday, 8:00 AM-5:00 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Alicia Harrington, can be reached at (571) 272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/MARTIN MUSHAMBO/
Primary Examiner, Art Unit 2615
02/11/2026

Prosecution Timeline

Dec 08, 2022: Application Filed
Nov 08, 2023: Response after Non-Final Action
Feb 11, 2026: Non-Final Rejection under §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602892: WALLPAPER DISPLAY METHOD AND APPARATUS, AND ELECTRONIC DEVICE (2y 5m to grant; granted Apr 14, 2026)
Patent 12598282: IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM (2y 5m to grant; granted Apr 07, 2026)
Patent 12586331: SYSTEM AND METHOD FOR CHANGING OVERALL STYLE OF PUBLIC AREA BASED ON VIRTUAL SCENE (2y 5m to grant; granted Mar 24, 2026)
Patent 12579754: INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD (2y 5m to grant; granted Mar 17, 2026)
Patent 12573146: PRODUCT PLACEMENT SYSTEMS AND METHODS FOR 3D PRODUCTIONS (2y 5m to grant; granted Mar 10, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 85%
With Interview: 99% (+14.1%)
Median Time to Grant: 2y 5m
PTA Risk: Low

Based on 816 resolved cases by this examiner. Grant probability is derived from the career allow rate.
