Prosecution Insights
Last updated: April 19, 2026
Application No. 19/220,726

METHOD FOR INTERACTING WITH A VIRTUAL OBJECT, PROGRAM USING SAME, AND IMMERSIVE REALITY DEVICE

Non-Final OA (§102, §103, §112)
Filed: May 28, 2025
Examiner: NADKARNI, SARVESH J
Art Unit: 2629
Tech Center: 2600 — Communications
Assignee: Orange
OA Round: 1 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 12m
Grant Probability With Interview: 85%

Examiner Intelligence

Career Allow Rate: 72%, above average (354 granted / 494 resolved; +9.7% vs TC avg)
Interview Lift: +13.7% (moderate), based on resolved cases with interview
Typical Timeline: 2y 12m average prosecution; 37 applications currently pending
Career History: 531 total applications across all art units
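The headline figures follow directly from the raw counts shown above; a quick arithmetic check (treating the interview lift as an additive percentage-point bump is an assumption, inferred from how the 85% figure is presented on this page):

```python
# Career allow rate from the resolved-case counts shown above.
granted, resolved = 354, 494
allow_rate = granted / resolved * 100  # ~71.7, displayed as 72%

# Assumed additive interview lift: adding the +13.7 percentage points
# reproduces the 85% "with interview" figure shown on this page.
with_interview = allow_rate + 13.7     # ~85.4, displayed as 85%

print(round(allow_rate), round(with_interview))
```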

Statute-Specific Performance

§101: 1.1% (-38.9% vs TC avg)
§103: 72.6% (+32.6% vs TC avg)
§102: 11.3% (-28.7% vs TC avg)
§112: 11.6% (-28.4% vs TC avg)

Black line = Tech Center average estimate. Based on career data from 494 resolved cases.
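Since each statute shows both the examiner's rate and its delta against the Tech Center average, the implied TC baseline can be recovered by simple subtraction (all values in percentage points, taken from the chart above):

```python
# Examiner's per-statute rates and their displayed deltas vs the TC average.
examiner = {"101": 1.1, "103": 72.6, "102": 11.3, "112": 11.6}
delta_vs_tc = {"101": -38.9, "103": 32.6, "102": -28.7, "112": -28.4}

# Implied Tech Center average for each statute: rate minus delta.
tc_avg = {s: round(examiner[s] - delta_vs_tc[s], 1) for s in examiner}
print(tc_avg)  # every statute implies the same ~40.0% baseline
```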

Office Action

§102 §103 §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Drawings

The drawings are objected to because the unlabeled rectangular box(es) (FIG. 3 boxes 2, 12, 13, 14, 15, 16, 100, 101, etc.) shown in the drawings should be provided with descriptive text labels. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-12 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 recites “a virtual object” at line 5 of the claim a second time after originally introducing “a virtual object” at line 1. It is not clear whether the second reference to “a virtual object” is intended to be a different virtual object than the first, rendering the claim indefinite. Appropriate correction is required. By virtue of their dependency, claims 2-11 are also rejected.

Claim 2 recites “a continuous gesture” after it was originally introduced in claim 1. It is not clear whether the second reference to “a continuous gesture” is intended to be a different gesture, rendering the claim indefinite. Appropriate correction is required.

Claim 3 recites “a virtual object” originally introduced in claim 1, from which claim 3 depends. It is not clear whether the reference in claim 3 is intended to be a different virtual object than the first, rendering the claim indefinite. Appropriate correction is required.

Claim 6 recites “a virtual object” originally introduced in claim 1, from which claim 6 depends. It is not clear whether the reference in claim 6 is intended to be a different virtual object than the first, rendering the claim indefinite. Appropriate correction is required.

Claim 6 also recites “a gesture of the real hand complementary to the detected continuous gesture” and “a complementary gesture”. It is not clear whether “a gesture of the real hand complementary to the detected continuous gesture” is the same gesture as “a complementary gesture”, rendering the claim indefinite. Appropriate correction is required.

Claim 8 recites “a virtual object” after it was originally introduced in claim 1, from which claim 8 depends. It is not clear whether the reference in claim 8 is intended to be a different virtual object than the first, rendering the claim indefinite. Appropriate correction is required.

Claim 8 also recites “a gesture of the real hand complementary to the detected continuous gesture”, “a complementary gesture”, and “the complimentary gesture”. It is not clear whether “a gesture of the real hand complementary to the detected continuous gesture” is the same gesture as “a complementary gesture”, or whether “the complimentary gesture” references the former or the latter (if they are different), rendering the claim indefinite. Appropriate correction is required.

Claim 9 recites “a virtual hand” originally introduced in claim 1, from which claim 9 depends. It is not clear whether the reference in claim 9 is intended to be a different virtual hand than the first, rendering the claim indefinite. Appropriate correction is required.

Claim 9 also recites “a real hand” originally introduced in claim 1. It is not clear whether the reference in claim 9 is intended to be a different real hand than the first, rendering the claim indefinite. Appropriate correction is required.

Claim 12 recites “the real hand” at line 5 of the claim after originally introducing “real hands” at line 3, and then references “a real hand” at lines 7-8.
It is not clear which hand of the “real hands” is being referenced by the subsequent references to “the real hand” and “a real hand”, rendering the claim indefinite. Appropriate correction is required.

Claim 12 also recites “a virtual hand” and “a user” at lines 7 and 8, respectively, after originally introducing these limitations at lines 7 and 3, respectively. It is not clear whether the subsequent references to “a virtual hand” and “a user” are intended to reference new elements, rendering the claim indefinite. Appropriate correction is required.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-5, 7, and 11-12 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Andersen et al., US 2020/0005026 A1 (hereinafter “Andersen”).

Regarding claim 1, Andersen discloses an interaction method (FIGS. 5 and 7 generally at [0082]-[0087] and [0098]-[0101]), for interacting with a virtual object (FIGS. 6A-6B and 8A-8B, virtual object 610 at [0088]-[0092] and virtual object 810 at [0102]-[0105]) reproduced by an immersive reality device (FIGS. 1-3, eyewear device 100, 200 at [0020]-[0024] and near-eye display 305 at [0030]-[0032] describing the display therein), the method being performed by the immersive reality device (FIGS. 5 and 7 referencing NED 300 generally at [0082]-[0087] and [0098]-[0101]) and comprising: in response to a virtual hand ([0076] virtual representation of user’s hand) associated with a real hand ([0076] virtual representation of user’s hand) of a user ([0076] virtual representation of user’s real hand) of the immersive reality device (FIGS. 1-3, eyewear device 100, 200 at [0020]-[0024] and near-eye display 305 at [0030]-[0032]) being in proximity of a virtual object (FIGS. 6A-6B and [0091], user moves their hand near the displayed virtual object 610), detecting a continuous gesture of the real hand (tracking module 360 and gesture ID module at FIG. 3 and [0051]-[0060] detect and determine the gestures; FIG. 3 and [0041]-[0042], NED 305 having a DCA 340 and an imaging device 315 capable of detecting with continuous recognition at [0051], [0061], and [0066]-[0069]; and FIGS. 6A-6B and 8A-8B at [0091] and [0105], detection of the gesture performed), the detection of the continuous gesture triggering gripping of the virtual object by the virtual hand (generally [0075] describing pinching and grabbing gestures for a virtual object by the virtual hand of the user; further at FIGS. 5-8B and [0089]-[0092] and [0105]-[0106]).

Regarding claim 2, Andersen discloses the interaction method according to claim 1 (see above), wherein the interaction method comprises: taking the virtual object with the virtual hand as soon as a continuous gesture of the real hand is detected (FIGS. 8A-8B, hold of the virtual object once the gesture is performed at [0102]-[0105]).

Regarding claim 3, Andersen discloses the interaction method according to claim 1 (see above), wherein the interaction method comprises: when the virtual hand is holding a virtual object (FIGS. 6A-6B and 8A-8B, hold of the virtual object once the gesture is performed at [0088]-[0097] and [0102]-[0105]), detecting an interruption of the continuous gesture of the real hand (FIGS. 6A-6B, dartboard gesture at [0088]-[0094] and second motion 635 with releasing and letting go), the detection of the interruption of the continuous gesture triggering an end of the gripping of the virtual object by the virtual hand (FIGS. 6A-6B, dartboard gesture at [0088]-[0094] and second motion 635 with releasing and letting go).

Regarding claim 4, Andersen discloses the interaction method according to claim 1 (see above), wherein the interaction method comprises: letting go of the virtual object held by the virtual hand as soon as interruption of the continuous gesture of the real hand is detected (FIGS. 6A-6B, gesture at [0088]-[0094], second motion 635 with releasing and letting go).

Regarding claim 5, Andersen discloses the interaction method according to claim 1 (see above), wherein the virtual object is gripped by the virtual hand depending on an orientation of the palm of the virtual hand with respect to the virtual object when the continuous gesture is detected (FIGS. 6A-6B and [0089] and [0094], palm-up gesture ends hold).

Regarding claim 7, Andersen discloses the interaction method according to claim 1 (see above), wherein the detected continuous gesture is contact of the thumb and the index finger of the real hand (Andersen at [0058], [0070], [0081], [0091], [0102]).

Regarding claim 11, Andersen discloses a non-transitory computer readable medium ([0116]-[0118]) comprising a program stored thereon ([0116]-[0118]) and comprising program code instructions ([0116]-[0118]) for executing the interaction method according to claim 1 (see above) when said program is executed by a processor of the immersive reality device ([0116]-[0118]).

Regarding claim 12, Andersen discloses an immersive reality device (FIGS. 1-3, eyewear device 100, 200 at [0020]-[0024] and near-eye display 305 at [0030]-[0032]) comprising: a reproduction device (FIGS. 1-3, display assembly of the NED at [0019]-[0022] and [0062]-[0071]) for reproducing a virtual object (FIGS. 6A-6B and 8A-8B, virtual object 610 at [0088]-[0092] and virtual object 810 at [0102]-[0105], produced by the display of NED 305, called the optical assembly 320, at FIG. 3 and [0030]-[0032]); a camera (FIG. 3 and [0032] depth camera array 340; [0041]-[0042] NED 305 having a DCA 340 and an imaging device 315 capable of detecting with continuous recognition at [0051], [0061], and [0066]-[0069]; and FIGS. 6A-6B and 8A-8B at [0091] and [0105], detection of the gesture performed) able to capture real hands of a user of the immersive reality device (FIG. 3 and [0041]-[0042]); and a gesture detector able to detect a continuous gesture of the real hand (tracking module 360 and gesture ID module at FIG. 3 and [0051]-[0060] detect and determine the gestures; additionally, [0032] depth camera array 340, [0041]-[0042] NED 305 having a DCA 340 and an imaging device 315 capable of detecting with continuous recognition at [0051], [0061], and [0066]-[0069]), the detection of the continuous gesture triggering gripping of the virtual object by the virtual hand (generally [0075] describing pinching and grabbing gestures for a virtual object by the virtual hand of the user; further at FIGS. 5-8B and [0089]-[0092] and [0105]-[0106]) when a virtual hand ([0076] virtual representation of user’s hand) associated with a real hand ([0076] virtual representation of user’s hand) of a user ([0076] virtual representation of user’s real hand) of the immersive reality device (FIGS. 1-3, eyewear device 100, 200 at [0020]-[0024] and near-eye display 305 at [0030]-[0032]) is in proximity of the virtual object (FIGS. 6A-6B and [0091], user moves their hand near the displayed virtual object 610).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Andersen in view of Stafford et al., US 2016/0306431 A1 (hereinafter “Stafford”).

Regarding claim 6, Andersen discloses the interaction method according to claim 1 (see above), wherein the interaction method comprises: when the virtual hand is holding a virtual object (Andersen at FIGS. 8A-8B and [0102]-[0105] describing holding with one hand 805), detecting a gesture of the real hand complementary to the detected continuous gesture, detecting a complementary gesture (Andersen at FIGS. 8A-8B and [0102]-[0105] describing holding with one hand 805 and interaction with the other hand 810, or other inputs by the same hand such as casting at FIGS. 6A-6B at [0088]-[0095]). However, Andersen does not explicitly disclose triggering a rotation of the virtual object in the virtual hand.
In the same field of endeavor, Stafford discloses a virtual reality interface using pinch gestures to interact with virtual objects, where two pinch gestures are capable of triggering a rotation of the virtual object in the virtual hand (FIG. 2A, two-handed pinch hold causing rotation of the object for elements E, F, G, H, and alternatively rotations with only one hand at I, generally at [0062]-[0068], [0073], and [0080]).

Before the effective filing date, it would have been obvious to a person of ordinary skill in the art to modify the gesture inputs within the virtual reality environment of Andersen to incorporate the rotational gesture input as disclosed by Stafford, because the references are within the same field of endeavor, namely, virtual reality gesture inputs and methods. The motivation to combine these references would have been to improve interaction with the environment without the need for a physical controller (see Stafford at least at [0003]-[0004] and [0010]). Therefore, a person of ordinary skill in the art would have been motivated to combine the prior art to achieve the claimed invention, and there would have been a reasonable expectation of success.

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Andersen in view of Stafford, further in view of Goodrich et al., US 2022/0197393 A1 (hereinafter “Goodrich”).

Regarding claim 8, Andersen discloses the interaction method according to claim 7 (see above), wherein the interaction method comprises: when the virtual hand is holding a virtual object (Andersen at FIGS. 8A-8B and [0102]-[0105] describing holding with one hand 805), detecting a gesture of the real hand complementary to the detected continuous gesture, detecting a complementary gesture (Andersen at FIGS. 8A-8B and [0102]-[0105] describing holding with one hand 805 and interaction with the other hand 810, or other inputs by the same hand such as casting at FIGS. 6A-6B at [0088]-[0095]). However, Andersen does not explicitly disclose triggering a rotation of the virtual object in the virtual hand, wherein the complementary gesture that is detected is sliding of the thumb against the index finger of the real hand.

In the same field of endeavor, Stafford discloses a virtual reality interface using pinch gestures to interact with virtual objects, where two pinch gestures are capable of triggering a rotation of the virtual object in the virtual hand (FIG. 2A, two-handed pinch hold causing rotation of the object for elements E, F, G, H, and alternatively rotations with only one hand at I, generally at [0062]-[0068], [0073], and [0080]). Before the effective filing date, it would have been obvious to a person of ordinary skill in the art to modify the gesture inputs within the virtual reality environment of Andersen to incorporate the rotational gesture input as disclosed by Stafford, because the references are within the same field of endeavor, namely, virtual reality gesture inputs and methods. The motivation to combine these references would have been to improve interaction with the environment without the need for a physical controller (see Stafford at least at [0003]-[0004] and [0010]). Therefore, a person of ordinary skill in the art would have been motivated to combine the prior art to achieve the claimed invention, and there would have been a reasonable expectation of success.

However, Andersen in view of Stafford does not explicitly disclose wherein the complementary gesture that is detected is sliding of the thumb against the index finger of the real hand. In the same field of endeavor, Goodrich discloses gesture control of various objects in a virtual reality environment (Abstract), wherein the complementary gesture that is detected is sliding of the thumb against the index finger of the real hand (FIGS. 6-9 and [0031]-[0032], [0110], [0118]-[0120], and [0123]-[0126] describing the rotational function of sliding the thumb against the finger).

Before the effective filing date, it would have been obvious to a person of ordinary skill in the art to modify the rotational input gesture in the virtual reality environment of Andersen in view of Stafford to incorporate the thumb and index finger sliding gesture as disclosed by Goodrich, because the references are within the same field of endeavor, namely, gesture inputs for a virtual reality environment. The motivation to combine these references would have been to improve the efficiency of using the devices through intuitive gestures (see Goodrich at least at [0017]). Therefore, a person of ordinary skill in the art would have been motivated to combine the prior art to achieve the claimed invention, and there would have been a reasonable expectation of success.

Claims 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Andersen in view of Lansdale et al., US 2012/0276995 A1 (hereinafter “Lansdale”).

Regarding claim 9, Andersen discloses the interaction method according to claim 1 (see above). However, Andersen does not explicitly disclose wherein the interaction method comprises: lengthening a virtual arm attached to a virtual hand associated with a real hand moving toward a distant virtual object. In the same field of endeavor, Lansdale discloses wherein the interaction method comprises: lengthening a virtual arm attached to a virtual hand associated with a real hand moving toward a distant virtual object (FIGS. 5-7 and [0044]-[0049] and [0052] describing the nonlinear extension of the arm 508 of the user’s avatar toward the distant object).
Before the effective filing date, it would have been obvious to a person of ordinary skill in the art to modify the gesture input in the virtual reality environment of Andersen to incorporate the extension of the virtual arm to reach an object as disclosed by Lansdale, because the references are within the same field of endeavor, namely, gesture inputs for a virtual reality environment. The motivation to combine these references would have been to enable fine-grained control of games and manipulation of virtual objects in a robust, easy-to-use manner, thereby enhancing the user experience (see Lansdale [0021]). Therefore, a person of ordinary skill in the art would have been motivated to combine the prior art to achieve the claimed invention, and there would have been a reasonable expectation of success.

Regarding claim 10, Andersen in view of Lansdale discloses the interaction method according to claim 9 (see above), wherein the interaction method comprises: using a non-linear mapping between the movement of the real hand as captured in a real environment and the movement of the virtual hand (Lansdale at FIGS. 5-7 and [0044]-[0049] and [0052] describing the exaggerated depiction therein) in an immersive environment (Andersen at FIGS. 6A-8B and [0088]-[0094] and [0102]-[0104]), the movement of the virtual hand depending on the captured movement of the real hand (Lansdale at FIGS. 5-7 and [0044]-[0049] and [0052] describing the exaggerated depiction therein based on the real hand).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

Araki et al., US 2019/0324549 A1;
Kwon et al., US 2016/0349849 A1;
Yang et al., US 2019/0258320 A1;
Dessero et al., US 2024/0094862 A1.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SARVESH J NADKARNI, whose telephone number is (571) 270-7562. The examiner can normally be reached 8AM-5PM M-F.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, LunYi Lao, can be reached at (571) 272-7671. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SARVESH J NADKARNI/
Examiner, Art Unit 2621

/LUNYI LAO/
Supervisory Patent Examiner, Art Unit 2621
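For readers mapping the claim language to behavior: the rejected independent claim recites a simple grip/release loop (a continuous gesture detected near a virtual object grips it; interrupting the gesture releases it, claims 1-4), and claims 9-10 add a non-linear real-to-virtual hand mapping. A minimal illustrative sketch follows; all names, thresholds, and the mapping function are hypothetical, taken neither from the application nor from the cited art:

```python
class GripController:
    """Illustrative sketch of the interaction recited in claims 1-4:
    a continuous gesture detected while the virtual hand is in proximity
    of a virtual object triggers gripping; interruption of the gesture
    ends the grip. All parameters are hypothetical."""

    def __init__(self, proximity_threshold=0.05):
        self.proximity_threshold = proximity_threshold  # hypothetical units
        self.held_object = None

    def update(self, gesture_active, distance_to_object, obj):
        if self.held_object is None:
            # Grip as soon as the continuous gesture is detected in
            # proximity of the object (claims 1-2).
            if gesture_active and distance_to_object <= self.proximity_threshold:
                self.held_object = obj
        elif not gesture_active:
            # Interruption of the continuous gesture ends the grip (claims 3-4).
            self.held_object = None
        return self.held_object


def virtual_hand_reach(real_reach, gain=1.0, exponent=1.5):
    """Claims 9-10 flavor: a non-linear mapping so the virtual arm
    'lengthens' toward distant objects (form and parameters hypothetical)."""
    return gain * real_reach ** exponent
```

A pinch near the object (`update(True, 0.02, obj)`) returns the gripped object; a subsequent `update(False, ...)` returns `None`, mirroring the grip-then-release sequence the examiner maps onto Andersen's pinch and release.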

Prosecution Timeline

May 28, 2025
Application Filed
Jan 06, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12573325: SCAN SIGNAL DRIVER CIRCUIT, DISPLAY PANEL, DISPLAY DEVICE, AND DRIVING METHOD (granted Mar 10, 2026; 2y 5m to grant)
Patent 12560967: ANNULAR HOUSING FOR DETECTION DEVICE WITH FIRST AND SECOND FLEXIBLE SUBSTRATES (granted Feb 24, 2026; 2y 5m to grant)
Patent 12554334: PERSONALIZED CALIBRATION OF USER INTERFACES (granted Feb 17, 2026; 2y 5m to grant)
Patent 12548519: POWER SUPPLY SYSTEM, DISPLAY DEVICE INCLUDING THE SAME, AND METHOD OF DRIVING THE SAME (granted Feb 10, 2026; 2y 5m to grant)
Patent 12504831: TACTILE PRESENTATION APPARATUS AND TACTILE PRESENTATION KNOB (granted Dec 23, 2025; 2y 5m to grant)
Based on the examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 72%
With Interview: 85% (+13.7%)
Median Time to Grant: 2y 12m
PTA Risk: Low

Based on 494 resolved cases by this examiner. Grant probability derived from career allow rate.
