Prosecution Insights
Last updated: April 19, 2026
Application No. 19/130,061

MECHANISM TO CONTROL THE REFRESH RATE OF THE REAL-ENVIRONMENT COMPUTATION FOR AUGMENTED REALITY (AR) EXPERIENCES

Non-Final OA: §102, §103
Filed
May 14, 2025
Examiner
ADAMS, CARL
Art Unit
2627
Tech Center
2600 — Communications
Assignee
InterDigital CE Patent Holdings, SAS
OA Round
1 (Non-Final)
Grant Probability: 71% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
Grant Probability with Interview: 88%

Examiner Intelligence

Career Allow Rate: 71% (556 granted / 780 resolved; +9.3% vs TC avg, above average)
Interview Lift: +17.1% allow rate among resolved cases with interview
Typical Timeline: 2y 6m average prosecution; 26 applications currently pending
Career History: 806 total applications across all art units

Statute-Specific Performance

§101: 1.2% (-38.8% vs TC avg)
§103: 58.3% (+18.3% vs TC avg)
§102: 30.9% (-9.1% vs TC avg)
§112: 7.9% (-32.1% vs TC avg)
Deltas are relative to the Tech Center average estimate. Based on career data from 780 resolved cases.

Office Action

§102, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 2, 4, 8, 9, 12, 13, 16 – 18 and 23 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Marchenko et al. (US Pub. No. 2018/0329501 A1).

As to claims 1 and 2, Marchenko shows a method (Fig. 5 and para. 112) and an associated apparatus (Fig. 2 and para. 43) comprising: obtaining sensor data (via sensor module 240, Fig. 2 and para. 52) describing a user's environment (i.e. an object/gesture, Figs. 4A – 4C and 5 and paras. 82, 114 and 115); obtaining information indicating user movement information (i.e. a gesture, Figs. 4A – 4C and 5 and paras. 82 and 115); determining a candidate environment-computation refresh rate based on the at least one factor (i.e. minimum/higher/highest frame rate, for example, Figs. 4B and 4C and paras. 103 and 109 – 111); selecting an environment-computation refresh rate as a minimum of: the candidate environment-computation refresh rate and a maximum environment-computation refresh rate (Fig. 5 and paras. 112 – 120); and updating a real-environment computation using the selected environment-computation refresh rate (Fig. 5 and paras. 112 – 120).

As to claims 4, 9 and 23, Marchenko shows receiving metadata (i.e. complex movement data) indicating the maximum environment-computation refresh rate (Fig. 4C and paras. 110 and 111).
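The §102 rejection maps claim 1's core step, selecting the environment-computation refresh rate as the minimum of a candidate rate and a maximum rate, onto Marchenko's frame-rate control. A minimal sketch of that selection logic follows; the function names and factor-to-rate values are hypothetical illustrations, not taken from the application or from Marchenko:

```python
def candidate_rate(user_speed_mps: float, base_hz: float = 10.0,
                   gain_hz_per_mps: float = 20.0) -> float:
    """Hypothetical mapping from a movement factor to a candidate
    environment-computation refresh rate: faster user movement asks
    for more frequent re-computation of the real environment."""
    return base_hz + gain_hz_per_mps * user_speed_mps

def select_refresh_rate(candidate_hz: float, max_hz: float) -> float:
    """Claim 1's selection step: the minimum of the candidate rate
    and a maximum rate (e.g. one received as metadata, per claims
    4, 9 and 23)."""
    return min(candidate_hz, max_hz)

# Fast movement: candidate 10 + 20 * 1.5 = 40 Hz, clamped to the max.
print(select_refresh_rate(candidate_rate(1.5), max_hz=30.0))  # 30.0
# Standing still: the 10 Hz candidate is already below the max.
print(select_refresh_rate(candidate_rate(0.0), max_hz=30.0))  # 10.0
```

The clamp is what distinguishes the claim language ("a minimum of the candidate rate and a maximum rate") from simply using the candidate rate directly.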
As to claim 8, Marchenko shows that the candidate environment-computation refresh rate is determined based at least on the user movement information and the environment evolution information (i.e. object detection, step 502, and gesture type detection, step 504, Fig. 5 and paras. 112 – 120).

As to claim 11, Marchenko shows that the candidate environment-computation refresh rate is determined based at least in part on the user movement information (Figs. 4B and 4C and paras. 103 and 109 – 111).

As to claim 12, Marchenko shows obtaining information indicating a threshold environment-computation refresh rate and a specified action (i.e. determining whether frame rate beyond minimum is required and interpreting the detected gesture, Figs. 4B and 4C and paras. 103 and 109 – 111); and in response to a determination that the selected environment-computation refresh rate is less than the threshold environment-computation refresh rate, performing the specified action (i.e. interpreting the detected gesture, Fig. 5 and paras. 112 – 120).

As to claim 13, Marchenko shows that the candidate environment-computation refresh rate is calculated based at least in part on metadata received in a scene description file (i.e. complex movement data, Fig. 4C and paras. 110 and 111).

As to claim 16, Marchenko shows that the scene description data further includes at least one weight value (i.e. complex movement data, for example) for use in calculating the environment-computation refresh rate (Fig. 4C and paras. 110 and 111).

As to claim 17, Marchenko shows that the scene description data further includes, for the at least one weight value, information identifying a factor associated with the respective weight value (i.e. complex movement data, Fig. 4C and paras. 110 and 111).

As to claim 18, Marchenko shows that the factor comprises movement information (i.e. complex movement data, Fig. 4C and paras. 110 and 111).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3, 5, 10, 14, 22 and 24 are rejected under 35 U.S.C. 103 as being unpatentable over Marchenko in view of Tanner et al. (Pub. No. 2021/0360155 A1).

As to claims 3 and 22, Marchenko does not show obtaining scene description data describing an extended reality scene, and presenting the scene in the user's environment. Tanner shows the method of obtaining scene description data describing an extended reality scene, and presenting the scene in the user's environment (i.e. augmented reality, paras. 146, 152 and 194). It would have been obvious to one of ordinary skill in the art at the time of filing to modify the teachings of Marchenko with those of Tanner because designing the system in this way allows the device to conform to modern cloud gaming platforms (para. 194).

As to claim 14, Marchenko shows a method comprising: providing scene description data for a virtual reality experience (Figs. 4A – 4C and 5 and para. 102), wherein the scene description data includes: information indicating a threshold value of an environment-computation refresh rate (i.e. minimum frame rate, for example, Figs. 4B and 4C and paras. 103 and 109 – 111), information indicating a nominal value of the environment-computation refresh rate (i.e. higher/highest frame rate, for example, Figs. 4B and 4C and paras.
103 and 109 – 111), and information indicating at least one specified action to be performed in response to the environment-computation refresh rate falling below the threshold value (i.e. interpreting the detected gesture, Fig. 5 and paras. 112 – 120). Marchenko does not show obtaining scene description data describing an extended reality scene, and presenting the scene in the user's environment. Tanner shows the method of obtaining scene description data describing an extended reality scene, and presenting the scene in the user's environment (i.e. augmented reality, paras. 146, 152 and 194). It would have been obvious to one of ordinary skill in the art at the time of filing to modify the teachings of Marchenko with those of Tanner because designing the system in this way allows the device to conform to modern cloud gaming platforms (para. 194).

As to claims 5 and 24, Marchenko does not show that the selection of the environment-computation refresh rate is made on a periodic basis. Tanner shows that the selection of the environment-computation related data is made on a periodic basis (Fig. 12A and para. 182). It would have been obvious to one of ordinary skill in the art at the time of filing to modify the teachings of Marchenko with those of Tanner because designing the system in this way allows the device to save power by only performing detection periodically.

As to claim 10, Marchenko does not show determining the maximum environment-computation refresh rate based at least in part on the network condition information. Tanner shows the method of altering display characteristics based on network condition information (i.e. bandwidth, Figs. 9A and 9B and paras. 166 and 167). It would have been obvious to one of ordinary skill in the art at the time of filing to modify the teachings of Marchenko with those of Tanner because designing the system in this way allows the device to provide good quality video and a better user experience (para. 167).
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Marchenko in view of Heinz, II et al. (Pub. No. 2017/0195390 A1).

As to claim 6, Marchenko shows that the candidate environment-computation refresh rate is determined using parameters representing at least two of the factors including user movement and environment evolution (i.e. object detection, step 502, and gesture type detection, step 504, Fig. 5 and paras. 112 – 120). Marchenko does not show a weighted sum of parameters. Heinz shows the process of establishing priority of display of objects based on weighted values (Figs. 7 and 8 and paras. 56 – 58 and 71 – 77). It would have been obvious to one of ordinary skill in the art at the time of filing to modify the teachings of Marchenko with those of Heinz because designing the system in this way allows the device to dynamically adjust scene quality based on priorities of the user (para. 78).

Allowable Subject Matter

Claim 7 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. Specifically, claim 7 recites "… receiving metadata indicating weights used in the weighted sum." The prior art does not show this configuration; therefore, this claim contains allowable subject matter.

CONCLUSION

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CARL ADAMS whose telephone number is (571) 270-7448. The examiner can normally be reached Monday - Friday, 9AM - 5PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ke Xiao, can be reached at 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CARL ADAMS/
Examiner, Art Unit 2627
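Two limitations discussed in the office action above, claim 6's weighted sum of factor parameters (with claim 7's received weights deemed allowable) and claim 12's action triggered when the selected rate falls below a threshold, can be sketched together. The factor names, weight values, and fallback action here are illustrative assumptions, not taken from the application or the cited art:

```python
from typing import Callable

def weighted_candidate_rate(factors: dict[str, float],
                            weights: dict[str, float]) -> float:
    """Claims 6/7: candidate environment-computation refresh rate as
    a weighted sum of factor parameters; per claim 7, the weights
    would be received as metadata rather than fixed in the device."""
    return sum(weights[name] * value for name, value in factors.items())

def apply_threshold(selected_hz: float, threshold_hz: float,
                    action: Callable[[], None]) -> None:
    """Claim 12: perform the specified action when the selected rate
    is less than the threshold rate."""
    if selected_hz < threshold_hz:
        action()

factors = {"user_movement": 0.8, "environment_evolution": 0.2}
weights = {"user_movement": 25.0, "environment_evolution": 15.0}
candidate = weighted_candidate_rate(factors, weights)
print(candidate)  # 0.8*25 + 0.2*15 = 23.0
apply_threshold(candidate, threshold_hz=30.0,
                action=lambda: print("fallback: degrade AR overlay"))
```

On this reading, the distinction the examiner found allowable is narrow: not the weighted sum itself (taught by Heinz), but receiving the weights as metadata.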

Prosecution Timeline

May 14, 2025
Application Filed
Jan 10, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601813
VIRTUAL TOUCH INTERACTION FOR ANY DISPLAY DEVICES USING RADAR
2y 5m to grant Granted Apr 14, 2026
Patent 12591131
LIGHT SOURCE DEVICE, CONTROL METHOD, AND COMPUTER-READABLE RECORDING MEDIUM
2y 5m to grant Granted Mar 31, 2026
Patent 12591330
Electronic Devices With Display and Touch Sensor Structures
2y 5m to grant Granted Mar 31, 2026
Patent 12582388
SYSTEM AND APPARATUS FOR REMOTE INTERACTION WITH AN OBJECT
2y 5m to grant Granted Mar 24, 2026
Patent 12582382
SYSTEM AND APPARATUS FOR REMOTE INTERACTION WITH AN OBJECT
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 71% (88% with interview, +17.1%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 780 resolved cases by this examiner. Grant probability derived from career allow rate.
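The headline figures are consistent with simple arithmetic on the career data, assuming (as the note above suggests) that the grant probability is the career allow rate and that the with-interview figure is that rate plus the observed lift:

```python
granted, resolved = 556, 780
career_allow = granted / resolved      # ~0.713 -> shown as "71%"
with_interview = career_allow + 0.171  # ~0.884 -> shown as "88%"
print(round(career_allow * 100), round(with_interview * 100))  # 71 88
```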
