Prosecution Insights
Last updated: April 19, 2026
Application No. 18/612,348

GENERATING THREE-DIMENSIONAL USER INTERFACES BASED ON TWO-DIMENSIONAL USER INTERFACES

Non-Final OA — §102, §103
Filed: Mar 21, 2024
Examiner: DEBROW, JAMES J
Art Unit: 2174
Tech Center: 2100 — Computer Architecture & Software
Assignee: Adeia Guides Inc.
OA Round: 1 (Non-Final)
Grant Probability: 70% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 3m
Grant Probability With Interview: 95%

Examiner Intelligence

Career Allow Rate: 70% — above average (351 granted / 504 resolved; +14.6% vs TC avg)
Interview Lift: +25.7% on resolved cases with interview — strong
Typical Timeline: 3y 3m average prosecution
Career History: 529 total applications across all art units; 25 currently pending

Statute-Specific Performance

§101: 11.1% (-28.9% vs TC avg)
§102: 23.9% (-16.1% vs TC avg)
§103: 52.5% (+12.5% vs TC avg)
§112: 6.4% (-33.6% vs TC avg)
Deltas are vs the Tech Center average estimate. Based on career data from 504 resolved cases.
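The headline figures in the cards above can be cross-checked against the career data, assuming the dashboard rounds the raw allow rate and treats the interview lift as additive in percentage points (both are assumptions about the tool's methodology, which it does not state):

```python
granted, resolved = 351, 504           # career record: 351 granted of 504 resolved

allow_rate = granted / resolved * 100  # ~69.6%, displayed rounded as 70%
with_interview = allow_rate + 25.7     # additive-lift assumption: ~95.3%, displayed as 95%
```

Under those assumptions the displayed 70% and 95% both reproduce exactly.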

Office Action

Rejections: §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

This Office Action is responsive to: Application filed 21 Mar. 2024. Claims 1-8, 11-20, 23, and 24 are pending in this case. Claims 1 and 13 are independent claims.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-8, 11, 13-20, and 23 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bucior et al. (Pub. No.: US 2025/0148698 A1; Filed: Nov. 2, 2023) (hereinafter “Bucior”).

Regarding independent claims 1 and 13, Bucior discloses a method, comprising: accessing a two-dimensional (2D) user interface (UI) comprising a plurality of 2D UI elements (0045-0046; 0059; 0079; 0087-0088; 0096); generating a plurality of three-dimensional (3D) UI elements of a 3D UI based on each of the plurality of 2D UI elements (0045-0046; 0059; 0079; 0087-0088; 0096); identifying a respective location of each of a plurality of geometric shapes in a surrounding physical environment of an extended reality device using at least one sensor (0046; 0073; 0113; 0123; 0132); determining a priority for each 3D UI element of the plurality of 3D UI elements (0060-0062; 0078); mapping each 3D UI element of the plurality of 3D UI elements to a respective geometric shape of the plurality of geometric shapes based on: (a) the respective location of each of the plurality of geometric shapes in the surrounding physical environment (0046; 0073; 0113; 0123; 0132), and (b) the respective priority for each 3D UI element of the plurality of 3D UI elements (0060-0062; 0078); and generating for display, using the extended reality device, each 3D UI element to overlay the mapped respective geometric shape (0023; 0045; 0047-0049).

Regarding dependent claims 2 and 14, Bucior discloses the method of claims 1 and 13 respectively, wherein the priority for each 3D UI element of the plurality of 3D UI elements is determined based on a frequency of interaction with the 3D UI element (0060-0062; 0078).

Regarding dependent claims 3 and 15, Bucior discloses the method of claims 1 and 13 respectively, wherein the priority for each 3D UI element of the plurality of 3D UI elements is determined based on a spatial relationship of a respective 2D UI element to other 2D UI elements in the 2D UI (0045; 0059; 0079; 0087-0088).

Regarding dependent claims 4 and 16, Bucior discloses the method of claims 1 and 13 respectively, wherein the priority for each 3D UI element of the plurality of 3D UI elements is determined based on a functionality of the 3D UI element (0045-0046; 0085).

Regarding dependent claims 5 and 17, Bucior discloses the method of claims 1 and 13 respectively, further comprising: sorting each 3D UI element into one of a plurality of subsets based on a functionality of the 3D UI element (0078; 0110; 0116); and sorting each of the plurality of geometric shapes into one of a plurality of subsets based on the respective location (0046; 0063; 0069; 0078; 0110; 0116), wherein: mapping each 3D UI element to the respective geometric shape is further based on (c) the plurality of subsets of the 3D UI elements, and (d) the plurality of subsets of the geometric shapes (0045-0046; 0059; 0079-0080; 0095-0096).
Regarding dependent claims 6 and 18, Bucior discloses the method of claims 1 and 13 respectively, wherein mapping each 3D UI element to a respective geometric shape further comprises: determining an orientation of each of the plurality of geometric shapes (0062; 0064; 0085; 0087; 0091; 0095; 0096); and adjusting an orientation of each 3D UI element to match the orientation of the respective geometric shape (0062; 0064; 0085; 0087; 0091; 0095; 0096).

Regarding dependent claims 7 and 19, Bucior discloses the method of claims 1 and 13 respectively, further comprising: determining a prevalent color of each of the plurality of geometric shapes (0065; 0071; 0088; 0110-0111); and determining a prevalent color of each 3D UI element (0065; 0071; 0088; 0110-0111), wherein: mapping each 3D UI element to the respective geometric shape is further based on a contrast ratio of the prevalent color of the 3D UI elements to the prevalent color of the geometric shapes exceeding a contrast threshold (0065; 0071; 0088; 0110-0111).

Regarding dependent claims 8 and 20, Bucior discloses the method of claims 1 and 13 respectively, further comprising: identifying an object type and a respective location of each of a plurality of objects in the surrounding physical environment (0046; 0073; 0113; 0123; 0132); and identifying a functionality of each 3D UI element (0045-0046; 0085), wherein: mapping each 3D UI element to the respective geometric shape is further based on (c) the object type of each of the plurality of objects having a respective location within a proximity threshold of the respective geometric shape, and (d) the respective functionality of each 3D UI element of the plurality of 3D UI elements (0046; 0060-0062; 0078; 0073; 0113; 0123; 0132).
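Claims 7 and 19 condition the mapping on a contrast ratio between the prevalent colors of a 3D UI element and a geometric shape exceeding a threshold. The claims do not specify a formula, so as an illustration only, the WCAG 2.x relative-luminance contrast ratio is one standard way to compute such a value (an assumption; neither the application nor Bucior necessarily uses this metric, and the 4.5 threshold below is a placeholder borrowed from WCAG AA text guidance):

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color given as 0-255 ints."""
    def channel(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio, from 1.0 (identical colors) to 21.0 (black vs white)."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def passes_threshold(element_color, shape_color, threshold=4.5):
    # Gate the element-to-shape mapping on sufficient color contrast,
    # as claims 7/19 recite; 4.5 is a placeholder threshold.
    return contrast_ratio(element_color, shape_color) >= threshold
```

For example, a white element over a black shape yields the maximum ratio of 21.0 and would pass, while two similar grays would fail.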
Regarding dependent claims 11 and 23, Bucior discloses the method of claims 1 and 13 respectively, further comprising: identifying an update to a 2D UI element of the 2D UI (0045-0046; 0059; 0079; 0087-0088; 0096); generating an updated 3D UI element of the plurality of 3D UI elements based on the updated 2D UI element (0045-0046; 0059; 0079; 0087-0088; 0096); and in response to generating the updated 3D UI element: re-mapping each 3D UI element of the plurality of 3D UI elements to a respective geometric shape of the plurality of geometric shapes based on the respective priority for each 3D UI element of the plurality of 3D UI elements (0046; 0073; 0113; 0123; 0132); and generating for display, using the extended reality device, each 3D UI element to overlay the mapped respective geometric shape (0023; 0045; 0047-0049).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 12 and 24 are rejected under 35 U.S.C. 103 as being unpatentable over Bucior in view of Williams et al. (Pub. No.: US 2025/0148698 A1; Filed: Nov. 2, 2023) (hereinafter “Williams”).
Regarding dependent claims 12 and 24, Bucior does not expressly disclose the method of claims 1 and 13 respectively, wherein: accessing the 2D UI further comprises accessing a first website of a domain; generating the plurality of 3D UI elements of the 3D UI based on each of the plurality of 2D UI elements comprises: determining whether any of the 3D UI elements are stored in a cache of a non-transitory memory associated with the domain; and in response to determining a portion of the 3D UI elements are stored in the cache: retrieving the portion of the 3D UI elements stored in the cache; and generating the remaining 3D UI elements; and the method further comprises: accessing a second website of the domain; and retrieving at least one of the portion of the 3D UI elements stored in the cache.

Williams discloses accessing the 2D UI further comprises accessing a first website of a domain (0120; 0133); generating the plurality of 3D UI elements of the 3D UI based on each of the plurality of 2D UI elements (0129-0130; 0133) comprises: determining whether any of the 3D UI elements are stored in a cache of a non-transitory memory associated with the domain (0123; 0137); and in response to determining a portion of the 3D UI elements are stored in the cache: retrieving the portion of the 3D UI elements stored in the cache (0123; 0137); and generating the remaining 3D UI elements (0123; 0137); and the method further comprises: accessing a second website of the domain (0120; 0133); and retrieving at least one of the portion of the 3D UI elements stored in the cache (0123; 0137).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to combine Williams with Bucior for the benefit of allowing a user to access, manipulate, and position 3D virtual objects in an environment presented by image display devices.

NOTE: It is noted that any citations to specific pages, columns, lines, or figures in the prior art references and any interpretation of the reference should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. See MPEP 2123.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAMES J DEBROW whose telephone number is (571) 272-5768. The examiner can normally be reached on 09:00 - 06:00. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Bashore, can be reached on 571-272-4088. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from Patent Center and the Private Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from Patent Center or Private PAIR. Status information for unpublished applications is available through Patent Center or Private PAIR to authorized users only. Should you have questions about access to Patent Center or the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.

/James J Debrow/
Primary Patent Examiner
Art Unit 2174
571-272-5768
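The independent claims recite an algorithmic core: rank the 3D UI elements by priority, locate geometric shapes in the physical environment with sensors, and map elements to shapes based on both priority and location. As a minimal sketch of one such scheme (the names, data shapes, and the greedy nearest-shape heuristic are all illustrative choices of mine, not something the claims or Bucior prescribe):

```python
from dataclasses import dataclass

@dataclass
class UIElement3D:
    name: str
    priority: int      # higher = more important (e.g., interaction frequency)

@dataclass
class Shape:
    name: str
    distance: float    # meters from the XR device, per sensor data

def map_elements_to_shapes(elements, shapes):
    """Greedy mapping: the highest-priority element overlays the nearest shape."""
    by_priority = sorted(elements, key=lambda e: e.priority, reverse=True)
    by_distance = sorted(shapes, key=lambda s: s.distance)
    return {e.name: s.name for e, s in zip(by_priority, by_distance)}

elements = [UIElement3D("settings", 1), UIElement3D("play", 5), UIElement3D("volume", 3)]
shapes = [Shape("wall panel", 2.0), Shape("tabletop", 0.8), Shape("shelf", 1.2)]
mapping = map_elements_to_shapes(elements, shapes)
# "play" (highest priority) is assigned the nearest shape, "tabletop"
```

Real systems would use richer placement criteria (the dependent claims add orientation, color contrast, and nearby-object constraints), but this captures the priority-by-location pairing the independent claims describe.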

Prosecution Timeline

Mar 21, 2024 — Application Filed
Jan 24, 2026 — Non-Final Rejection, §102 and §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602148 — SYSTEM AND METHOD FOR DIAGNOSIS AND ANALYSIS OF MEDICAL IMAGES — granted Apr 14, 2026 (2y 5m to grant)
Patent 12591734 — BULK ENVELOPE MANAGEMENT IN DOCUMENT MANAGEMENT SYSTEM — granted Mar 31, 2026 (2y 5m to grant)
Patent 12572731 — INFORMATION PROCESSING METHOD AND APPARATUS, TERMINAL, AND STORAGE MEDIUM — granted Mar 10, 2026 (2y 5m to grant)
Patent 12566917 — SERVER APPARATUS AND CLIENT APPARATUS — granted Mar 03, 2026 (2y 5m to grant)
Patent 12561357 — DOCUMENT PROCESSING METHOD AND APPARATUS, DEVICE, AND MEDIUM — granted Feb 24, 2026 (2y 5m to grant)
Based on this examiner's 5 most recent grants with similar technology; study what changed in each to get past this examiner.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 70%
With Interview: 95% (+25.7%)
Median Time to Grant: 3y 3m
PTA Risk: Low
Based on 504 resolved cases by this examiner. Grant probability derived from career allow rate.
