Prosecution Insights
Last updated: April 19, 2026
Application No. 18/345,624

Adjustable Immersion Level for Content

Final Rejection — §103, §112
Filed
Jun 30, 2023
Examiner
CRADDOCK, ROBERT J
Art Unit
2618
Tech Center
2600 — Communications
Assignee
Apple Inc.
OA Round
2 (Final)
84%
Grant Probability
Favorable
3-4
OA Rounds
2y 4m
To Grant
99%
With Interview

Examiner Intelligence

Grants 84% — above average
84%
Career Allow Rate
519 granted / 616 resolved
+22.3% vs TC avg
Moderate +14% lift
+14.4%
Interview Lift
with vs. without interview, among resolved cases
Typical timeline
2y 4m
Avg Prosecution
27 currently pending
Career history
643
Total Applications
across all art units

Statute-Specific Performance

§101
11.1%
-28.9% vs TC avg
§103
39.6%
-0.4% vs TC avg
§102
24.3%
-15.7% vs TC avg
§112
12.4%
-27.6% vs TC avg
Black line = Tech Center average estimate • Based on career data from 616 resolved cases

Office Action

§103 §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments filed 10/15/2025

Applicant's arguments filed 10/15/2025 have been fully considered but are not persuasive. The arguments are directed toward limitations that were previously recited in the alternative (see MPEP 2173.05(h)) but are no longer expressed in the alternative, thus changing the scope of the claim. Because the arguments are directed toward limitations that are addressed by a different reference, they are not persuasive.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 25 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 25 recites, "The electronic device defined in claim 1, wherein the portal is presented as world-locked content," which is considered indefinite. The limitation recites world-locked content in a way that leaves doubt as to what is being claimed: namely, what the portal is and how the world-locked content relates to the portal. As written, it is not clear whether the world-locked content is the environment that is accessed once one enters the portal, or content presented by the portal itself. That is, does the portal present the world-locked content to the user before they enter the portal, or only after?

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 2, 4-10, 12-18, and 20-24 are rejected under 35 U.S.C. 103 as being unpatentable over Long et al. (US Patent No. 9,573,062 B1) in view of Song (US 20240082718 A1) in further view of Singh et al. (US 20190005717 A1).

Regarding claim 1, Long teaches an electronic device comprising: [...] one or more displays (Fig. 1, multiple displays); one or more processors (see col. 25 lines 1-8, "one or more processors"); and memory storing instructions configured to be executed by the one or more processors, the instructions for (see col. 25 lines 1-8, "The computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computing device or computer, and that, when read and executed by one or more processors in the computer, cause the computer to perform operations necessary to execute elements involving the various aspects of the invention."): obtaining content comprising a representation of a three-dimensional environment (see col. 12 line 41 - col. 13 line 8: different 3D environments such as various video games; col. 13 lines 29-55: describes a 3D-360 viewing mode); presenting, using at least the one or more displays, the content in an extended reality environment in a first viewing mode, wherein in the first viewing mode the content is presented based on a viewpoint positioned within the representation of the three-dimensional environment (see col. 13 lines 29-55: the viewing mode is the 3D-360 mode); obtaining, [...], user input (col. 3 lines 1-35: user input); and in response to the user input, presenting, using at least the one or more displays, the content in the extended reality environment in a second viewing mode, wherein in the second viewing mode the content is presented based on a viewpoint that is external to the representation of the three-dimensional environment (col. 21 line 29 - col. 22 line 39: describes a minimap, which is considered a second viewing mode; the minimap adjusts based upon user input; see also Figs. 8, 14, and 15, showing a second viewing mode external to the first viewing mode), but does not explicitly disclose: one or more sensors; obtaining, via the one or more sensors, user input; and wherein the second viewing mode comprises a portal viewing mode in which the content is presented as a portal into the three-dimensional environment.

[media_image1.png] Fig. 8: shows a minimap, which can be a second view.

[media_image2.png] Fig. 14: shows a minimap, which can be a second view.

[media_image3.png] Fig. 15: shows a second viewing mode that is in 2D and external to the representation of the three-dimensional environment.

Song teaches one or more sensors; obtaining, via the one or more sensors, user input (¶676, "The gyroscope sensor 5512 may detect a body direction and a rotation angle of the terminal 5500. The gyroscope sensor 5512 may cooperate with the acceleration sensor 5511 to acquire a 3D action by the user on the terminal 5500. The processor 5501 may implement the following functions according to data acquired by the gyroscope sensor 5512: motion sensing (for example, the UI is changed according to a tilt operation of a user), image stabilization during shooting, game control, and inertial navigation.").

Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Long in view of Song, as doing so would be applying the known techniques of Song, that is, sensors and inputs (Song ¶676), to a known device of Long ready for improvement to yield predictable results.

Long in view of Song does not explicitly disclose wherein the second viewing mode comprises a portal viewing mode in which the content is presented as a portal into the three-dimensional environment.
Singh teaches wherein the second viewing mode comprises a portal viewing mode in which the content is presented as a portal into the three-dimensional environment (see Fig. 6 with annotation below; see ¶27, ¶48, ¶49, ¶53, ¶55).

Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Long in view of Song in further view of Singh, as doing so would be applying the known techniques of Singh, that is, a second viewing mode comprising a portal viewing mode in which the content is presented as a portal into the three-dimensional environment (see Fig. 6 with annotation below; see ¶27, ¶48, ¶49, ¶53, ¶55), to a known device of Long in view of Song ready for improvement to yield predictable results.

[media_image4.png] Fig. 6: with additional annotations pointing out the 3D environment before the portal and the 3D environment the portal provides access to.

Regarding claim 2, Long in view of Song in further view of Singh teaches the electronic device defined in claim 1, wherein the first viewing mode comprises a virtual reality viewing mode in which the viewpoint is repositioned responsive to rotational movements detected by the one or more sensors (see Song ¶91, three-dimensional mode; ¶676, rotation inputs).

Regarding claim 4, Long in view of Song in further view of Singh teaches the electronic device defined in claim 1, wherein the portal is presented on a subset of a full screen of the one or more displays (see Singh Fig. 6 with annotation above; see ¶27, ¶48, ¶49, ¶53, ¶55: a portal is a subset of the full screen; see MPEP 2173.05(h)).

Regarding claim 5, Long in view of Song in further view of Singh teaches the electronic device defined in claim 1, wherein presenting the content in the extended reality environment in the second viewing mode further comprises presenting a view of a physical environment in which the electronic device is located from the viewpoint that is external to the representation of the three-dimensional environment (see Long Figs. 14 and 15).

Regarding claim 6, Long in view of Song in further view of Singh teaches the electronic device defined in claim 1, wherein the content is a live multiuser communication session (see Long col. 13 line 56 - col. 14 line 21: livestreaming, live-broadcast) or a replay of a multiuser communication session (see col. 14 line 22 - col. 15 line 15: multiple replay examples).

Regarding claim 7, Long in view of Song in further view of Singh teaches the electronic device defined in claim 1, wherein the content is a hierarchical multiuser communication session (see Long col. 6 lines 24-32: presenter/broadcaster or spectator).

Regarding claim 8, Long in view of Song in further view of Singh teaches the electronic device defined in claim 1, wherein the user input comprises a touch input (see Song Fig. 9, showing touch input) or a button input (see Fig. 9, button input; see MPEP 2173.05(h)).

Claims 9, 10, 12-18, and 20-24 recite similar limitations to those of claims 1, 2, and 4-8 and thus are rejected under similar rationale as detailed above.

Claim 25 is rejected under 35 U.S.C. 103 as being unpatentable over Long et al. (US Patent No. 9,573,062 B1) in view of Song (US 20240082718 A1) in view of Singh et al. (US 20190005717 A1) in further view of King (US 20170169610 A1).

Regarding claim 25, Long in view of Song in further view of Singh teaches the electronic device defined in claim 1, but does not explicitly disclose wherein the portal is presented as world-locked content.
King teaches wherein the portal is presented as world-locked content (see King ¶46, "With reference now to FIG. 3, in one example Robin's holographic portal may comprise a holographic window 330 displayed to user Adam 320 via his HMD device 100. In some examples, the HMD device 100 may display the holographic window 330 at a world-locked location in Adam's living room 300, such as above the couch 334. In other examples the holographic portal may take the form of a virtual display such as a virtual television, a virtual miniaturized billboard, a floating blimp, etc. In other examples and as described in more detail below with respect to FIG. 6, a holographic portal may comprise displaying the visual representation of activity in the third party real world three dimensional environment as a holographic representation of at least a portion of such environment that appears outside of the user's real world three dimensional environment, such as beyond a wall that appears to have been partially removed." The portal is presented as world-locked content because the portal itself is world-locked; that is, the portal presentation is world-locked content).

Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Long in view of Song in view of Singh in further view of King, as presenting a portal with its content world-locked at a position provides a consistent location from which a user can view or enter another environment, thus providing a cohesive and memorable experience.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Wang et al. (US 8595299 B1), see abstract: "A computer-implemented method enables participation by a plurality of clients in a first multi-dimensional virtual environment and a second multi-dimensional virtual environment. A first client sets an object in the multi-dimensional virtual environment system to function as a portal to a second multi-dimensional virtual environment. The portal can be used by the first client and/or the second client to enter the second multi-dimensional virtual environment. A server system receives from the first client data indicating that a first object in the first multi-dimensional virtual environment has been set to function as a portal to the second multi-dimensional virtual environment, and receives from a second client participating in the first multi-dimensional virtual environment data indicating that the second client has invoked the first object. The server system transmits to the second client data representing objects in the second multi-dimensional virtual environment." Fig. 10 shows "Once participating in a first multi-dimensional virtual environment, a client 106 may want to enter into a second multi-dimensional virtual environment. FIG. 10 illustrates a process 1000 of a client having a 3D client application 110, using a portal from a first virtual environment to a second virtual environment according to some embodiments of the invention," per col. 15 lines 17-23, meaning a portal from a first 3D environment presents a second 3D environment.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT J CRADDOCK, whose telephone number is (571) 270-7502. The examiner can normally be reached Monday - Friday, 10:00 AM - 6:00 PM EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Devona E. Faulk, can be reached at 571-272-7515. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ROBERT J CRADDOCK/
Primary Examiner, Art Unit 2618

Prosecution Timeline

Jun 30, 2023
Application Filed
Aug 07, 2025
Non-Final Rejection — §103, §112
Oct 15, 2025
Response Filed
Jan 18, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597214
SCANNABLE CODES AS LANDMARKS FOR AUGMENTED-REALITY CONTENT
2y 5m to grant • Granted Apr 07, 2026
Patent 12597101
IMAGE TRANSMISSION SYSTEM, IMAGE TRANSMISSION METHOD, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant • Granted Apr 07, 2026
Patent 12579767
AUGMENTED-REALITY SYSTEMS AND METHODS FOR GUIDED INSTALLATION OF MEDICAL DEVICES
2y 5m to grant • Granted Mar 17, 2026
Patent 12579792
ELECTRONIC DEVICE FOR OBTAINING IMAGE DATA RELATING TO HAND MOTION AND METHOD FOR OPERATING SAME
2y 5m to grant • Granted Mar 17, 2026
Patent 12555331
INFORMATION PROCESSING APPARATUS
2y 5m to grant • Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.
Powered by AI — typically takes 5-10 seconds

Prosecution Projections

3-4
Expected OA Rounds
84%
Grant Probability
99%
With Interview (+14.4%)
2y 4m
Median Time to Grant
Moderate
PTA Risk
Based on 616 resolved cases by this examiner. Grant probability derived from career allow rate.
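
As a rough illustration only (the tool's actual model is not disclosed here), the headline projections are consistent with a simple derivation from the career counts shown above: the 84% grant probability is the career allow rate (519 granted of 616 resolved), and the 99% with-interview figure follows if the +14.4% interview lift is treated as an additive adjustment. The additive-lift assumption is ours, not the tool's stated methodology:

```python
# Sketch of how the dashboard's headline figures could be derived from the
# examiner's career counts. The additive interview lift is an assumption
# about the methodology, not a confirmed formula.

granted, resolved = 519, 616          # career grants / resolved cases
allow_rate = granted / resolved       # ~0.8425, displayed as "84%"

interview_lift = 0.144                # "+14.4% Interview Lift"
with_interview = min(allow_rate + interview_lift, 1.0)  # ~0.9865, "99%"

print(f"Grant probability: {allow_rate:.0%}")     # Grant probability: 84%
print(f"With interview:    {with_interview:.0%}")  # With interview:    99%
```

Note that capping at 1.0 matters for high-allowance examiners, where an additive lift would otherwise exceed 100%.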
