Prosecution Insights
Last updated: April 19, 2026
Application No. 18/849,393

MULTILOCATION AUGMENTED REALITY

Non-Final OA (§102, §103, §112)
Filed
Sep 20, 2024
Examiner
PARK, HYORIM NMN
Art Unit
2615
Tech Center
2600 — Communications
Assignee
Pictorytale AS
OA Round
1 (Non-Final)
Grant Probability: 100% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 9m
With Interview: 99%

Examiner Intelligence

Grants 100% — above average
Career Allow Rate: 100% (1 granted / 1 resolved; +38.0% vs TC avg)
Interview Lift: +100.0% (strong; resolved cases with vs. without interview)
Typical timeline: 2y 9m avg prosecution; 9 currently pending
Career history: 10 total applications across all art units
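The headline figures in this block (career allow rate, interview lift) reduce to simple cohort arithmetic over resolved cases. A minimal sketch, assuming a hypothetical case-record shape (the `ResolvedCase` fields are illustrative, not the underlying data model):

```python
from dataclasses import dataclass

@dataclass
class ResolvedCase:
    granted: bool
    had_interview: bool

def allow_rate(cases):
    """Fraction of resolved cases that issued as patents."""
    return sum(c.granted for c in cases) / len(cases) if cases else 0.0

def interview_lift(cases):
    """Allow-rate gap between cases with and without an examiner interview."""
    with_iv = [c for c in cases if c.had_interview]
    without_iv = [c for c in cases if not c.had_interview]
    if not with_iv or not without_iv:
        return None  # lift is undefined unless both cohorts are populated
    return allow_rate(with_iv) - allow_rate(without_iv)

# A single resolved, granted case -- as in this report -- yields a 100%
# career allow rate and leaves the interview lift undefined.
cases = [ResolvedCase(granted=True, had_interview=True)]
print(f"Career allow rate: {allow_rate(cases):.0%}")  # Career allow rate: 100%
print(interview_lift(cases))                          # None
```

Note that with one resolved case, one of the two interview cohorts is necessarily empty, which is why single-case lift figures should be read with caution.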

Statute-Specific Performance

§101: 4.0% (-36.0% vs TC avg)
§103: 60.0% (+20.0% vs TC avg)
§102: 20.0% (-20.0% vs TC avg)
§112: 16.0% (-24.0% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 1 resolved case

Office Action

§102 • §103 • §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 09/20/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Specification

The disclosure is objected to because of the following informalities: paragraph 15 states “AR backend,” which should read “AR backend server” for consistency. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-7 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 1 recites “the detected object” in lines 5 and 8. However, the examiner is unclear as to applicant’s intended meaning. Does applicant mean generating a virtual representation of a detected predefined type of object, or is applicant referring to another object in the camera view? Thus, claim 1 is unclear and indefinite.

Claim 1 recites the limitation “the at least one other device” in line 9. There is insufficient antecedent basis for this limitation in the claim.

Claim 2 recites the limitation “the virtual object” in line 2. There is insufficient antecedent basis for this limitation in the claim.

Claim 7 recites “where processing of information relating to generation or updating of information”. However, claim 6, from which it depends, fails to positively recite a processing step or an updating step. Thus, claim 7 is unclear and indefinite.

For these reasons, independent claims 3-6 are rejected as well. Claims 1-7 will be examined as best understood by the examiner.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1 and 3-9 are rejected under 35 U.S.C. 102(a)(1)/(a)(2) as being anticipated by Miller et al. (US 20140306866 A1 – IDS REF) (hereinafter Miller).

Regarding claim 1, Miller discloses A method of synchronizing simultaneous rendering of the same augmented reality environment at multiple locations, comprising: (para. [0002], “The present invention generally relates to systems and methods configured to facilitate interactive virtual or augmented reality environments for one or more users.”)

obtaining a camera view of a local environment; (Figure 21; para. [0092], “The environment-sensing system 306 includes one or more sensors 312 for obtaining data from the physical environment around a user.”)

[media_image1.png, figure reproduced from Miller, omitted]

detecting a predefined type of object in the camera view of the local environment; (para. [0093], “The environment-sensing system 306 may be used for mapping one or more elements of the physical environment around the user by detecting and registering the local environment, including static objects, dynamic objects, people, gestures and various lighting, atmospheric and acoustic conditions.”)

generating a virtual representation of the detected object; maintaining an augmented reality environment by integrating the camera view of the local environment with the generated virtual representation of the detected object; and (Figure 21; para. [0106], “the interface display 303 may be substantially transparent, thereby allowing the user to view the local, physical environment, while various local, physical objects are displayed to the user as rendered physical objects”; para. [0098], “the virtual object 404 is an object created by the system 100, and displayed via the user interface 302”)

transmitting the generated virtual representation of the detected object and information relating to its integration into the augmented reality environment to the at least one other device. (claim 13, “wherein the processor is communicatively coupled to a computer network to transmit at least a portion of a virtual world data”; para. [0085], “The data stored in one or more servers 110 within the computing network 105 is, in one embodiment, transmitted or deployed at a high-speed, and with low latency, to one or more user devices 120”).

Regarding claim 3, Miller discloses wherein the predefined type of object is a person. (para. [0075], “Objects may be any type of animate or inanimate object, including but not limited to, buildings, plants, vehicles, people,”)

Regarding claim 4, Miller discloses further comprising generating information relating to the integration of the virtual representation of the person, including a description of movement performed by the person. (Figure 18; para. [0084], “The servers 110 within the computing network 105 also store computational state data for each of the digital worlds. The computational state data (also referred to herein as state data) may be a component of the object data, and generally defines the state of an instance of an object at a given instance in time. Thus, the computational state data may change over time and may be impacted by the actions of one or more users and/or programmers maintaining the system 100.”; para. [0171], “the system may be configured to present information deemed relevant to the scene presented, such as a message through the messaging interface (112) that agent 006 is approaching, along with visually-presented highlighting around the agent 006 character.”).

[media_image2.png, figure reproduced from Miller, omitted]

Regarding claim 5, Miller discloses wherein information transmitted to the at least one other device is used by the at least one other device to integrate or change an integration of a corresponding virtual object and a camera view of a local environment. (para. [0084], “As a user impacts the computational state data (or other data comprising the digital worlds), the user directly alters or otherwise manipulates the digital world. If the digital world is shared with, or interfaced by, other users, the actions of the user may affect what is experienced by other users interacting with the digital world”)

Regarding claim 6, Miller discloses wherein information from at least one device is transmitted to a server which is configured to forward the received information to the at least one other device. (para. [0071], “User devices are configured for communicating directly with computing network 105, or any of the servers 110. Alternatively, user devices 120 communicate with the remote servers 110, and, optionally, with other user devices locally, through a specially programmed, local gateway 140 for processing data and/or for communicating data between the network 105 and one or more local user devices 120.”)

Regarding claim 7, Miller discloses where processing of information relating to generation or updating of information is distributed between the at least one device, the server, and the at least one other device. (para. [0071], “User devices are configured for communicating directly with computing network 105, or any of the servers 110. Alternatively, user devices 120 communicate with the remote servers 110, and, optionally, with other user devices locally, through a specially programmed, local gateway 140 for processing data and/or for communicating data between the network 105 and one or more local user devices 120.”; para. [0084], “As a user impacts the computational state data (or other data comprising the digital worlds), the user directly alters or otherwise manipulates the digital world. If the digital world is shared with, or interfaced by, other users, the actions of the user may affect what is experienced by other users interacting with the digital world”)

Regarding claim 8, Miller discloses A device configured to maintain a description of an augmented reality environment by: (para. [0022], “the head-mounted user display device renders a display image associated with at least a portion of the virtual world data”), and similar reasoning as discussed in claim 1 is applied.

Regarding claim 9, Miller discloses being further configured to: receive information relating to the integration of the virtual object from the at least one other device, and update the augmented reality environment in accordance with the received information. (para. [0076], “a digital world can allow for others to create or modify objects. Once an object is instantiated, the state of the object may be permitted to be altered, controlled or manipulated by one or more users experiencing a digital world.”)

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Miller et al. (US 20140306866 A1 – IDS REF) (hereinafter Miller) in view of Valli et al. (WO 2017177019 A1) (hereinafter Valli).

Regarding claim 2, Miller discloses wherein the information relating to the integration of the virtual object includes information describing a change (para. [0084], “The computational state data (also referred to herein as state data) may be a component of the object data, and generally defines the state of an instance of an object at a given instance in time.”). Miller does not explicitly disclose in at least one of a position and an orientation of the virtual object. However, Valli more explicitly teaches in at least one of a position and an orientation of the virtual object. (para. [0012], “in at least one of a position and an orientation of the virtual object.”) As both Miller and Valli are from the same field of endeavor, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include in at least one of a position and an orientation of the virtual object, in the context of augmented reality, by Miller according to the teaching of Valli in order for a system to identify position and orientation information consistently based on the changes in the environment (Valli, para. [0004]).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Hyorim Park whose telephone number is (571) 272-3859. The examiner can normally be reached Monday - Friday.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington, can be reached at (571) 272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Hyorim Park/
Examiner, Art Unit 2615

/ALICIA M HARRINGTON/
Supervisory Patent Examiner, Art Unit 2615

Prosecution Timeline

Sep 20, 2024
Application Filed
Mar 19, 2026
Non-Final Rejection — §102, §103, §112 (current)


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 100%
With Interview: 99% (+100.0%)
Median Time to Grant: 2y 9m
PTA Risk: Low
Based on 1 resolved case by this examiner. Grant probability derived from career allow rate.
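A raw allow rate from a single resolved case necessarily saturates at 100%, so projections like the one above are often tempered by shrinking toward a broader baseline. A minimal sketch of one such approach, Laplace-style smoothing toward the Tech Center average (the prior weight is an illustrative assumption; the 62% TC rate is implied by the +38.0% vs-TC-avg delta shown earlier, not an official figure):

```python
def smoothed_grant_probability(grants, resolved, tc_avg_rate, prior_weight=10):
    """Shrink a small-sample allow rate toward the Tech Center average.

    Equivalent to a Beta prior with mean tc_avg_rate and strength prior_weight:
    the fewer resolved cases, the closer the estimate stays to the TC average.
    """
    return (grants + prior_weight * tc_avg_rate) / (resolved + prior_weight)

# One grant out of one resolved case, shrunk toward a 62% TC average:
p = smoothed_grant_probability(grants=1, resolved=1, tc_avg_rate=0.62)
print(f"{p:.1%}")  # 65.5% -- far more conservative than the raw 100%
```

As the number of resolved cases grows, the smoothed estimate converges to the examiner's own observed rate, so the adjustment only matters for thin records like this one.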
