Prosecution Insights
Last updated: April 19, 2026
Application No. 18/396,682

CROSS REALITY SYSTEM FOR LARGE SCALE ENVIRONMENTS

Final Rejection: §102, §103, Double Patenting
Filed: Dec 26, 2023
Examiner: HARRISON, CHANTE E
Art Unit: 2615
Tech Center: 2600 (Communications)
Assignee: Magic Leap Inc.
OA Round: 4 (Final)
Grant Probability: 69% (Favorable)
Expected OA Rounds: 5-6
Estimated Time to Grant: 3y 4m
Grant Probability with Interview: 97%

Examiner Intelligence

Career Allow Rate: 69% (497 granted / 725 resolved; +6.6% vs Tech Center average; above average)
Interview Lift: +28.8% on resolved cases with an interview (strong)
Typical Timeline: 3y 4m average prosecution; 30 applications currently pending
Career History: 755 total applications across all art units

Statute-Specific Performance

§101: 8.9% (-31.1% vs TC avg)
§103: 40.3% (+0.3% vs TC avg)
§102: 31.8% (-8.2% vs TC avg)
§112: 15.2% (-24.8% vs TC avg)
Tech Center averages are estimates. Based on career data from 725 resolved cases.
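The dashboard arithmetic above can be reproduced directly from the raw counts. A minimal sketch; note that the per-statute deltas are consistent with a flat Tech Center estimate of 40.0%, which is inferred from the four deltas, not stated in the source:

```python
# Career allow rate from the raw counts shown above.
granted, resolved = 497, 725
allow_rate = round(100 * granted / resolved, 1)  # 68.6, displayed as 69%

# Per-statute rates and their deltas vs the Tech Center average.
# A flat 40.0% TC estimate reproduces every delta on the card
# (the 40.0 is an inference, not a figure stated in the source).
examiner = {"101": 8.9, "103": 40.3, "102": 31.8, "112": 15.2}
tc_avg = 40.0
deltas = {s: round(r - tc_avg, 1) for s, r in examiner.items()}
# deltas -> {"101": -31.1, "103": 0.3, "102": -8.2, "112": -24.8}
```

That all four deltas fall out of a single TC figure suggests the card compares each statute against one aggregate average rather than per-statute averages.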

Office Action

§102 §103 §DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

1. This action is responsive to communications: Amendment & Request for Reconsideration, filed on 11/18/2025. This action is made FINAL.

2. Claims 1-20 are pending in the case. Claims 1, 9 and 17 are independent claims. Claims 1, 9 and 17 have been amended.

Response to Arguments

Applicant's arguments filed November 18, 2025 have been fully considered but they are not persuasive.

Claim Rejections – Double Patenting

Applicant requests the rejection be held in abeyance until a notice of allowance is received. In response, Applicant's request is acknowledged. Accordingly, the double patenting rejection is maintained.

Claim Rejections – 35 U.S.C. § 102

Applicant argues (claim 17) Mohan fails to teach wherein computing a plurality of candidate localizations is computed for each collection of features of the plurality of snapshots using a two-step process including a rough localization and a refined localization, wherein the rough localization is performed on subsets of features in each collection of features.

In response, Mohan discloses identifying a nearest key frame based on coarse spatial information (Para 331), uses frame descriptors to determine whether a new image matches frames selected as being associated with a nearby persistent pose (Para 332), and discloses that identifying a matching image frame includes performing feature matching against 3D features in the maps that correspond to the identified nearest key frames (Para 333). Thus, Mohan teaches wherein computing a plurality of candidate localizations is computed for each collection of features of the plurality of snapshots using a two-step process including a rough localization and a refined localization, wherein the rough localization is performed on subsets of features in each collection of features.
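The coarse-to-fine pattern the examiner maps onto Paras 331-333 (shortlist key frames with a compact frame descriptor, then feature-match only against the shortlist) can be illustrated generically. This is a hedged sketch of the general technique, not code from Mohan or the application; the `keyframes` record layout and the descriptor/feature representations are invented for illustration:

```python
import numpy as np

def localize_two_step(query_desc, query_feats, keyframes,
                      shortlist_k=2, match_tol=0.3):
    """Two-step localization: a rough step on compact per-frame
    descriptors, then a refined step that feature-matches only the
    shortlisted key frames. Returns the best-matching key frame id."""
    # Rough localization: rank key frames by distance between global
    # frame descriptors (one vector per frame, not the full feature set).
    ranked = sorted(keyframes,
                    key=lambda kf: np.linalg.norm(query_desc - kf["desc"]))
    shortlist = ranked[:shortlist_k]

    # Refined localization: nearest-neighbour matching of individual
    # features against each shortlisted frame; keep the most matches.
    def n_matches(feats_a, feats_b):
        return sum(1 for f in feats_a
                   if min(np.linalg.norm(f - g) for g in feats_b) < match_tol)

    return max(shortlist, key=lambda kf: n_matches(query_feats, kf["feats"]))["id"]
```

The point of the rough step is that it operates on one compact vector per stored frame, keeping the expensive per-feature matching off most of the map.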
Accordingly, the rejection of claims 18-20 is maintained for at least the reasons provided with respect to their base claim 17.

Claim Rejections – 35 U.S.C. § 103

Applicant argues claims 1, 9 and 16 are allowable for at least the reasons provided with respect to claim 17. In response, claims 1, 9 and 16 are not allowable, at least based on the rationale provided in the above response to arguments for claim 17.

Applicant requests withdrawal of the 35 U.S.C. § 103 rejection of claims 2-8 and 10-15 because the teaching of Huo in combination with Mohan and Gallagher does not disclose the features of their respective base claims 1, 9 and 16. In response, the rejection of claims 2-8 and 10-15 is maintained based on the above rationale as applied to the corresponding base claims 1, 9 and 16, in combination with the teaching of Huo as identified in the following rejection.

To the extent that the response to the applicant's arguments may have mentioned new portions of the prior art references which were not used in the prior office action, this does not constitute a new ground of rejection. It is clear that the prior art reference is of record and has been considered entirely by applicant. See In re Boyer, 363 F.2d 455, 458 n.2, 150 USPQ 441, 444 n.2 (CCPA 1966) and In re Bush, 296 F.2d 491, 496, 131 USPQ 263, 267 (CCPA 1961). The mere fact that additional portions of the same reference may have been mentioned or relied upon does not constitute a new ground of rejection. In re Meinhardt, 392 F.2d 273, 280, 157 USPQ 270, 275 (CCPA 1968).

Information Disclosure Statement

The information disclosure statement filed April 15, 2024 fails to comply with the provisions of 37 CFR 1.97, 1.98 and MPEP § 609 because the IDS is not filed in compliance with 37 CFR 1.98(a)(1)(ii) and (iii) using form PTO/SB/08. It has been placed in the application file, but the information referred to therein has not been considered as to the merits.
Applicant is advised that the date of any re-submission of any item of information contained in this information disclosure statement or the submission of any missing element(s) will be the date of submission for purposes of determining compliance with the requirements based on the time of filing the statement, including all certification requirements for statements under 37 CFR 1.97(e). See MPEP § 609.05(a).

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159.
See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1, 3, 5-10 and 16-18 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 4-6, 9-12 and 14 of U.S. Patent No. 11,900,547. Although the claims at issue are not identical, they are not patentably distinct from each other because each generates a localization of a portable device based on a plurality of candidate localizations.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to conclude that the invention defined in the claims at issue would have been an obvious variation of the invention defined in a claim in the patent, because the patent determines a plurality of candidate localizations using transformations that align a coordinate frame orientation to the direction of gravity.

Application 18/396,682    Patent 11,900,547
Claim 1                   Claims 1, 4, 5
Claim 3                   Claim 5
Claims 5-8                Claim 6
Claim 9                   Claims 9, 11
Claim 10                  Claim 11
Claim 16                  Claim 10
Claim 17                  Claim 12
Claim 18                  Claim 14

The following shows an example of the corresponding conflicting claims of the current application and U.S. Patent No. 11,900,547.

Application 18/396,682, claim 1:

A system that supports specification of a position of virtual content relative to one or more persisted maps in a database of persisted maps, the system comprising: a localization service configured to receive from a portable electronic device a localization request comprising a batch of snapshots, each snapshot of the batch of snapshots comprising a collection of features posed with respect to a coordinate frame local to the portable electronic device and a vector indicating an estimated direction of gravity by the portable electronic device in the coordinate frame local to the portable electronic device; wherein the localization service comprises at least one processor configured to execute computer-executable instructions, the computer-executable instructions implementing a localization component, the localization component configured to: compute a plurality of candidate localizations for the batch of snapshots by constraining degrees of freedom based on the vectors indicating the estimated directions of gravity of the batch of snapshots to determine transformations that each aligns the collection of features of a respective snapshot in the coordinate frame local to the portable electronic device with features of a persisted map in the database of persisted maps in a persisted coordinate frame of the persisted map; and generate a localization of the portable electronic device based on consensus among the plurality of candidate localizations.

Patent 11,900,547, claim 1:

An XR system that supports specification of a position of virtual content relative to one or more persisted maps in a database of persisted maps, the XR system comprising a localization service configured to receive from a portable electronic device information about a plurality of collections of features in images of a three-dimensional (3D) environment, the information comprising positions for the features of the plurality of collections of features expressed in a coordinate frame, wherein: the localization service comprises at least one processor configured to execute computer-executable instructions, the computer-executable instructions implementing a localization component, the localization component configured to: compute a plurality of candidate localizations for the plurality of collections of features by, for each of the plurality of collections of features, performing a process to determine as a candidate localization a transformation between the collection of features and a portion of a persisted map in the database of persisted maps, wherein: the portion of the persisted map has an associated estimated direction of gravity; and the process of determining is constrained, based on an orientation of the coordinate frame, to determine transformations that align the coordinate frame with the associated estimated direction of gravity by: fixing two rotational degrees of freedom based, at least in part, on the orientation of the coordinate frame with respect to the associated estimated direction of gravity, and computing one rotational degree of freedom and three translational degrees of freedom based, at least in part, on the fixed two rotational degrees of freedom; and generate a localization of the portable electronic device based on consensus among the plurality of candidate localizations.

Patent 11,900,547, claim 4:

The XR system of claim 1, wherein the orientations of the coordinate frames for the plurality of collections of features with respect to the estimated directions of gravity comprise vectors indicating the estimated directions of gravity.

Application 18/396,682, claim 1 (corresponding feature): wherein computing a plurality of candidate localizations is computed for each collection of features of the plurality of snapshots using a two-step process including a rough localization and a refined localization, wherein the rough localization is performed on subsets of features in each collection of features.

Patent 11,900,547, claim 5: computing a candidate localization for a collection of features comprises: performing rough localization of the collection of features with respect to the persisted maps in the database of persisted maps, wherein rough localization comprises computing a rough transformation of the portable electronic device with respect to a persisted map in the database of persisted maps, and performing refined localization of the collection of features with respect to the persisted map, wherein refined localization of the collection of features comprises computing a candidate localization of the portable electronic device based on the rough transformation.

*Bold type above indicates the feature of claim 1 in the current application that corresponds to a dependent claim of the Patent (11,900,547).
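The degree-of-freedom bookkeeping recited in the patent claim (fix roll and pitch from gravity, then solve one yaw angle and three translations) is a standard trick in gravity-aware point registration, and it can be sketched concretely. This is illustrative only, under simplifying assumptions I have added (noise-free, already-paired correspondences; map gravity along -z); it is not the claimed implementation:

```python
import numpy as np

def gravity_aligned_pose(src, dst, g_src, g_dst=np.array([0.0, 0.0, -1.0])):
    """Rigid transform src -> dst constrained so that gravity in the
    source frame (g_src) maps onto map gravity (g_dst). Aligning the
    gravity vectors fixes two rotational DOF; only yaw about gravity
    plus a 3D translation remain to be estimated from correspondences."""
    a = g_src / np.linalg.norm(g_src)
    b = g_dst / np.linalg.norm(g_dst)
    v, c = np.cross(a, b), float(np.dot(a, b))
    if np.isclose(c, -1.0):                      # antiparallel: 180 deg about x
        R_align = np.diag([1.0, -1.0, -1.0])
    else:                                        # Rodrigues-style alignment
        vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
        R_align = np.eye(3) + vx + vx @ vx / (1.0 + c)

    p = src @ R_align.T                          # "leveled" source points
    pc, qc = p - p.mean(0), dst - dst.mean(0)
    # Closed-form yaw about the gravity axis (1-DOF Kabsch in the xy plane).
    yaw = np.arctan2(np.sum(pc[:, 0] * qc[:, 1] - pc[:, 1] * qc[:, 0]),
                     np.sum(pc[:, 0] * qc[:, 0] + pc[:, 1] * qc[:, 1]))
    cy, sy = np.cos(yaw), np.sin(yaw)
    R = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]]) @ R_align
    t = dst.mean(0) - src.mean(0) @ R.T          # 3 translational DOF
    return R, t
```

Searching over one angle instead of a full 3D rotation is what makes the constrained candidate search cheaper and better conditioned, which is the practical upside of the limitation both claims recite.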
Application 18/396,682, claim 9:

A system that supports specification of a position of virtual content relative to persisted maps in a database of persisted maps, the system comprising: a localization service configured to receive from a portable electronic device a localization request comprising a batch of snapshots, each snapshot of the batch of snapshots comprising a collection of features posed in a coordinate frame local to the portable electronic device having a dimension aligned with respect to an estimated direction of gravity; wherein: the localization service comprises at least one processor configured to execute computer-executable instructions, the computer-executable instructions implementing a localization component, the localization component configured to: compute a plurality of candidate localizations for the batch of snapshots by, for each snapshot of the batch of snapshots, performing a process to determine as a candidate localization a transformation between the collection of features of the snapshot and a portion of a persisted map in the database of persisted maps, wherein: the portion of the persisted map has an associated estimated direction of gravity and the process of determining is constrained to determine transformations that align the direction of gravity for the collection of features of the snapshot with the associated estimated direction of gravity of the portion of the persisted map; and generate a localization of the portable electronic device based on consensus among the plurality of candidate localizations.

Patent 11,900,547, claim 9:

An XR system that supports specification of a position of virtual content relative to persisted maps in a database of persisted maps, the XR system comprising a localization service configured to receive from a portable electronic device information about a plurality of collections of features in images of a three-dimensional (3D) environment, the information comprising positions for the features of the plurality of collections of features expressed in a coordinate frame, wherein an estimated direction of gravity is sent to the portable electronic device, wherein the localization service comprises at least one processor configured to execute computer-executable instructions, the computer-executable instructions implementing a localization component, the localization component configured to: compute a plurality of candidate localizations for the plurality of collections of features by, for each of the plurality of collections of features, performing a process to determine as a candidate localization a transformation between the collection of features and a portion of a persisted map in the database of persisted maps, wherein: the portion of the persisted map has an associated estimated direction of gravity; and the process of determining is constrained to determine transformations that align the estimated direction of gravity for the collection of features with the associated estimated direction of gravity of the portion of the persisted map; and generate a localization of the portable electronic device based on consensus among the plurality of candidate localizations, wherein the process of determining the transformations of the candidate localizations is constrained by: fixing two rotational degrees of freedom based, at least in part, on an orientation of the respective coordinate frame with respect to the estimated direction of gravity, and computing one rotational degree of freedom and three translational degrees of freedom based, at least in part, on the fixed two rotational degrees of freedom.

*Bold type above indicates differences in the claim.
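Both independent claims end by generating a localization from "consensus among the plurality of candidate localizations". One common way to realize consensus over candidate poses is to keep the largest cluster of mutually agreeing candidates and average it. A hypothetical sketch; the (yaw, translation) pose parameterization and the tolerance values are my assumptions, not the claimed method:

```python
import numpy as np

def consensus_pose(candidates, yaw_tol=np.deg2rad(10.0), trans_tol=0.25):
    """candidates: list of (yaw, t) poses, t a length-3 array.
    Returns the average pose of the largest cluster of candidates
    that mutually agree within the tolerances, plus the cluster size."""
    def agrees(a, b):
        # wrap the yaw difference into (-pi, pi] before comparing
        dyaw = np.arctan2(np.sin(a[0] - b[0]), np.cos(a[0] - b[0]))
        dt = np.linalg.norm(np.asarray(a[1]) - np.asarray(b[1]))
        return abs(dyaw) <= yaw_tol and dt <= trans_tol

    best = max(([c for c in candidates if agrees(c, anchor)]
                for anchor in candidates), key=len)
    yaws = np.array([y for y, _ in best])
    yaw = float(np.arctan2(np.sin(yaws).mean(), np.cos(yaws).mean()))
    t = np.mean([np.asarray(t) for _, t in best], axis=0)
    return yaw, t, len(best)
```

A lone outlier candidate only ever agrees with itself, so it is outvoted by any cluster of two or more consistent candidates, which is the robustness property a consensus step buys.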
Application 18/396,682 (corresponding feature): wherein computing a plurality of candidate localizations is computed for each collection of features of the plurality of snapshots using a two-step process including a rough localization and a refined localization, wherein the rough localization is performed on subsets of features in each collection of features.

Patent 11,900,547, claim 5: computing a candidate localization for a collection of features comprises: performing rough localization of the collection of features with respect to the persisted maps in the database of persisted maps, wherein rough localization comprises computing a rough transformation of the portable electronic device with respect to a persisted map in the database of persisted maps, and performing refined localization of the collection of features with respect to the persisted map, wherein refined localization of the collection of features comprises computing a candidate localization of the portable electronic device based on the rough transformation.

*Bold type above indicates the feature of claim 1 in the current application that corresponds to a dependent claim of the Patent (11,900,547).
Application 18/396,682, claim 17:

An electronic device configured to operate within a system, the electronic device comprising: one or more sensors configured to capture information about a three-dimensional (3D) environment, the captured information comprising a plurality of images; and at least one processor configured to execute computer executable instructions, wherein the computer executable instructions comprise instructions for: extracting a plurality of collections of features from the plurality of images of the 3D environment; determining an estimated direction of gravity with respect to a coordinate frame local to the electronic device; aligning a dimension of the coordinate frame local to the electronic device with the estimated direction of gravity; expressing positions of the features of the plurality of collections of features in the aligned coordinate frame local to the electronic device; sending to a localization service of the system a plurality of snapshots, each snapshot of the plurality of snapshots comprising a collection of features posed in the aligned coordinate frame local to the electronic device; and receiving from the localization service a pose of the electronic device with respect to a persisted map in a database of persisted maps.

Patent 11,900,547, claim 12:

An electronic device configured to operate within a cross reality system, the electronic device comprising: one or more sensors configured to capture information about a three-dimensional (3D) environment, the captured information comprising a plurality of images; and at least one processor configured to execute computer executable instructions, wherein the computer executable instructions comprise instructions for: extracting a plurality of collections of features from the plurality of images of the 3D environment; determining an estimated direction of gravity with respect to a coordinate frame local to the electronic device; expressing positions of the features of the plurality of collections of features in the coordinate frame; sending to a localization service of the cross reality system information about the plurality of collections of features, the information indicating the positions of the features of the plurality of collections of features and the estimated direction of gravity with respect to the coordinate frame; and receiving from the localization service a pose of the electronic device with respect to a persisted map in a database of persisted maps, wherein determining the pose of the electronic device is constrained by fixing two rotational degrees of freedom based, at least in part, on an orientation of the respective coordinate frame with respect to the estimated direction of gravity, and computing one rotational degree of freedom and three translational degrees of freedom based, at least in part, on the fixed two rotational degrees of freedom such that the pose of the electronic device is received in less than 6 seconds after the sending to the localization service.

*Bold type above indicates differences in the claim.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 17-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Anush Mohan et al., US 2020/0051328 A1.

Independent claim 17: Mohan discloses an electronic device configured to operate within a system, the electronic device comprising: one or more sensors configured to capture information about a three-dimensional (3D) environment, the captured information comprising a plurality of images (i.e. a user wearing an AR display system rendering AR content as the user moves through a physical world environment, e.g. 3D environment — Para 151; Fig. 5A; the AR system includes sensors to capture information about the physical world — Para 126); and at least one processor configured to execute computer executable instructions (i.e. the local data processing module includes a processor and executes instructions — Para 143), wherein the computer executable instructions comprise instructions for: extracting a plurality of collections of features from the plurality of images of the 3D environment (i.e. the user moves through a physical world environment, e.g. 3D environment; the user positions the AR display system at positions for the AR display system to record ambient information of a passable world; the information is stored as poses in combination with images, features and other data — Para 151; Fig. 5A); determining an estimated direction of gravity with respect to a coordinate frame local to the electronic device (i.e.
the IMU determines the direction of gravitational force relative to the head unit — Para 220; head pose tracking represents a head pose of a user in a coordinate frame — Para 129; the head pose tracking component computes the relative position and orientation of the AR device to a physical object based on camera image information and IMU inertial information; the head pose tracking component computes a head pose of the AR device by comparing the relative position and orientation of the AR device to the physical object with features of the physical object — Para 130); aligning a dimension of the coordinate frame local to the electronic device with the estimated direction of gravity (i.e. aligning a head coordinate frame to gravity – Para 196); expressing positions of the features of the plurality of collections of features in the aligned coordinate frame local to the electronic device (i.e. persistent poses may be reflected as persistent coordinate frames (PCF); each may be associated with a map and a set of features or other information that a device uses to determine its orientation with respect to the PCF — Para 259); sending to a remote localization service of the system a plurality of snapshots, each snapshot of the plurality of snapshots comprising a collection of features posed in the aligned coordinate frame local to the electronic device (i.e. a server, e.g. remote localization service, receives spatial information provided by portable client devices – Para 25, 100-101; the local processing module executes instructions to generate the map and/or physical world representations — Para 144; persistent poses may be reflected as persistent coordinate frames, e.g. PCF; a PCF may be associated with a map and includes features a device can use to determine its orientation; a PCF includes a transformation defined with respect to the origin of its map; a device position correlated to a PCF enables determination of position with respect to objects in the physical world reflected in the map — Para 259; a tracking map may be built in a world coordinate frame including a world origin aligned to gravity — Para 196; the inertial measurement unit includes a gravitation sensor for determining the direction of gravitational force relative to the head unit — Para 129, 220; a PCF used locally by a device may have translations and rotations relative to a world coordinate frame of the device — Para 243); compute a plurality of candidate localizations for the plurality of snapshots (i.e. persistent poses may be reflected as persistent coordinate frames, e.g. PCF; a PCF may be associated with a map and includes features a device can use to determine its orientation; a PCF includes a transformation defined with respect to the origin of its map; a device position correlated to a PCF enables determination of position with respect to objects in the physical world reflected in the map — Para 259), wherein computing a plurality of candidate localizations is computed for each collection of features of the plurality of snapshots using a two-step process including a rough localization and a refined localization (i.e. comparing a new image frame at the current device location with stored image frames in connection with points in a map, e.g. the persistent coordinate frame; identifying the nearest key frame based on coarse spatial information and previously determined spatial information — Para 331; using frame descriptors to determine whether the new image matches frames selected as being associated with a nearby persistent pose — Para 332; identifying a matching image frame includes performing feature matching against 3D features in the maps that correspond to the identified nearest key frames — Para 333), wherein the rough localization is performed on subsets of features in each collection of features (i.e. identifying (Act 2304) one or more nearest key frames in a database comprising key frames used to generate one or more maps; a nearest key frame may be identified based on coarse spatial information… indicating that the XR device is in a geographic region – Para 331; determining whether the new image matches any of the frames selected as being associated with a nearby persistent pose based on a comparison with a subset of keyframes – Para 332; once a matching image frame is identified… performing (Act 2306) feature matching against 3D features in the maps that correspond to the identified nearest key frames, and computing (Act 2308) pose of the device worn by the user based on the feature matching results – Para 333); and receiving from the remote localization service a pose of the electronic device with respect to a persisted map in a database of persisted maps (i.e. the server, e.g. remote localization service, provides spatial information retrieved by portable client devices – Para 25, 100-101; when canonical maps are small, an XR device attempting to localize may filter the universe of canonical maps using Wi-Fi fingerprints and key frames; a comparison of two maps may result in identifying common persistent points, such as persistent poses or PCFs that appear in both the new map and the selected map, in which case descriptors may be associated with persistent points and may be compared — Para 360, 380).

Claim 18: Mohan discloses the electronic device of claim 17, wherein determining the estimated direction of gravity comprises: receiving the estimated direction of gravity from a map merge service of the system (i.e. map merge portion – Para 345; a tracking map may be built in a world coordinate frame, which may have a world origin that is aligned to gravity — Para 196; the IMU has a gravitation sensor for determining the direction of gravitational force — Para 220).

Claim 19: Mohan discloses the electronic device of claim 17, wherein determining the estimated direction of gravity comprises: sending at least a portion of the plurality of collections of features to the map merge service of the system such that the estimated direction of gravity is based on the at least a portion of the plurality of collections of features (i.e. surface data may be combined with data from a gravitational sensor to establish head pose).

Claim 20: Mohan discloses the electronic device of claim 17, wherein the plurality of collections of features comprises descriptors for individual features (i.e. computing feature descriptors for each feature of the one or more features – Para 431).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 9 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Anush Mohan et al., US 2020/0051328 A1, and further in view of Gallagher, US 2006/0078214.
Independent claim 1, Mohan discloses a system that supports specification of a position of virtual content relative to one or more persisted maps in a database of persisted maps, the system comprising: a remote localization service configured to receive from a portable electronic device a localization request comprising a batch of snapshots, each snapshot of the batch of snapshots comprising a collection of features posed with respect to a coordinate frame local to the portable electronic device (i.e. server, e.g. remote localization service, provides spatial information retrieved by portable client devices – Para 25, 100; as a user wearing an AR display system moves through the physical environment the AR display system records ambient information — Para 151; Fig. 5A - that is stored as poses and additional information, such as images and features — Para 137; representations for the physical world may be in different coordinate frames — Para 173), wherein: the localization service comprises at least one processor configured to execute computer-executable instructions, the computer-executable instructions implementing a localization component (i.e. server, e.g. remote localization service, receives and processes spatial information from portable client devices – Para 25, 100; the XR system may include components that, based on data about the physical world collected with sensors on user devices, develop, maintain, and use persistent spatial information, including one or more stored maps. These components may be distributed across the XR system or may operate at a remote location, such as at one or more servers – Para 104; local processing module executes instructions to generate the map and/or physical world representations — Para 144; a tracking map may be localized to the stored map, e.g. 
localization component — Para 145), the remote localization component configured to: compute a plurality of candidate localizations for the batch of snapshots by constraining degrees of freedom based on the vectors indicating the estimated directions of gravity of the batch of snapshots to determine transformations that each aligns the collection of features of a respective snapshot in the coordinate frame local to the portable electronic device with features of a persisted map in the database of persisted maps in a persisted coordinate frame of the persisted map (i.e. persistent poses may be reflected as persistent coordinate frames, e.g. PCF; a PCF may be associated with a map and includes features a device can use to determine its orientation; a PCF includes a transformation defined with respect to the origin of its map; device position correlated to a PCF enables determination of position with respect to objects in the physical world reflected in the map — Para 259), and wherein computing a plurality of candidate localizations is computed for each collection of features of the batch of snapshots using a two-step process including a rough localization and a refined localization (i.e. comparing a new image frame at the current device location with stored image frames in connection with points in a map, e.g. the persistent coordinate frame; identifying the nearest key frame based on coarse spatial information and previously determined spatial information — Para 331; use frame descriptors to determine whether the new image matches frames selected as being associated with a nearby persistent pose — Para 332; identifying a matching image frame includes performing feature matching against 3D features in the maps that correspond to the identified nearest key frames — Para 333); and generate a localization of the portable electronic device based on consensus among the plurality of candidate localizations (i.e. 
XR device receives canonical maps and attempts to localize into the received canonical maps — Para 357; the system may filter canonical maps based on global feature strings; when the viewing device localizes its tracking map to the canonical map, it may do so by matching the global feature strings of the local tracking map with the global feature strings of the canonical map — Para 358-359). Mohan fails to disclose each snapshot of the batch of snapshots comprising a vector indicating an estimated direction of gravity by the portable electronic device in the coordinate frame local to the portable electronic device, which Gallagher discloses (i.e. in image transformation vectors indicate the direction of gravity — abstract; gravity sensor determines the position of the direction of gravity and identifies the position of the camera with respect to the gravitational field — Para 37). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the known method of Mohan with the teaching of Gallagher for the purpose of producing a transform and applying the transform to output an improved image (Gallagher, Para 39). Independent claim 9, the claim is similar in scope to claim 1. Therefore, similar rationale as applied in the rejection of claim 1 applies herein. Claim 16, Mohan discloses the system of claim 9, further comprising a map merge service comprising: a map alignment component configured to determine an alignment between a tracking map of the portable electronic device and a persisted map in the database of persisted maps, the tracking map comprising at least a portion of the batch of snapshots (i.e. 
tracking maps may be merged with other maps of the environment — Para 344; merge processing merges the tracking map with some or all of the ranked environment maps and transmits new merged maps to a passable world model; maps depicting overlapping portions of the physical world may be aligned and aggregated into a final map — Para 345); and a gravity estimate component configured to compute an estimated direction of gravity for the tracking map of the portable electronic device based, at least in part, on an estimate of a direction of gravity with respect to the persisted map and the determined alignment between the tracking map and the persisted map (i.e. a tracking map may be built in a world coordinate frame, which may have a world origin that is aligned to gravity — Para 196; the IMU has a gravitation sensor for determining the direction of gravitational force — Para 220; persistent poses may be reflected as persistent coordinate frames (PCFs), which may be associated with a map and a set of features or other information that a device can use to determine its orientation with respect to the PCF; a PCF includes a transformation that is defined with respect to the origin of its map; a device can determine its position with respect to objects in the physical world reflected in the map by correlating its position to a PCF - Para 259), wherein the estimated direction of gravity for the tracking map is sent to the portable electronic device (i.e. a tracking map may be built in a world coordinate frame, which may have a world origin that is aligned to gravity — Para 196; the IMU has a gravitation sensor for determining the direction of gravitational force — Para 220). Claim(s) 2-8 and 10-15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Anush Mohan et al., US 2020/0051328 A1, in view of Gallagher, US 2006/0078214, as applied to claims 1 and 9, and further in view of Ke Huo et al., WO 2019/0221800 A1. 
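The gravity-estimate component described in the claim 16 rejection above (deriving a gravity direction for the tracking map from the persisted map's gravity and the tracking-to-persisted map alignment) can be sketched as follows. This is an illustrative sketch only: the function and variable names are assumptions, not drawn from Mohan or the claims, and the persisted map is assumed gravity-aligned as in Mohan's Para 196.

```python
import math

def rotate(R, v):
    """Apply a 3x3 rotation matrix (list of rows) to a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def estimate_tracking_map_gravity(R_persisted_to_tracking, g_persisted):
    """Rotate the persisted map's gravity direction into the tracking-map
    frame using the rotation part of the map alignment, then renormalize."""
    g = rotate(R_persisted_to_tracking, g_persisted)
    n = math.sqrt(sum(c * c for c in g))
    return [c / n for c in g]

# Hypothetical example: the persisted map is gravity-aligned (gravity along
# -y) and the alignment is a pure yaw about y, so gravity is unchanged in
# the tracking frame.
yaw = math.radians(30.0)
R_yaw = [[math.cos(yaw), 0.0, math.sin(yaw)],
         [0.0, 1.0, 0.0],
         [-math.sin(yaw), 0.0, math.cos(yaw)]]
g_tracking = estimate_tracking_map_gravity(R_yaw, [0.0, -1.0, 0.0])
```

Because a rotation about the gravity axis leaves the gravity vector fixed, a gravity-aligned persisted map transfers its gravity estimate to the tracking map through only the roll/pitch component of the alignment.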
Claim 2, Mohan in view of Gallagher disclose the system of claim 1. Mohan in view of Gallagher fails to disclose wherein, for each snapshot of the batch of snapshots: the vector indicating the estimated direction of gravity is represented as an offset between the estimated direction of gravity and the coordinate frame local to the portable device, which Huo discloses (i.e. a distance vector indicates the measured distance between devices as image frames are captured – Para 33, 42, 48, 50; the measurements are determined by a synchronization program that determines the rotational and translational transformations between reference frames having axes that are not aligned with gravity – Para 33, 37, 46). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Huo’s known method wherein, for each snapshot of the batch of snapshots: the vector indicating the estimated direction of gravity is represented as an offset between the estimated direction of gravity and the coordinate frame local to the portable device with the method of Mohan in view of Gallagher because determining the alignment of an axis of map/image reference frames relative to gravity indicates synchronization of frames and provides improvement of localized positioning (Huo, Para 58). Claim 3, Mohan discloses the system of claim 1, wherein: computing the plurality of candidate localizations for the batch of snapshots comprises: computing a rough transformation of the coordinate frame local to the portable electronic device with respect to the persisted coordinate frame of the persisted map (i.e. comparing a new image frame at the current device location with stored image frames in connection with points in a map, e.g. 
the persistent coordinate frame; identifying the nearest keyframe based on coarse spatial information and previously determined spatial information — Para 331), and computing the transformations that each aligns the collection of features of a respective snapshot in the coordinate frame local to the portable electronic device with features of the persisted map in the persisted coordinate frame based on the rough transformation (i.e. use frame descriptors to determine whether the new image matches frames selected as being associated with a nearby persistent pose — Para 332; identifying a matching image frame includes performing feature matching against 3D features in the maps that correspond to the identified nearest key frames — Para 333). Mohan in view of Gallagher fails to disclose that the rough transformation is computed by constraining the degrees of freedom based on respective vectors indicating the estimated directions of gravity, which Huo discloses (i.e. when a first coordinate axis of maps aligns without synchronization, the search domain can be reduced from six degrees of freedom to four degrees of freedom – Para 58). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Huo’s known method wherein the rough transformation is computed by constraining the degrees of freedom based on respective vectors indicating the estimated directions of gravity with the method of Mohan in view of Gallagher because determining that an axis of map reference frames coincides with gravity indicates the alignment of map frames without synchronization processing to provide the advantage of improved optimization (Huo, Para 58). Claim 4, Mohan in view of Gallagher disclose the system of claim 3. 
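The degree-of-freedom reduction Huo's Para 58 describes (and the claim 3 rejection relies on) can be sketched concretely: when both feature sets are aligned to gravity, roll and pitch are fixed, so a rough localization need only search one rotational (yaw) and three translational degrees of freedom instead of six. The grid-search strategy, names, and point data below are illustrative assumptions, not taken from either reference.

```python
import math

def yaw_rot(theta):
    # Rotation about the gravity (y) axis only: roll and pitch are fixed
    # because both point sets are already gravity-aligned.
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def apply(R, p):
    return [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]

def centroid(pts):
    return [sum(p[i] for p in pts) / len(pts) for i in range(3)]

def rough_localize(local_pts, map_pts, steps=360):
    """Coarse 4-DOF alignment (yaw + 3D translation) of gravity-aligned
    feature sets: grid-search yaw, solve translation from centroids, and
    score by summed nearest-neighbor distance."""
    best = None
    c_map = centroid(map_pts)
    for k in range(steps):
        theta = 2.0 * math.pi * k / steps
        rotated = [apply(yaw_rot(theta), p) for p in local_pts]
        c_loc = centroid(rotated)
        t = [c_map[i] - c_loc[i] for i in range(3)]
        moved = [[p[i] + t[i] for i in range(3)] for p in rotated]
        err = sum(min(math.dist(p, q) for q in map_pts) for p in moved)
        if best is None or err < best[0]:
            best = (err, theta, t)
    return best  # (residual, yaw, translation)

# Hypothetical example: the map cloud is the local cloud yawed 90 degrees
# and shifted by (1, 0, 0); the search should recover that transform.
local_pts = [(1.0, 0.0, 0.0), (0.0, 0.0, 2.0), (3.0, 0.0, 1.0), (0.0, 2.0, 0.0)]
map_pts = [(1.0, 0.0, -1.0), (3.0, 0.0, 0.0), (2.0, 0.0, -3.0), (1.0, 2.0, 0.0)]
residual, yaw, t = rough_localize(local_pts, map_pts)
```

Fixing the gravity axis shrinks the rotational search from SO(3) to a single angle, which is why the references treat gravity alignment as a substantial optimization of the rough localization step.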
However, Mohan in view of Gallagher fail to disclose wherein: the plurality of candidate localizations are computed without constraining the degrees of freedom based on respective vectors indicating the estimated directions of gravity, which Huo discloses (i.e. when maps having a first coordinate axis are aligned with gravity, the second and third axes, e.g. x-axis and z-axis, of each map form a horizontal plane and the first coordinate axes of the reference frames are aligned without synchronization – Para 46; where synchronization of frames determines the relative translational and rotational transformations between reference frames – Para 37). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Huo’s known method wherein: the plurality of candidate localizations are computed without constraining the degrees of freedom based on respective vectors indicating the estimated directions of gravity with the method of Mohan in view of Gallagher because determining that an axis of map reference frames coincides with gravity indicates the alignment of map frames without synchronization processing. Thus, the combination yields predictable results. Claims 5-7 and 12-14, the corresponding rationale as applied in the rejection of claims 3 and 10 apply herein. Claim 8, Mohan in view of Gallagher disclose the system of claim 7. However, Mohan in view of Gallagher fail to disclose wherein the rough transformation is computed by: fixing two rotational degrees of freedom based, at least in part, on the vectors indicating the estimated directions of gravity, and computing one rotational degree of freedom and three translational degrees of freedom based, at least in part, on the fixed two rotational degrees of freedom, which Huo discloses (i.e. 
set the rotation angles about the x-axis and the z-axis, determine a value for the rotation angle about the y-axis – Para 58 - and determine translational movements along the x, y, and z axes – Para 59). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Huo’s known method wherein the rough transformation is computed by: fixing two rotational degrees of freedom based, at least in part, on the vectors indicating the estimated directions of gravity, and computing one rotational degree of freedom and three translational degrees of freedom based, at least in part, on the fixed two rotational degrees of freedom with the method of Mohan in view of Gallagher because determining that an axis of map reference frames coincides with gravity indicates the alignment of map frames without synchronization processing to provide the advantage of improved optimization by fixing rotational angles (Huo, Para 58). Claim 10, the corresponding rationale as applied in the rejection of claim 3 applies herein. Claim 11, the corresponding rationale as applied in the rejection of claim 4 applies herein. Claim 15, the corresponding rationale as applied in the rejection of claim 8 applies herein. Conclusion Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. 
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHANTE HARRISON whose telephone number is (571)272-7659. The examiner can normally be reached Monday - Friday 8:00 am to 5:00 pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington can be reached on 571-272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /CHANTE E HARRISON/Primary Examiner, Art Unit 2615

Prosecution Timeline

Dec 26, 2023
Application Filed
Sep 06, 2024
Non-Final Rejection — §102, §103, §DP
Nov 26, 2024
Applicant Interview (Telephonic)
Nov 26, 2024
Examiner Interview Summary
Dec 09, 2024
Response Filed
Mar 25, 2025
Final Rejection — §102, §103, §DP
Jun 02, 2025
Response after Non-Final Action
Jun 16, 2025
Request for Continued Examination
Jun 17, 2025
Response after Non-Final Action
Oct 14, 2025
Non-Final Rejection — §102, §103, §DP
Nov 18, 2025
Response Filed
Mar 02, 2026
Final Rejection — §102, §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597213
GESTURE BASED TACTILE INTERACTION IN EXTENDED REALITY USING FORM FACTOR OF A PHYSICAL OBJECT
2y 5m to grant Granted Apr 07, 2026
Patent 12592043
Systems, Methods, and Graphical User Interfaces for Displaying and Manipulating Virtual Objects in Augmented Reality Environments
2y 5m to grant Granted Mar 31, 2026
Patent 12592045
AUGMENTED REALITY SYSTEM AND METHOD
2y 5m to grant Granted Mar 31, 2026
Patent 12586322
OPTICAL DEVICE FOR AUGMENTED REALITY HAVING GHOST IMAGE PREVENTION FUNCTION
2y 5m to grant Granted Mar 24, 2026
Patent 12561891
GRAPHICS PROCESSORS
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
69%
Grant Probability
97%
With Interview (+28.8%)
3y 4m
Median Time to Grant
High
PTA Risk
Based on 725 resolved cases by this examiner. Grant probability derived from career allow rate.
