Prosecution Insights
Last updated: April 19, 2026
Application No. 17/823,745

NAVIGATION PATHS FOR DIRECTING USERS TO LOCATIONS WITHIN A PHYSICAL RETAIL STORE USING EXTENDED REALITY

Non-Final OA: §101, §103
Filed
Aug 31, 2022
Examiner
EVANS, KIMBERLY L
Art Unit
3629
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Micron Technology, Inc.
OA Round
3 (Non-Final)
Grant Probability: 12% (At Risk)
OA Rounds: 3-4
To Grant: 7y 0m
With Interview: 26%

Examiner Intelligence

Grants only 12% of cases
Career Allow Rate: 12% (44 granted / 362 resolved; -39.8% vs TC avg)
Interview Lift: +13.4% (moderate lift, based on resolved cases with interview)
Avg Prosecution: 7y 0m (typical timeline; 27 currently pending)
Total Applications: 389 (career history, across all art units)

Statute-Specific Performance

§101: 30.6% (-9.4% vs TC avg)
§103: 39.8% (-0.2% vs TC avg)
§102: 9.3% (-30.7% vs TC avg)
§112: 16.6% (-23.4% vs TC avg)
Tech Center averages are estimates. Based on career data from 362 resolved cases.

Office Action

§101 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This Non-Final action is in reply to the request for continued examination filed 10/30/2025. Claims 1, 3, 13 and 21 have been amended. Claims 5, 15 and 24 have been cancelled. Claims 26-28 are new claims. Claims 1-4, 6-14, 16-23 and 25-28 are pending.

Request for Continued Examination

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant’s submission filed on 10/30/2025 has been entered.

Response to Arguments/Amendments

With respect to applicant’s arguments regarding the 35 U.S.C. 101 rejection, applicant argues that “a human mind cannot practically ‘download, from a server and prior to arriving at a physical retail store of the one or more physical retail stores, store mapping information associated with the physical retail store’ as recited in amended claim 1,” and that “downloading information from a server” and “store mapping information” cannot be done in the human mind and inherently require technology. Applicant then states that the additional features of amended claim 1 integrate the alleged abstract idea into a practical application because the claim is directed to improvements in the technical field of generating in-store navigation paths for display using an XR device. Applicant subsequently argues that “downloading improves the reliability of generating in-store navigation paths, such as by allowing for generating of in-store navigation paths within retail stores with limited, poor or even no network coverage… Therefore... the features of amended claim 1 provide an improvement to a technology or technical field.”

Applicant’s arguments have been reconsidered but are unpersuasive. Examiner maintains that the claims are directed to the abstract idea of receiving items to be purchased; identifying and ranking physical retail stores that carry the items; and providing an in-store navigation path within a physical retail store to direct a user from a current location to a next location based on items to be purchased in a computing environment. Examiner notes that while applicant argues that the features of amended claim 1 cannot practically be performed in the human mind, the analysis provided in the Advisory action (10/29/2025) and Final action (8/27/2025) evaluated the claims and determined that the claimed invention pertains to (i) commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising; marketing or sales activities or behaviors; business relations) and (ii) managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions); hence, the claims are directed to the certain methods of organizing human activity groupings of abstract ideas.
The steps for receiving an indication of items to be purchased; identifying one or more physical retail stores that carry the items; listing and ranking the one or more physical retail stores; detecting and determining a current location within a physical store and, based on the items and the current location, a next location; and providing an in-store navigation path to direct a user from the current location to the next location pertain to management of personal behavior, relationships and interactions between people (customer/user, merchants/providers and the like) according to rules and instructions to facilitate commercial or legal interactions for viewing of various image data, recommendation and purchasing purposes; hence, they are directed to the certain methods of organizing human activity groupings of abstract ideas identified above.

Further, the use of generic computing components (“extended reality device” (XR), “one or more components”, “an interface of the XR device”, “server”, “camera”, “object recognition employed by the camera of the XR device” [claims 1, 13], “a system” [claim 21]), merely used as tools for data gathering/analysis and manipulation to perform and automate the abstract idea, is not patent eligible. The recited computing components are all being used in a manner that is well-understood, routine and conventional, as previously known in the art, and are recited at a high level of generality (see applicant’s disclosure, ¶19: “a computing device (e.g., a mobile device, a tablet computer, or a desktop computer) may provide an application (or web browser) that allows the user to enter (e.g., via text) which items are to be purchased…The computing device may be able to communicate with the server and/or the XR device. The computing device may receive, via an interface of the computing device, an indication of items to be purchased”; ¶23: “the XR device 140 may receive, via the interface, an indication of the category, and the XR device 140 may transmit the indication of the category to the server 150”; ¶36: “the XR device 140 may determine, using a camera of the XR device 140, a current location within the physical retail store. The current location may be associated with the XR device 140 and/or the user wearing (or carrying) the XR device 140. The XR device 140, based on object recognition or other related techniques employed by the camera, may determine that the user is in a particular area of the physical retail store (e.g., an entryway or a particular aisle). The XR device 140 may detect an entryway sign, aisle sign numbers, etc. using object recognition, which may enable the XR device 140 to determine the current location within the physical retail store”; ¶56: “An XR device 305 may be capable of receiving, generating, storing, processing, providing, and/or routing information associated with directing users to locations within a physical retail store using XR, as described elsewhere herein. The XR device 305 may be a head-mounted device (or headset) or a mobile device. The XR device 305 may provide XR capabilities, which may include AR, MR, and/or VR. The XR device 305 may include various types of hardware, such as processors, sensors, cameras, input devices, and/or displays. The sensors may include accelerometers, gyroscopes, magnetometers, and/or eye-tracking sensors. The XR device 305 may include an optical head-mounted display, which may allow information to be superimposed onto a field of view”; ¶57: “The server 310 includes one or more devices capable of receiving, generating, storing, processing, providing, and/or routing information associated with directing users to locations within a physical retail store using XR, as described elsewhere herein. The server 310 may include a communication device and/or a computing device. For example, the server 310 may be an application server, a client server, a web server, a database server, a host server, a proxy server, a virtual server (e.g., executing on computing hardware), or a server in a cloud computing system”).

Further, the additional elements “eye-tracking camera” and “accelerometer” recited in new dependent claims 26-28 are also well-understood, routine and conventional components previously known in the art, used as tools for receiving, generating, storing, processing, providing and/or routing information associated with directing users to locations within a physical retail store using XR; see also applicant’s disclosure, ¶56. Hence, the claimed invention does not contain an inventive concept sufficient to transform the abstract nature of the claims into a patent-eligible application. Moreover, as it relates to applicant’s XR device used for communicating and transmitting data (a navigation path based on the location of a user), it fails to integrate the abstract idea into a practical application or provide significantly more than the abstract idea because it is merely standard computer technology and hardware/software components recited at a high level of generality; under their broadest reasonable interpretation, the claims include generic computer and networking components performing generic computer functions, such that they amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept.
Merely receiving user current location data and processing and generating a navigation path to a next location via an XR device is not an actual improvement to a computer, an improvement to the functioning of a computer, nor an improvement to the XR device itself. Instead, collecting and organizing data is mere automation of a manual process that was performed before computers and thus not an improvement to a computer (see MPEP 2106.05(a)(II)). In addition, the XR device is using old and well-known computing techniques to implement the judicial exception and fails to integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. In view of the above, the rejection under 35 U.S.C. 101 has been maintained.

With respect to the 35 U.S.C. 103 rejection, applicant argues the amended claims; Examiner has modified the rejection based on applicant’s amendments to further explain how the limitations are being interpreted, and has addressed each of the claim limitations as noted below in this Non-Final action.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-4, 6-14, 16-23 and 25-28 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to an abstract idea without significantly more. Claims 1-4, 6-14, 16-23 and 25-28 are directed to a system (machine) and a process (method/series of steps or acts to be performed); thus, each of the claims falls within one of the four statutory categories.
Step 2A-Prong 1: Claim 1 recites, in part, “one or more components configured to: receive, via an interface of the XR device, an indication of items to be purchased; identify one or more physical retail stores that carry the items; provide, via the interface, a list of the one or more physical retail stores, wherein the one or more physical retail stores are ranked based on one or more factors; download, from a server and prior to arriving at a physical retail store of the one or more physical retail stores, store mapping information associated with the physical retail store, wherein the store mapping information indicates a map of one or more store aisles of the physical retail store; detect that the XR device is within the physical retail store based on a geographic location associated with the XR device; detect, by a camera of the XR device, a store aisle of the one or more store aisles; determine, using object recognition employed by a camera of the XR device, a current location of the XR device within the physical retail store based on the store aisle; determine, using localized processing of the XR device and based on the items and the current location, a next location; and provide, via the interface, an in-store navigation path to direct a user of the XR device via overlayed audio-visual cues from the current location to the next location.” The underlined limitations above demonstrate that independent claim 1 is directed toward the abstract idea of receiving items to be purchased; identifying and ranking physical retail stores that carry the items; and determining and providing an in-store navigation path within a physical retail store to direct a user from a current location to a next location based on items to be purchased in a computing environment.
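For orientation, the limitations recited in claim 1 map onto a short, generic software flow. The sketch below is purely illustrative analyst commentary, not code from the application: every store name, item, and number is hypothetical, and it merely shows the kind of routine list-filtering and lookup logic the rejection characterizes as generic computing.

```python
# Hypothetical sketch of the claim 1 flow; all names and data are invented.

def rank_stores(stores, items):
    """Identify stores carrying all requested items, then rank them
    (here by distance, one of the 'one or more factors')."""
    carrying = [s for s in stores if items <= s["inventory"]]
    return sorted(carrying, key=lambda s: s["distance_km"])

def plan_next_location(store_map, items, current_aisle):
    """Pick the aisle nearest the current aisle that holds a wanted item
    (the 'determine ... a next location' step)."""
    candidates = {store_map[i] for i in items if i in store_map}
    return min(candidates, key=lambda aisle: abs(aisle - current_aisle),
               default=None)

stores = [
    {"name": "Store A", "distance_km": 2.0, "inventory": {"milk", "bread", "soap"}},
    {"name": "Store B", "distance_km": 0.5, "inventory": {"milk"}},
]
items = {"milk", "bread"}
ranked = rank_stores(stores, items)   # only Store A carries both items
store_map = {"milk": 4, "bread": 7}   # pre-downloaded map: item -> aisle number
nxt = plan_next_location(store_map, items, current_aisle=5)  # -> aisle 4
```

Nothing in the sketch depends on XR hardware; that generality is precisely the examiner's point in the Step 2A/2B analysis that follows.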
Applicant’s specification emphasizes a method/system related to identifying physical retail store(s) that carry certain items and for providing navigation paths for directing users to locations within the store whereby the customer (user) may indicate via an interface the items to be purchased. The extended reality (XR) device in conjunction with a server identifies physical retail stores that carry the items based on a category (or theme) associated with the items, user profile and/or real-time item inventory information. After the user arrives at one of the identified retail stores, the XR device provides via an interface an in-store navigation path to direct and/or guide the user to locations (through the aisles) within the physical retail store to locate where the items are held (¶11-¶14). Representative Claim 1 is considered an abstract idea because the claimed invention is directed to (i) commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations), and (ii) managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions). 
The steps for receiving an indication of items to be purchased; identifying one or more physical retail stores that carry the items; listing and ranking the one or more physical retail stores; detecting and determining a current location within a physical store and, based on the items and the current location, a next location; and providing an in-store navigation path to direct a user from the current location to the next location pertain to management of personal behavior, relationships and interactions between people (customer/user, merchants/providers and the like) with augmented reality devices, according to rules and instructions to facilitate commercial or legal interactions for viewing of various video/image data, recommendation and purchasing purposes, and are therefore directed to the certain methods of organizing human activity groupings of abstract ideas. Therefore, the claimed invention recites an abstract idea (see MPEP 2106.04(II)). Independent claims 13 and 21 recite substantially similar limitations as independent claim 1; therefore, they are also directed to the same abstract idea.
Step 2A-Prong 2: This judicial exception is not integrated into a practical application because the additional elements (“extended reality device” (XR), “one or more components”, “an interface of the XR device”, “server”, “camera”, “object recognition employed by the camera of the XR device” [claims 1, 13]; “a system” [claim 21]) merely provide an abstract-idea-based solution using data gathering and analysis, and provide instructions for commercial interactions (marketing or sales activities; business relations) and for organizing human interactions (receiving an indication of items to be purchased; identifying one or more physical retail stores that carry the items; listing and ranking the one or more physical retail stores; detecting and determining a current location within a physical store and, based on the items and the current location, a next location; and providing an in-store navigation path to direct a user from the current location to the next location), which pertain to management of personal behavior, relationships and interactions between people (customer/user, merchants/providers and the like) with augmented reality devices according to rules and instructions. The claims implement the abstract idea recited above utilizing the “extended reality device” (XR), “one or more components”, “an interface of the XR device”, “server”, “camera”, “object recognition employed by the camera of the XR device” [claims 1, 13], and “a system” [claim 21] as tools to perform the abstract idea, and generally link the abstract idea to a particular technological environment (see MPEP 2106.05(f)-(h)). These elements do not impose any meaningful limits on practicing the abstract idea (see MPEP 2106.05(g)).
Independent claim 1 fails to operate the recited “extended reality device” (XR), “one or more components”, “an interface of the XR device”, “server”, “camera”, “object recognition employed by the camera of the XR device” [claims 1, 13], and “a system” [claim 21] (which are merely standard computer technology and hardware/software components; see applicant’s disclosure, ¶55: “Fig. 3 is a diagram of an example environment 300 in which systems and/or methods described herein may be implemented. As shown in Fig. 3, environment 300 may include one or more XR devices 305, a server 310, and a network 315. Devices of environment 300 may interconnect via wired connections, wireless connections, or a combination of wired and wireless connections”; ¶56: “An XR device 305 may be capable of receiving, generating, storing, processing, providing, and/or routing information associated with directing users to locations within a physical retail store using XR, as described elsewhere herein. The XR device 305 may be a head-mounted device (or headset) or a mobile device. The XR device 305 may provide XR capabilities, which may include AR, MR, and/or VR. The XR device 305 may include various types of hardware, such as processors, sensors, cameras, input devices, and/or displays”; ¶57: “The server 310 includes one or more devices capable of receiving, generating, storing, processing, providing, and/or routing information associated with directing users to locations within a physical retail store using XR, as described elsewhere herein. The server 310 may include a communication device and/or a computing device… the server 310 may be an application server, a client server, a web server, a database server, a host server, a proxy server, a virtual server (e.g., executing on computing hardware), or a server in a cloud computing system. In some implementations, the server 310 includes computing hardware used in a cloud computing environment”) in any exceptional manner, and there is no evidence in the disclosure to suggest achieving an actual improvement in computer functionality itself, or an improvement in any specific computer technology, other than utilizing ordinary computational tools to automate and perform the abstract idea of receiving items to be purchased; identifying and ranking physical retail stores that carry the items; and providing an in-store navigation path within a physical retail store to direct a user from a current location to a next location based on items to be purchased in a computing environment (see MPEP 2106.05(a)). Accordingly, applicant has not shown an improvement or practical application under the guidance of MPEP 2106.04(d) or 2106.05(a). The description of these elements evidences that they are generic computing components used to perform generic functions. Hence, applicant’s limitations as recited above do nothing more than supplement the abstract idea using generic computer and networking components performing generic computer functions (receiving, identifying, providing, downloading, detecting, determining), such that they amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)) and to link the use of the judicial exception to a particular technological environment or field of use (see MPEP 2106.05(h)). Dependent claims 2-4, 6-12, 14, 16-20, 22, 23, and 25-28 fail to cure the deficiencies of the above-noted independent claims from which they depend and are therefore rejected under the same grounds. The dependent claims further recite the abstract idea without imposing any meaningful limits on practicing the abstract idea.
Dependent claims 2-4, 6-12, 14, 16-20, 22, 23, and 25-28 recite additional data gathering and processing steps (receiving, verifying, retrieving, determining, providing, identifying, capturing, transmitting, updating). For example, dependent claims 2-4 and 6-9 recite in part, “…wherein the one or more components are configured to…determine …identify …retrieve …detect”; claim 11 recites in part, “…wherein the in-store navigation path is a …”; claims 12 and 20 recite in part, “…wherein the XR device is a first XR device…”; claim 14 recites in part, “...further comprising: receiving, via the interface, a set of preferences associated with the user...”; claim 16 recites in part, “…further comprising: determining, based on store information associated with the physical retail store, …”; claims 17-19 recite in part, “…further comprising: detecting…”; claims 22 and 23 recite in part, “…wherein the one or more components of the server are configured to:…”; claim 25 recites in part, “…wherein the one or more components of the XR device are configured to…”; and claims 26-28 recite in part, “detect, using an eye-tracking camera…”. These limitations are still directed toward the abstract idea identified previously and are no more than mere instructions to apply the exception using a computer or with computing components. Therefore, the abstract idea is not integrated into any practical application. Thus, under Step 2A-Prong Two, the claims are directed to an abstract idea.
Step 2B: The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because, as discussed above with respect to integration of the abstract idea into a practical application, the additional elements “extended reality device” (XR), “one or more components”, “an interface of the XR device”, “server”, “camera”, “object recognition employed by the camera of the XR device” [claims 1, 13], and “a system” [claim 21] amount to no more than mere instructions to apply the exception using a generic computer component, which does not integrate a judicial exception into a practical application nor provide an inventive concept (significantly more than the abstract idea). The additional elements of the dependent claims, “second XR device” [claims 12, 20], “sensor of XR device” [claims 9, 19], and “eye-tracking camera of the XR device” and “accelerometer” [claims 26-28], only serve to further limit the abstract idea, utilizing the “extended reality device” (XR), “one or more components”, “an interface of the XR device”, “server”, “camera”, “object recognition employed by the camera of the XR device” [claims 1, 13], and “a system” [claim 21] as tools, and generally link the use of the abstract idea to a particular technological environment; hence, they are nonetheless directed toward fundamentally the same abstract idea as their respective independent claims, since they fail to impose any meaningful limits on practicing the abstract idea.
Further, giving the claim limitations their broadest reasonable interpretation in light of the specification, applicant’s “second XR device” [claims 12, 20], “sensor of XR device” [claims 9, 19], and “eye-tracking camera of the XR device” and “accelerometer” [claims 26-28] amount to no more than applying the judicial exception using generic computing components (software applications or programs (rules)) for carrying out the method/system steps, linking the use of the judicial exception to a computing environment. In this case, the “second XR device” [claims 12, 20], “sensor of XR device” [claims 9, 19], and “eye-tracking camera of the XR device” and “accelerometer” [claims 26-28] are broadly used to further process/transmit/communicate received data utilizing rules logic, and fail to integrate the abstract idea into a practical application. The capturing and transmitting of video and the receiving of communications are recited at a high level of generality, without technical implementation details of the operations to indicate how they improve computers or other technologies (see applicant’s specification, ¶11-¶14; ¶12: “providing navigation paths for directing users to locations within a physical retail store using extended reality (XR)”). The steps of capturing and transmitting video/image data and receiving communications via a second XR device are generic computer functions. Determining and detecting using a camera and/or sensor of the XR device are also generic computer functions; see applicant’s disclosure, ¶55: “environment 300 may include one or more XR devices 305, a server 310, and a network 315. Devices of environment 300 may interconnect via wired connections, wireless connections, or a combination of wired and wireless connections”; ¶56: “An XR device 305 may be capable of receiving, generating, storing, processing, providing, and/or routing information associated with directing users to locations within a physical retail store using XR, as described elsewhere herein.
The XR device 305 may be a head-mounted device (or headset) or a mobile device. The XR device 305 may provide XR capabilities, which may include AR, MR, and/or VR. The XR device 305 may include various types of hardware, such as processors, sensors, cameras, input devices, and/or displays. The sensors may include accelerometers, gyroscopes, magnetometers, and/or eye-tracking sensors”; ¶66: “Additionally, or alternatively, one or more components of the one or more XR devices (e.g., processor 420, memory 430, input component 440, output component 450, and/or communication component 460) may perform or may be configured to perform one or more process blocks of Fig. 5”. Further, the disclosure generically recites, regarding the XR device, ¶12: “providing navigation paths for directing users to locations within a physical retail store using extended reality (XR)” and “The XR device, in conjunction with a server, may identify physical retail stores (e.g., a single physical retail store or multiple physical retail stores) that carry the items … based on a category (or theme) associated with the items, user profile information, and/or real-time item inventory information …identify the physical retail stores based on distance, price, and other factors indicated in the user profile information… may identify recommended items that are related to the items indicated by the user …after the XR device (and the user carrying the XR device) arrive at a physical retail store, from the list of physical retail stores, the XR device may provide, via the interface, an in-store navigation path to direct the user via overlayed audio-visual cues to locations within the physical retail store at which the items are held”; ¶36: “The XR device 140, based on object recognition or other related techniques employed by the camera, may determine that the user is in a particular area of the physical retail store (e.g., an entryway or a particular aisle).
The XR device 140 may detect an entryway sign, aisle sign numbers, etc. using object recognition, which may enable the XR device 140 to determine the current location”; ¶39: “the XR device 140 may detect, via the camera, that the next item has been added to a shopping cart (e.g., the user physically places the next item into the shopping cart). The camera may use object recognition or other related techniques, such that when the user picks up the next item, the camera may scan the next item and determine that the next item corresponds to one of the items from the list”. “Claims are not saved from abstraction merely because they recite components more specific than a generic computer.” Here, the recited “extended reality device” (XR), “one or more components”, “an interface of the XR device”, “server”, “camera”, “object recognition employed by the camera of the XR device” [claims 1, 13], and “a system” [claim 21], together with the additional elements “second XR device” [claims 12, 20], “sensor of XR device” [claims 9, 19], and “eye-tracking camera of the XR device” and “accelerometer” [claims 26-28], similarly provide “a generic environment in which the claimed method is performed.” In re TLI Commc’ns LLC Patent Litig., 823 F.3d 607, 611 (Fed. Cir. 2016). Moreover, there is no improvement to the “extended reality device” (XR), “one or more components”, “an interface of the XR device”, “server”, “camera”, “object recognition employed by the camera of the XR device” [claims 1, 13], or “a system” [claim 21], and the additional elements (“second XR device” [claims 12, 20]; “sensor of XR device” [claims 9, 19]; “eye-tracking camera of the XR device”, “accelerometer” [claims 26-28]) amount to no more than applying the judicial exception using generic computing components, linking the use of the judicial exception to a computing environment.
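As context for the localization passages quoted above (¶36): in the rejection’s framing, sign-based positioning amounts to ordinary object-recognition output driving a table lookup against pre-downloaded store mapping information. The sketch below is illustrative commentary only; the labels, coordinates, and function are hypothetical and do not come from the application or the cited references.

```python
# Hypothetical sketch: a recognizer yields a sign label; the label indexes
# pre-downloaded "store mapping information" to give a coarse location.

AISLE_LOCATIONS = {
    "ENTRYWAY": (0, 0),    # invented (x, y) coordinates within the store
    "AISLE 4": (12, 3),
    "AISLE 7": (12, 9),
}

def locate(recognized_label: str):
    """Map a recognized sign label to a stored coordinate, if known."""
    return AISLE_LOCATIONS.get(recognized_label.strip().upper())

position = locate("Aisle 4")   # -> (12, 3)
unknown = locate("Exit 2")     # -> None: sign absent from the downloaded map
```

The lookup itself carries no XR-specific logic, which is the sense in which the action treats the camera and object recognition as generic data-gathering tools.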
Hence, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Accordingly, even when considered as a whole, the claims do not transform the abstract idea into a patent-eligible invention, since the claim limitations do not amount to a practical application of, or significantly more than, the abstract idea of receiving items to be purchased; identifying and ranking physical retail stores that carry the items; and providing an in-store navigation path within a physical retail store to direct a user from a current location to a next location based on items to be purchased in a computing environment. Hence, claims 1-4, 6-14, 16-23 and 25-28 are directed to non-statutory subject matter and are rejected as ineligible subject matter under 35 U.S.C. 101. See MPEP 2106.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-4, 6-14, 16-23 and 25-28 are rejected under 35 U.S.C. 103 as being unpatentable over Kaehler et al., US Patent Application Publication No. US 2017/0039613 A1, in view of Bronicki et al., US Patent Application Publication No. US 2021/0374836 A1.

With respect to claims 1 and 21, Kaehler discloses one or more components configured to: receive, via an interface of the XR device, an indication of items to be purchased (¶5: “An AR system may comprise a wearable AR device that is configured to capture information associated with an item for sale in a retail location or store. The wearable AR device (ARD) may be configured to monitor the movement and/or location of the particular item as the shopper moves around the store.
When one or more predetermined criteria are met, the item may be designated as “carried” or “purchased” …When the user selects a purchase option on the AR device while viewing or carrying the item, information associated with the item may be relayed to a remote server that may process the purchase transaction …a user's ARD may maintain a database that stores information about the items that are carried or purchased by the user, and the information in the database may be used to prompt the user to pay for unpurchased items carried by the user prior to leaving the retail location. An AR system may optionally provide information about purchasing alternatives to a shopper who is about to purchase an item or product (hereafter “the target”) in a physical retail location”; ¶33: “The AR system may also access a shopping list maintained by the user and cross-reference the items on the list with the product inventory of the retailer, and remind the user that they may wish to consider purchasing an item that is on their shopping list at that particular retail location”; ¶65: “FIGS. 5B and 5C depict another way in which an AR system may present an offer to a user. As depicted there, the ARD 500 may display an offer 512 above the shopping cart 510, where the offer 512 may provide suggested purchases based on the user's shopping list. The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. 
These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”) a server comprising one or more components configured to: receive, from an extended reality (XR) device, an indication of items to be purchased (¶5: “An AR system may comprise a wearable AR device that is configured to capture information associated with an item for sale in a retail location or store. The wearable AR device (ARD) may be configured to monitor the movement and/or location of the particular item as the shopper moves around the store. When one or more predetermined criteria are met, the item may be designated as “carried” or “purchased” …When the user selects a purchase option on the AR device while viewing or carrying the item, information associated with the item may be relayed to a remote server that may process the purchase transaction …a user's ARD may maintain a database that stores information about the items that are carried or purchased by the user, and the information in the database may be used to prompt the user to pay for unpurchased items carried by the user prior to leaving the retail location. An AR system may optionally provide information about purchasing alternatives to a shopper who is about to purchase an item or product (hereafter “the target”) in a physical retail location”; ¶33: “The AR system may also access a shopping list maintained by the user and cross-reference the items on the list with the product inventory of the retailer, and remind the user that they may wish to consider purchasing an item that is on their shopping list at that particular retail location”; ¶65: “FIGS. 5B and 5C depict another way in which an AR system may present an offer to a user. As depicted there, the ARD 500 may display an offer 512 above the shopping cart 510, where the offer 512 may provide suggested purchases based on the user's shopping list. 
The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”) identify one or more physical retail stores that carry the items; (¶6: “a system for presenting purchase offers to a user may comprise an augmented reality (AR) device configured to identify a target product being considered by a user for purchase and to identify the price of the target product at a retail location, and a remote server”; ¶42: “When the user is at a particular retail location, the user may receive offers from the ONS that originate from an EP that is a merchant in competition with the merchant of the particular retail location. Alternatively, or additionally, the user may receive an offer from the ONS that originates from an EP that is affiliated with the merchant at that particular retail location. In some variations, the ONS may display a ranked list of products in the same category as the target product, where the ranking is determined by user reviews aggregated by the ONS or an EP. Optionally, the ONS may have information stored regarding the inventory of the target product and the suggested products, as well as the location of the inventory. 
For example, the ONS may have inventory data indicating that the target product and/or suggested products are in stock at an alternate location near the user, and may offer to sell the same or similar product at the alternate location”) the XR device comprising one or more components configured to: receive, from the server, the indication of the one or more physical retail stores (¶6: “a system for presenting purchase offers to a user may comprise an augmented reality (AR) device configured to identify a target product being considered by a user for purchase and to identify the price of the target product at a retail location, and a remote server”; ¶41: “the ONS may have information stored regarding the inventory of the target product and the suggested products, as well as the location of the inventory. For example, the ONS may have inventory data indicating that the target product and/or suggested products are in stock at an alternate location near the user, and may offer to sell the same or similar product at the alternate location”; ¶42: “When the user is at a particular retail location, the user may receive offers from the ONS that originate from an EP that is a merchant in competition with the merchant of the particular retail location. Alternatively, or additionally, the user may receive an offer from the ONS that originates from an EP that is affiliated with the merchant at that particular retail location. In some variations, the ONS may display a ranked list of products in the same category as the target product, where the ranking is determined by user reviews aggregated by the ONS or an EP. Optionally, the ONS may have information stored regarding the inventory of the target product and the suggested products, as well as the location of the inventory.
For example, the ONS may have inventory data indicating that the target product and/or suggested products are in stock at an alternate location near the user, and may offer to sell the same or similar product at the alternate location”) provide, via the interface, a list of the one or more physical retail stores, wherein the one or more physical retail stores are ranked based on one or more factors (¶41: “the ONS may store information regarding the features of the product being considered by the user, pricing of the same product at the retail location where the user is currently located, at other nearby retail locations, and/or at an online retailer, and provide such information to an ARD… In some variations, the ONS may display a ranked list of products in the same category as the target product, where the ranking is determined by user reviews aggregated by the ONS or an EP. Optionally, the ONS may have information stored regarding the inventory of the target product and the suggested products, as well as the location of the inventory. For example, the ONS may have inventory data indicating that the target product and/or suggested products are in stock at an alternate location near the user, and may offer to sell the same or similar product at the alternate location. 
If the ONS determines that the user is interested in a product that is out of stock at the current retail location, then information may be provided about ordering, restocking dates, or discounts or other incentives offered to the customer willing to pick up the product, or have it shipped to them, at a later date”) detect that the XR device is within the physical retail store, based on a geographic location associated with the XR device; (¶10: “The ARD may comprise a proximity detector, a motion detector, a position detector, and a computation component in communication with the proximity detector, motion detector and position detector, where the computation component may be configured to determine whether a wearer is in possession of an item using data from the proximity detector and the motion detector and to generate and transmit a signal to the control server, where the signal may indicate the identity of the item and whether the wearer is in possession of the item. The proximity detector may comprise at least one of a RFID reader, a camera, and a scanner. The motion detector may comprise a location estimator configured to detect a change of location of the user device. The position detector may comprise a global positioning system or a wireless based location determining system”; ¶33: “the AR system may detect that the user is geographically co-located with the physical address of a retail location, and that the user has slowed or stopped their walking pace.”;¶36: “An ARD worn by a shopper may detect (e.g., based on data from the motion detector and/or location sensor) when the shopper enters a retail location having a plurality of products for purchase. 
The products may be items or objects that are physically located in the retail location or store, and/or may be services that may be purchased and utilized at a time specified by the user (e.g., gift certificates, massage, salon, automobile services and the like)”; ¶47: “A user ARD may transmit information to the MIS, for example, information regarding the location and identity of the user ARD (which may correspond to the location and identity of the user), a list of items associated with the user, and if the user is currently at a retail location, a store ID (or merchant ID) corresponding to the merchant at that retail location”; ¶65: “As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”; ¶66: “an AR system may be able to present suggested products and services for purchase by a user as they stop or slow down near to the retail location. That is, the user need not enter the retail location before the AR system, but certain cues (e.g., slowing pace, gazing towards the storefront, pointing to the store, whether the time of day is approaching a meal time, etc.) may indicate that the user is considering whether to purchase the goods or services sold at that venue”) detect, by a camera of the XR device, a store aisle of the one or more store aisles (¶10: “the system comprising a wearable augmented reality device (ARD) having wireless communication capability and a control server in wireless communication with the ARD. 
The ARD may comprise a proximity detector, a motion detector, a position detector, and a computation component in communication with the proximity detector, motion detector and position detector, where the computation component may be configured to determine whether a wearer is in possession of an item using data from the proximity detector and the motion detector and to generate and transmit a signal to the control server, where the signal may indicate the identity of the item and whether the wearer is in possession of the item. The proximity detector may comprise at least one of a RFID reader, a camera, and a scanner”; ¶107: “the capturing device may include a handheld device (e.g., a smartphone, a tablet, a mobile station, a personal digital assistant, a laptop, and more) or a wearable device (e.g., smart glasses, a smartwatch, a clip-on camera)”; ¶141: “FIG. 4A illustrates how an aisle 400 of retail store 105 may be imaged using a plurality of capturing devices 125 fixedly connected to store shelves. FIG. 4B illustrates how aisle 400 of retail store 105 may be imaged using a handheld communication device. FIG. 4C illustrates how aisle 400 of retail store 105 may be imaged by robotic devices equipped with cameras”) determining, using object recognition employed by the camera of the XR device, a current location of the XR device within the physical retail store based on the store aisle (¶10: “the system comprising a wearable augmented reality device (ARD) having wireless communication capability and a control server in wireless communication with the ARD. 
The ARD may comprise a proximity detector, a motion detector, a position detector, and a computation component in communication with the proximity detector, motion detector and position detector, where the computation component may be configured to determine whether a wearer is in possession of an item using data from the proximity detector and the motion detector and to generate and transmit a signal to the control server, where the signal may indicate the identity of the item and whether the wearer is in possession of the item. The proximity detector may comprise at least one of a RFID reader, a camera, and a scanner”; ¶32: “the ARD 210 may comprise, for example, a bar code reader, a quick response (QR) code reader, a scanner, a camera, and/or any other types of information readers”; ¶33: “the computation component may be programmed with object recognition software that is able identify objects in the user's environment that are available for purchase … the computation component of the ARD may have a first computer-implemented method that recognizes the objects within the user's geographical location that are available for purchase … the AR system may detect that the user is geographically co-located with the physical address of a retail location, and that the user has slowed or stopped their walking pace. The AR system may also access a shopping list maintained by the user and cross-reference the items on the list with the product inventory of the retailer, and remind the user that they may wish to consider purchasing an item that is on their shopping list at that particular retail location”; ¶34: “The ARD may be configured to recognize the product that is being viewed by the user. It may do this through the operation of computer vision algorithms, RFID, bar code scanning, or any such object recognition mechanisms, and may do so automatically, or only when an action is initiated by the shopper. 
In some variations, an ARD (e.g., ARD 210) may detect the presence of an item based on data from a proximity sensor and/or scanner (e.g., RFID scanner) of the ARD, and based on relative movement between the user and the item, the system may determine whether the user is interested in the item”; ¶36: “an ARD may be able to detect and identify a product by image recognition and processing methods”; ¶107: “the capturing device may include a handheld device (e.g., a smartphone, a tablet, a mobile station, a personal digital assistant, a laptop, and more) or a wearable device (e.g., smart glasses, a smartwatch, a clip-on camera)”; ¶113: “image processing unit 130 may use any suitable image analysis technique including, for example, object recognition, object detection, image segmentation, feature extraction, optical character recognition (OCR), object-based image analysis, shape region techniques, edge detection techniques, pixel-based detection, artificial neural networks, convolutional neural networks, etc”; ¶141: “FIG. 4A illustrates how an aisle 400 of retail store 105 may be imaged using a plurality of capturing devices 125 fixedly connected to store shelves. FIG. 4B illustrates how aisle 400 of retail store 105 may be imaged using a handheld communication device. FIG. 4C illustrates how aisle 400 of retail store 105 may be imaged by robotic devices equipped with cameras”) Applicant’s disclosure teaches at ¶36: “The current location may be associated with the XR device 140 and/or the user wearing (or carrying) the XR device 140. The XR device 140, based on object recognition or other related techniques employed by the camera, may determine that the user is in a particular area of the physical retail store (e.g., an entryway or a particular aisle). The XR device 140 may detect an entryway sign, aisle sign numbers, etc. using object recognition, which may enable the XR device 140 to determine the current location within the physical retail store”. 
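As an illustration of the localization step described in applicant's ¶36, the following sketch (hypothetical store layout and function names, not drawn from either cited reference) shows how an aisle or entryway sign recognized by the camera could be matched against downloaded store mapping information to fix the device's current in-store location:

```python
# Hypothetical sketch: recognized signage labels are looked up in downloaded
# store mapping information to determine the current in-store location.
# The layout and names below are illustrative only.

STORE_MAP = {
    # aisle identifier recognized from signage -> (x, y) floor coordinates
    "ENTRYWAY": (0, 0),
    "AISLE 1": (10, 5),
    "AISLE 2": (10, 15),
    "AISLE 3": (10, 25),
}

def locate_from_signage(recognized_labels):
    """Return the first known location matching a recognized sign label."""
    for label in recognized_labels:
        key = label.strip().upper()
        if key in STORE_MAP:
            return key, STORE_MAP[key]
    return None  # no recognizable signage; fall back to other sensors

# e.g. object recognition might yield several candidate labels per frame:
print(locate_from_signage(["sale banner", "Aisle 2"]))  # ('AISLE 2', (10, 15))
```

In this reading, the camera's object-recognition output supplies only text labels; the downloaded map, not any network call, resolves them to a position, consistent with the claimed localized determination.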
Examiner interprets that applicant’s XR device is determining the current location of the user within the retail store. Further, giving the claim limitation its broadest reasonable interpretation in light of the specification, Examiner interprets at least the wearable augmented reality device (ARD) and/or capturing devices including at least a camera and/or any other types of information readers and computation component programmed with object recognition software for identifying and recognizing objects in the user’s environment/geographical location and detecting user location in a retail environment as taught by Kaehler as teaching applicant’s limitation, “determining, using object recognition employed by a camera of the XR device, a current location of the XR device within the physical retail store”.

While Kaehler discloses all of the above limitations, Kaehler does not distinctly describe the following limitation. Bronicki, however, discloses, download, from a server and prior to arriving at a physical retail store of the one or more physical retail stores, store mapping information associated with the physical retail store, wherein the store mapping information indicates a map of one or more store aisles of the physical retail store (¶240: “Database 1305 may store information and data for the components of system 1300 (e.g., server 1301, user devices 1302, and/or one or more cameras 1303). In some embodiments, server 1301, user devices 1302, and/or one or more cameras 1303 may be configured to access database 1305, and obtain data stored from and/or upload data to database 1305 via digital communication network 1304. Database 1305 may include a cloud-based database or an on-premises database.
Database 1305 may include images captured by one or more cameras 1303, simulated images generated by server 1301 and/or user device 1302, configuration data, expression data, datasets, model data (e.g., model parameters, training criteria, performance metrics, etc.), and/or other data, consistent with disclosed embodiments”) determining, using localized processing of the XR device and based on the items and the current location, a next location (¶19: “after providing the first navigation data, receiving a second indoor location of the user within the retail store; determining that the second indoor location is within a selected area around the target destination, the selected area not including the first indoor location; and in response to the determination that the second indoor location is within the selected area around the target destination, providing second navigation data to the user through a second visual interface, the second visual interface differing from the first visual interface”; Fig 24, Fig 25, ¶373: “a user device may determine its location within a retail store by using electromagnetic signals, such as GPS signals, Wi-Fi signals, or signals from an indoor localization system. In some embodiments, map 2404 may also include a route indicator 2410, which may indicate a route (e.g., a walking route) through a portion of the retail store. For example, route indicator 2410 may include one or more lines that visually connect user location indicator 2408 to a destination indicator 2412. In some embodiments, route indicator 2410 may include distance information, such as text that denotes a distance between two points along a user's route (e.g., 5 meters, 50 meters, etc.). 
A destination indicator 2412 may indicate or correspond to a user's destination, which may be particular shelf and/or product location within the retail store”) provide, via the interface, an in-store navigation path to direct a user of the XR device via overlayed audio-visual cues from the current location to the next location (¶19: “a method for providing visual navigation assistance in retail stores may comprise receiving a first indoor location of a user within a retail store; receiving a target destination within the retail store; providing first navigation data to the user through a first visual interface… in response to the determination that the second indoor location is within the selected area around the target destination, providing second navigation data to the user through a second visual interface, the second visual interface differing from the first visual interface”; Fig 1, Fig 2, ¶119: “The term “output device” is intended to include all possible types of devices capable of outputting information from server 135 to users or other computer systems (e.g., a display screen, a speaker, a desktop computer, a laptop computer, mobile device, tablet, a PDA, etc.”; ¶125: “audio controller 214 and speaker 222 may facilitate audio output from server 135 In addition to or instead of touch screen 218, I/O system 210 may include a display screen (e.g., CRT, LCD, etc.), virtual reality device, augmented reality device, and so forth. Specifically, touch screen controller 212 (or display screen controller) and touch screen 218 (or any of the alternatives mentioned above) may facilitate visual output from server 135. Audio controller 214 may be coupled to a microphone 220 and a speaker 222 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions. 
Specifically, audio controller 214 and speaker 222 may facilitate audio output from server 135”; Fig 24, Fig 25, ¶373: “a user device may determine its location within a retail store by using electromagnetic signals, such as GPS signals, Wi-Fi signals, or signals from an indoor localization system. In some embodiments, map 2404 may also include a route indicator 2410, which may indicate a route (e.g., a walking route) through a portion of the retail store. For example, route indicator 2410 may include one or more lines that visually connect user location indicator 2408 to a destination indicator 2412. In some embodiments, route indicator 2410 may include distance information, such as text that denotes a distance between two points along a user's route (e.g., 5 meters, 50 meters, etc.). A destination indicator 2412 may indicate or correspond to a user's destination, which may be particular shelf and/or product location within the retail store.”; ¶375: “navigation assistance augmented-reality-view user interface 2500. In some embodiments, capturing device 125, output device 145, or any other device may display user interface 2500”; ¶376: “User interface 2500 may include an augmented reality display area 2502, which may include an image (e.g., from a video stream) of a user's environment and/or visual overlays. For example, a device, such as capturing device 125 or output device 145, may capture one or more images using a camera, and may integrate at least portions of the one or more images into augmented reality display area 2502, such as by displaying an image and placing one or more visual overlays on the image”; ¶377: “augmented reality display area 2502 may include a destination overlay indicator 2506, which may be a symbol, icon, outline, color, image distortion, or other visual indicator associated with a user's destination. In some embodiments, destination overlay indicator 2506 may correspond to a shelf and/or product. 
For example, destination overlay indicator 2506 may correspond to a product selected by a user at a user interface other than user interface 2500, to a product from a shopping list corresponding to the user, to a product associated with a coupon corresponding to the user, to an area (e.g., a shelf) and/or a product corresponding to a task assigned to the user, and so forth… In some embodiments, destination overlay indicator 2506 and/or route overlay indicator 2508 may have a partial degree of transparency, which may allow a user to view a portion of an environment covered by a visual overlay, while still being able to understand information conveyed by the visual overlay. Any number of destination overlay indicators 2506 and/or route overlay indicators 2508, as well as other types of overlay indicators, may be placed within augmented reality display area 2502”; ¶379: “map 2404 may be presented using an augmented reality system (such as augmented reality glasses), for example with a high opacity parameter. Additionally, or alternatively to user interface 2500, destination overlay indicator 2506 and/or route overlay indicator 2508 may be presented using an augmented reality system (such as augmented reality glasses). For example, destination overlay indicator 2506 may be presented in a location adjacent to or over the location of a selected item (such as a product, a shelf, a label, etc.), route overlay indicator 2508 may be presented in a location indicative of the route, and so forth. 
In some examples, when a head of a user wearing augmented reality glasses moves (or other part of a user associated with angling an augmented reality device), the location of the destination overlay indicator 2506 and/or route overlay indicator 2508 moves to maintain the relative location of the overlays to selected items in the environment of the user, for example to maintain the location of destination overlay indicator 2506 in a location adjacent to or over the location of the selected item, to keep the location of route overlay indicator 2508 in the location indicative of the route”; ¶388: “the second visual interface may differ from the first visual interface. For example, the second visual interface may be an augmented reality visual interface, which may provide local navigation assistance information to a user (e.g., navigation assistance augmented-reality-view user interface 2500) and the first visual interface may be a map view of the retail store, which may provide a map-like view of an area to a user (e.g., navigation assistance map-view user interface 2400). In some embodiments, process 2600 may include receiving, from a mobile device of the user, at least one image of a physical environment of the user. The physical environment may correspond to a retail store, such that the image may include an aisle, a shelving unit, a product, or any other store structure or item”; ¶532-¶537; ¶536: “server 135 may select a retail store among the plurality of retail stores based on the indication of the external assignment. The indication of the external assignment may include a transportation assignment associated with a particular location, and the selected retail store may be determined based on the particular location, as discussed previously by reference to FIGS. 40A and 40B. 
For instance, the selected retail store may be determined based on a distance from the particular location to the selected retail store”; ¶537: “Server 135 may analyze the travel route and select the retail store based on a measure of a detour to the selected retail store when traveling from the first location to the second location. Server 135 may iteratively, for each of the plurality of retail stores, determine an optimal route and measure an expected detour time or distance to each of the plurality of retail stores”)

Kaehler discloses an augmented reality (AR) system that provides information about purchasing alternatives to a user who is about to purchase an item or product (e.g., a target product) in a physical retail location. Kaehler teaches that the AR system may comprise a control system or server that is configured to be in communication with one or more augmented reality devices (ARD). The one or more AR devices may comprise one or more ARDs worn by one or more users or shoppers at a retail location to wirelessly communicate with the control server to procure data related to a product being considered for purchase, and present this data to the user to aid in their purchase decision. Bronicki teaches systems, methods, and devices for identifying products in retail stores, capturing, collecting, and automatically analyzing images of products via an augmented reality system. Bronicki also teaches a navigation assistance augmented-reality view user interface including a map-like view of an area to a user. Kaehler and Bronicki are directed to the same endeavor, namely identifying consumer products for purchase in retail stores and providing navigational assistance in a computing environment.
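To make the mapped functionality concrete, the following minimal sketch (hypothetical aisle layout, item locations, and names; not code from either reference) computes an in-store navigation path from a current location to the next item's location using only a previously downloaded store map, i.e., localized processing on the device:

```python
# Illustrative sketch: shortest aisle-to-aisle path over an adjacency graph
# taken from downloaded store mapping information. Layout is hypothetical.
from collections import deque

AISLE_GRAPH = {  # adjacency of store areas from the downloaded map
    "ENTRYWAY": ["AISLE 1"],
    "AISLE 1": ["ENTRYWAY", "AISLE 2"],
    "AISLE 2": ["AISLE 1", "AISLE 3"],
    "AISLE 3": ["AISLE 2"],
}

ITEM_LOCATIONS = {"milk": "AISLE 3", "bread": "AISLE 1"}

def navigation_path(current, item):
    """Breadth-first search for the shortest path to the item's aisle."""
    target = ITEM_LOCATIONS[item]
    queue = deque([[current]])
    seen = {current}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in AISLE_GRAPH[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # item location unreachable from current position

print(navigation_path("ENTRYWAY", "milk"))
# ['ENTRYWAY', 'AISLE 1', 'AISLE 2', 'AISLE 3']
```

The resulting waypoint sequence is what would be rendered as the overlaid route and destination indicators in an augmented reality view.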
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the augmented reality (AR) system of Kaehler with the techniques for navigating a walkable environment as taught by Bronicki since it allows for identifying a physical environment corresponding to retail stores, and providing an augmented reality display/view/map and user interface of a user’s environment with visual overlays (Fig 1, Fig 2, Fig 24, Fig 25, ¶19, ¶119, ¶125, ¶371-¶379, ¶388, ¶523-¶537).

With respect to claims 2 and 22, Kaehler and Bronicki disclose all of the above limitations. Kaehler further discloses, wherein the one or more components are configured to: determine a category associated with the items; and identify the one or more physical retail stores based on the category associated with the items, wherein the one or more physical retail stores offer items for sale that correspond to the category (¶7: “Target product data may comprise product category data. In some variations, the alternate product may be the same as the target product and/or may be in the same product category as the target product. Displaying the purchase offer may comprise displaying a graphic representing the alternate product and the alternate product offer price”; ¶42: “The ONS may also present offers provided by EPs to the user. When the user is at a particular retail location, the user may receive offers from the ONS that originate from an EP that is a merchant in competition with the merchant of the particular retail location. Alternatively, or additionally, the user may receive an offer from the ONS that originates from an EP that is affiliated with the merchant at that particular retail location. In some variations, the ONS may display a ranked list of products in the same category as the target product, where the ranking is determined by user reviews aggregated by the ONS or an EP.
Optionally, the ONS may have information stored regarding the inventory of the target product and the suggested products, as well as the location of the inventory. For example, the ONS may have inventory data indicating that the target product and/or suggested products are in stock at an alternate location near the user, and may offer to sell the same or similar product at the alternate location”; claim 7: “…wherein target product data comprises product category data”)

With respect to claim 3, Kaehler and Bronicki disclose all of the above limitations. Kaehler further discloses, wherein the one or more components are configured to: determine a category associated with the items; and determine one or more recommended items associated with the category, wherein the one or more recommended items are aligned with a set of preferences indicated by the user; and determine that the one or more physical retail stores carry the one or more recommended items (¶5: “Information about purchasing alternatives may be provided by the merchant and/or competitors to that merchant. Information provided by the merchant may include recommendations for a similar product that better suits the needs of the shopper and/or related or correlated products that are associated with the target, where the recommended products are sold by the merchant. The recommended products may be sold at the physical retail location, and/or on the merchant's website”; The ONS may then sort 312 the generated offers based on a variety of parameters, for example, paid sponsorship of a particular OS by a particular EP, by lowest or highest price, by earliest or latest delivery dates, and/or by any user-specified preferences. For example, users may wish to see offers from only a certain type of EP (e.g., eco-friendly EPs, fair-trade, EPs, etc.), and/or may wish to block offers from other EPs (e.g., EPs that have spammed them in the past, EPs that the user wishes to boycott for personal reasons, etc.).
Users may also wish to see only certain types of offers, for example, offers for the target at a lower price and with immediate delivery or same-day acquisition (and not offers that can provide a lower price but delayed delivery). Alternatively, or additionally, offers may be sorted according to value criteria, which may indicate that certain types of offers may be more interesting to the user than other types of offers. The value criteria may be selected by the user and/or generated by a computer-implemented method based on prior actions of the user. For example, offers may be sorted by price, and/or relevance, and/or average user rating, etc., and the user may select the manner in which the offers are sorted. In some variations, the offers may be sorted and presented to the user according to how the user has preferred to see the offers in the past, and/or their demographic information”; ¶65: “ARD 500 may display an offer 512 above the shopping cart 510, where the offer 512 may provide suggested purchases based on the user's shopping list. The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”; ¶67: “shopper data (e.g., demographic data, personal shopping lists, purchase history and patterns, budget and/or financial profile, etc.). 
Such data is cross-indexed so that merchants can collect data analytics (subject to the privacy setting approved by the user) on the types of shoppers they attract and products sold, and shoppers can compare prices across different merchants and select and/or purchase items or services within their desired budget and in accordance with their needs. This information may also be used by the ONS and/or an EP to dynamically formulate offers that may be presented to the user on their ARD as the user peruses the products at a retail location”; ¶71: “Various types of information about the user may be shared from their ARD with the ONS and/or the EPs. For example, demographic information (e.g., age, gender, and ethnicity), purchase habits, history and preferences, shopping lists and the like may be shared with the ONS and/or the EPs”). Bronicki further discloses, provide, via the interface, the in-store navigation path to direct the user to locations within the physical retail store at which the one or more recommended items are held (Fig 24, Fig 25, ¶373: “a user device may determine its location within a retail store by using electromagnetic signals, such as GPS signals, Wi-Fi signals, or signals from an indoor localization system. In some embodiments, map 2404 may also include a route indicator 2410, which may indicate a route (e.g., a walking route) through a portion of the retail store. For example, route indicator 2410 may include one or more lines that visually connect user location indicator 2408 to a destination indicator 2412. In some embodiments, route indicator 2410 may include distance information, such as text that denotes a distance between two points along a user's route (e.g., 5 meters, 50 meters, etc.). A destination indicator 2412 may indicate or correspond to a user's destination, which may be particular shelf and/or product location within the retail store.”; ¶375: “navigation assistance augmented-reality-view user interface 2500. 
In some embodiments, capturing device 125, output device 145, or any other device may display user interface 2500”; ¶376: “User interface 2500 may include an augmented reality display area 2502, which may include an image (e.g., from a video stream) of a user's environment and/or visual overlays. For example, a device, such as capturing device 125 or output device 145, may capture one or more images using a camera, and may integrate at least portions of the one or more images into augmented reality display area 2502, such as by displaying an image and placing one or more visual overlays on the image”; ¶377: “augmented reality display area 2502 may include a destination overlay indicator 2506, which may be a symbol, icon, outline, color, image distortion, or other visual indicator associated with a user's destination. In some embodiments, destination overlay indicator 2506 may correspond to a shelf and/or product. For example, destination overlay indicator 2506 may correspond to a product selected by a user at a user interface other than user interface 2500, to a product from a shopping list corresponding to the user, to a product associated with a coupon corresponding to the user, to an area (e.g., a shelf) and/or a product corresponding to a task assigned to the user, and so forth… In some embodiments, destination overlay indicator 2506 and/or route overlay indicator 2508 may have a partial degree of transparency, which may allow a user to view a portion of an environment covered by a visual overlay, while still being able to understand information conveyed by the visual overlay. Any number of destination overlay indicators 2506 and/or route overlay indicators 2508, as well as other types of overlay indicators, may be placed within augmented reality display area 2502”; ¶378: “map 2404 may be presented using an augmented reality system (such as augmented reality glasses), for example with a high opacity parameter. 
Additionally, or alternatively to user interface 2500, destination overlay indicator 2506 and/or route overlay indicator 2508 may be presented using an augmented reality system (such as augmented reality glasses). For example, destination overlay indicator 2506 may be presented in a location adjacent to or over the location of a selected item (such as a product, a shelf, a label, etc.), route overlay indicator 2508 may be presented in a location indicative of the route, and so forth. In some examples, when a head of a user wearing augmented reality glasses moves (or other part of a user associated with angling an augmented reality device), the location of the destination overlay indicator 2506 and/or route overlay indicator 2508 moves to maintain the relative location of the overlays to selected items in the environment of the user, for example to maintain the location of destination overlay indicator 2506 in a location adjacent to or over the location of the selected item, to keep the location of route overlay indicator 2508 in the location indicative of the route”; ¶379: “user interface 2400, map 2404 may be presented using an augmented reality system (such as augmented reality glasses), for example with a high opacity parameter. Additionally, or alternatively to user interface 2500, destination overlay indicator 2506 and/or route overlay indicator 2508 may be presented using an augmented reality system (such as augmented reality glasses)”; ¶388: “the second visual interface may differ from the first visual interface. For example, the second visual interface may be an augmented reality visual interface, which may provide local navigation assistance information to a user (e.g., navigation assistance augmented-reality-view user interface 2500) and the first visual interface may be a map view of the retail store, which may provide a map-like view of an area to a user (e.g., navigation assistance map-view user interface 2400). 
In some embodiments, process 2600 may include receiving, from a mobile device of the user, at least one image of a physical environment of the user. The physical environment may correspond to a retail store, such that the image may include an aisle, a shelving unit, a product, or any other store structure or item”). Kaehler and Bronicki are directed to the same endeavor since they are directed to identifying consumer products for purchase in retail stores and providing navigational assistance in a computing environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the augmented reality (AR) system of Kaehler with the techniques for identifying products in retail stores as taught by Bronicki since it allows for identifying a physical environment corresponding to retail stores, whereby a user’s destination corresponding to a shelf and/or product on a user’s shopping list may be displayed via the navigation assistance augmented-reality-view user interface (Fig 24, Fig 25, ¶373-¶379). With respect to claims 4 and 23, Kaehler and Bronicki disclose all of the above limitations. Kaehler further discloses, wherein the one or more components are configured to: identify the one or more physical retail stores based on user profile information and real-time item inventory information (¶26: “FIG. 1B depicts one variation of an AR system 110 with examples of the types of data and information transmitted between the ARD 112, ONS 114 of a control system or server and an EP 118… ARD may create a data packet or data structure called a Transaction Interval (TI). 
The TI object 116 may comprise identification information about the target being considered, as well as price data, merchant data, and the like… The information contained in the TI may be updated throughout the duration of the negotiation interval, for example, to indicate any real-time price changes”; ¶67: “shopper data (e.g., demographic data, personal shopping lists, purchase history and patterns, budget and/or financial profile, etc… shoppers can compare prices across different merchants and select and/or purchase items or services within their desired budget and in accordance with their needs. This information may also be used by the ONS and/or an EP to dynamically formulate offers that may be presented to the user on their ARD as the user peruses the products at a retail location”; ¶71: “Various types of information about the user may be shared from their ARD with the ONS and/or the EPs. For example, demographic information (e.g., age, gender, and ethnicity), purchase habits, history and preferences, shopping lists and the like may be shared with the ONS and/or the EPs”), wherein the user profile information indicates one or more of: a base location associated with the user and a maximum distance that the user is willing to travel, a cost budget, past physical retail stores visited by the user, or user reviews of physical retail stores (¶67: “shopper data (e.g., demographic data, personal shopping lists, purchase history and patterns, budget and/or financial profile, etc… shoppers can compare prices across different merchants and select and/or purchase items or services within their desired budget and in accordance with their needs. 
This information may also be used by the ONS and/or an EP to dynamically formulate offers that may be presented to the user on their ARD as the user peruses the products at a retail location”). With respect to claim 6, Kaehler and Bronicki disclose all of the above limitations. Kaehler further discloses, wherein the one or more components are configured to: determine additional information associated with the items to be purchased, wherein the additional information includes an expected shopping duration, a distance associated with traveling to the one or more physical retail stores, and a cost associated with the items; and provide, via the interface, the additional information (¶40: “transmitting a signal from a RFID reader to the user's ARD or AR system control server that identifies the item, determining whether that item has been purchased by retrieving the status of the item in the ARD or AR system control server database, and then generating a notification to the user ARD to that prompts the user to initiate the purchase process… Once an item has been purchased, data regarding the details of the purchase (e.g., item identification, purchase price, time, and any user-specific data that has been released by the user) may be sent to the MIS to update an inventory database”). With respect to claim 7, Kaehler and Bronicki disclose all of the above limitations. Kaehler further discloses, wherein the one or more components are configured to: detect, via the camera of the XR device, that one of the items is added to a shopping cart; (¶32: “the ARD 210 may comprise, for example, a bar code reader, a quick response (QR) code reader, a scanner, a camera, and/or any other types of information readers”; ¶49: “FIG. 6B is one variation of a database that contains the specific instance(s) of item(s) in the possession of a user (i.e., item is not on the shelf and/or is carried by a user and/or is purchased by a user), the status of the item(s), and the location of the item(s). 
FIG. 6C is another variation of a database that tracks the specific instance(s) of item(s) in the possession of a user, for example, item instance ID, customer ID, and location data. …The databases of FIGS. 6B and 6C may each comprise an array or hash table of individual item data structures that are each instantiated when a particular item is in the possession of a user (e.g., picked up from the shelf and put into the user's cart, as detected by the user ARD and communicated to the control server/MIS)”) provide, via the interface, the in-store navigation path to direct the user to a next location within the physical retail store, wherein the next location is associated with a next item included in the items or a checkout location in the physical retail store (¶51: “the list of carried items (their item IDs or just the fact that unpurchased items are identified as “carried”) may be continuously (or periodically) presented on the user ARD as he/she continues to browse the retail location. The user ARD may also provide a checkout option to the shopper so that the shopper can complete the purchase of one or more of the carried item at any time while still in the store”; ¶52: “when the tag associated with an item is read, the user ARD may prompt the shopper to purchase the item. For example, the user ARD may present on a display a purchase screen that allows a user to select the item for purchase, and/or may display representations of all the items that are in the possession of the user that have not yet been purchased. The user may then select and/or authorize the payment of one or more of the items displayed by the ARD”; ¶65: “The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. 
As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”). With respect to claim 8, Kaehler and Bronicki disclose all of the above limitations. Kaehler further discloses, wherein the one or more components are configured to: detect, using the camera of the XR device, a gaze associated with the user; (¶7: “identifying the target product using the AR device may comprise detecting that the user is interested in the target product. For example, the AR device may determine, based on the direction of the user's head as measured by a motion sensor or orientation sensor and/or eye-tracking sensors, that the user is looking at a target product (e.g., the duration of user gaze on one product is relatively longer than the gaze on other products). Alternatively, or additionally, image sensors on the AR device may detect that the user has physically engaged with the target product, for example, by grasping or holding it in the field-of-view of the AR device image sensors”) identify, based on an object recognition, an item for sale in the physical retail store that is aligned with the gaze associated with the user; determine that the item corresponds to a category associated with the items; and provide, via the interface, a suggestion to purchase the item (¶7: “The AR device may be configured to do this by acquiring, using an image sensor of the AR device, an image of the user and executing instructions stored on computer-readable media of the augmented reality device that recognizes visual features in the image that indicate close proximity between the target product and the user. Target product data may comprise product category data. 
In some variations, the alternate product may be the same as the target product and/or may be in the same product category as the target product. Displaying the purchase offer may comprise displaying a graphic representing the alternate product and the alternate product offer price. A method for presenting purchase offers to a user may also comprise transmitting a signal from the AR device to the remote server indicating whether the user has accepted the purchase offer”; ¶9: “Another variations of a method of presenting purchase offers to a user may comprise identifying, using an augmented reality (AR) device, a target product that is being considered for purchase by a user, transmitting target product data from the augmented reality device to a remote server, wherein target product data comprises target product identification data and target product price, and executing a computer-implemented method on the remote server to generate a purchase offer. In some variations, the computer-implemented method may comprise identifying, using the product identification data, a purchase offer data structure having an alternate product and an alternate product offer price, comparing the target product price and the alternate product offer price, transmitting the purchase offer data structure to the augmented reality device if the alternate product offer price is less than the target product price, and displaying the purchase offer from the remote server to the user via the augmented reality device”; ¶33: “The computation component of the ARD may be configured to recognize cues and commands from the user that indicate the beginning and/or end of the negotiation interval. 
For example, the computation component may be programmed with object recognition software that is able identify objects in the user's environment that are available for purchase”) determining that the item corresponds to the theme associated with the list of items; and providing, via the interface, a suggestion to purchase the item (¶5: “An AR system may optionally provide information about purchasing alternatives to a shopper who is about to purchase an item or product (hereafter “the target”) in a physical retail location. Information about purchasing alternatives may be provided by the merchant and/or competitors to that merchant. Information provided by the merchant may include recommendations for a similar product that better suits the needs of the shopper and/or related or correlated products that are associated with the target, where the recommended products are sold by the merchant. The recommended products may be sold at the physical retail location, and/or on the merchant's website. They may include incentives to purchase from the retailer based on time, location, inventory, or facts about that particular customer”; ¶7: “Target product data may comprise product category data. In some variations, the alternate product may be the same as the target product and/or may be in the same product category as the target product. Displaying the purchase offer may comprise displaying a graphic representing the alternate product and the alternate product offer price. 
A method for presenting purchase offers to a user may also comprise transmitting a signal from the AR device to the remote server indicating whether the user has accepted the purchase offer”; ¶9: “Another variations of a method of presenting purchase offers to a user may comprise identifying, using an augmented reality (AR) device, a target product that is being considered for purchase by a user, transmitting target product data from the augmented reality device to a remote server, wherein target product data comprises target product identification data and target product price, and executing a computer-implemented method on the remote server to generate a purchase offer. In some variations, the computer-implemented method may comprise identifying, using the product identification data, a purchase offer data structure having an alternate product and an alternate product offer price, comparing the target product price and the alternate product offer price, transmitting the purchase offer data structure to the augmented reality device if the alternate product offer price is less than the target product price, and displaying the purchase offer from the remote server to the user via the augmented reality device”; ¶33: “The computation component of the ARD may be configured to recognize cues and commands from the user that indicate the beginning and/or end of the negotiation interval. 
For example, the computation component may be programmed with object recognition software that is able identify objects in the user's environment that are available for purchase”). With respect to claim 9, Kaehler and Bronicki disclose all of the above limitations. Kaehler further discloses, wherein the one or more components are configured to: detect, using a sensor of the XR device, an attention-based behavior of the user while the user is in a certain area of the physical retail store; (¶7: “the AR device may determine, based on the direction of the user's head as measured by a motion sensor or orientation sensor and/or eye-tracking sensors, that the user is looking at a target product (e.g., the duration of user gaze on one product is relatively longer than the gaze on other products). Alternatively, or additionally, image sensors on the AR device may detect that the user has physically engaged with the target product, for example, by grasping or holding it in the field-of-view of the AR device image sensors. The AR device may be configured to do this by acquiring, using an image sensor of the AR device, an image of the user and executing instructions stored on computer-readable media of the augmented reality device that recognizes visual features in the image that indicate close proximity between the target product and the user. Target product data may comprise product category data”; ¶66: “FIGS. 5D and 5E depict another variation of offers that may be presented to a user, for example, at a restaurant 520. The user may select an entrée (FIG. 5D) and the AR system may suggest a wine, appetizer, and/or side dishes that complement the main entrée (FIG. 5E). In some variations, an AR system may be able to present suggested products and services for purchase by a user as they stop or slow down near to the retail location. 
That is, the user need not enter the retail location before the AR system, but certain cues (e.g., slowing pace, gazing towards the storefront, pointing to the store, whether the time of day is approaching a meal time, etc.) may indicate that the user is considering whether to purchase the goods or services sold at that venue. FIGS. 5F and 5G depict a user approaching the entrance of a restaurant 530”) identify, based on an object recognition, an item for sale in the certain area of the physical retail store; determine that the item corresponds to a category associated with the items; and provide, via the interface, a suggestion to purchase the item (¶33: “The computation component of the ARD may be configured to recognize cues and commands from the user that indicate the beginning and/or end of the negotiation interval. For example, the computation component may be programmed with object recognition software that is able identify objects in the user's environment that are available for purchase… The beginning of the negotiation interval may also be triggered by an AR system suggesting a product for purchase to the user after the AR system has recognized a pattern of behaviors and conditions that indicate the user is in a retail environment”; ¶65: “FIGS. 5B and 5C depict another way in which an AR system may present an offer to a user. As depicted there, the ARD 500 may display an offer 512 above the shopping cart 510, where the offer 512 may provide suggested purchases based on the user's shopping list. The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. 
These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”). With respect to claim 10, Kaehler and Bronicki disclose all of the above limitations. Bronicki further discloses, wherein the one or more components, to identify the one or more physical retail stores, are configured to: transmit, to a cloud computing system or an edge computing system, an indication of the items (¶113: “Image processing unit 130 may include one or more servers connected by a communication network, a cloud platform, and so forth. Consistent with the present disclosure, image processing unit 130 may receive raw or processed data from capturing device 125 via respective communication links, and provide information to different system components using a network 150… image processing unit 130 may use classification algorithms to distinguish between the different products in the retail store. In some embodiments, image processing unit 130 may utilize suitably trained machine learning algorithms and models to perform the product identification”; ¶115: “server 135 may be a cloud server that processes images received directly (or indirectly) from one or more capturing device 125 and processes the images to detect and/or identify at least some of the plurality of products in the image based on visual characteristics of the plurality of products”) and receive, from the cloud computing system or the edge computing system, an indication of the one or more physical retail stores that carry the items (¶2: “systems, methods, and devices for identifying products in retail stores”; ¶32: “selecting a retail store among the plurality of retail stores”; ¶103: “FIGS. 
40A and 40B are illustrations of selecting retail stores based on a route”; ¶116: “server 135 may be part of a system associated with a retail store that communicates with capturing device 125 using a wireless local area network (WLAN) and may provide similar functionality as a cloud server... the store server may be configured to generate a record indicative of changes in product placement that occurred when there was a limited connection (or no connection) between the store server and the cloud server, and to forward the record to the cloud server once connection is reestablished”; ¶240: “Database 1305 may include a cloud-based database or an on-premises database. Database 1305 may include images captured by one or more cameras 1303, simulated images generated by server 1301 and/or user device 1302, configuration data, expression data, datasets, model data (e.g., model parameters, training criteria, performance metrics, etc.), and/or other data”). Kaehler and Bronicki are directed to the same endeavor since they are directed to identifying consumer products for purchase in retail stores and providing navigational assistance in a computing environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the augmented reality (AR) system of Kaehler with the techniques for navigating a walkable environment as taught by Bronicki since it allows for identifying a physical environment corresponding to retail stores, and products at retail stores via cloud server/database (Fig 24, Fig 25, ¶32, ¶103, ¶116, ¶240). With respect to claim 11, Kaehler and Bronicki disclose all of the above limitations. Bronicki further discloses, wherein the in-store navigation path is a shortest path for visiting the locations within the physical retail store at which the items are held (¶103: “FIGS. 
40A and 40B are illustrations of selecting retail stores based on a route”; ¶526: “server 135 may identify an assignment relating to retail store 4014 to provide to user 4004, based on retail store 4014 being the only retail store with an assignment and within a threshold radius of location 4006”; ¶527: “an assignment external to a retail store may not be time-sensitive. For example, user 4004 may be delivering a non-perishable good to location 4006. Server 135 may then provide user 4004 with assignments relating to retail stores near an initial location of user 4004, such as an assignment relating to any of retail stores 4008, 4010, and 4012. Further, server 135 may provide multiple assignments related to retail stores in a batch, such as an assignment relating to each of retail stores 4008, 4010, and 4012”; ¶533: “Assignments in retail stores may also be selected based on one or more preferences of the user. User preferences may be stored in the user's device, and the user's device may select assignments from a list that match the one or more preferences… users may prefer assignments that may be completed with one stop, such as taking a picture of an aisle in a grocery store, rather than multiple stops, such as picking up an item from one location and transporting it to another location”). Kaehler and Bronicki are directed to the same endeavor since they are directed to identifying consumer products for purchase in retail stores and providing navigational assistance in a computing environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the augmented reality (AR) system of Kaehler with the techniques for navigating a walkable environment as taught by Bronicki since it allows for identifying a physical environment corresponding to retail stores, and products at retail stores via cloud server/database (Fig 24, Fig 25, ¶32, ¶103, ¶116, ¶240). 
With respect to claim 12, Kaehler and Bronicki disclose all of the above limitations. Bronicki further discloses, wherein the XR device is a first XR device, wherein the one or more components are configured to: capture video associated with a view of the user when the user is within the physical retail store; (¶81: “FIG. 24 is a visual depiction of an exemplary navigation assistance map-view user interface, consistent with the present disclosure”; ¶82: “FIG. 25 is a visual depiction of an exemplary navigation assistance augmented-reality-view user interface, consistent with the present disclosure”; ¶107: “the image data may include pixel data streams, digital images, digital video streams, data derived from captured images, and data that may be used to construct a 3D image”; ¶108: “capturing device may include one or more image sensors. The term “image sensor” refers to a device capable of detecting and converting optical signals in the near-infrared, infrared, visible, and ultraviolet spectrums into electrical signals. The electrical signals may be used to form image data (e.g., an image or a video stream) based on the detected signal”; ¶113: “an image processing unit 130 to execute the analysis of images captured by the one or more capturing devices 125. Image processing unit 130 may include one or more servers connected by a communication network, a cloud platform, and so forth. Consistent with the present disclosure, image processing unit 130 may receive raw or processed data from capturing device 125 via respective communication links, and provide information to different system components using a network 150. 
Specifically, image processing unit 130 may use any suitable image analysis technique including, for example, object recognition, object detection, image segmentation, feature extraction, optical character recognition (OCR), object-based image analysis, shape region techniques, edge detection techniques, pixel-based detection, artificial neural networks, convolutional neural networks, etc”; ¶115: “server 135 may be a cloud server that processes images received directly (or indirectly) from one or more capturing device 125 and processes the images to detect and/or identify at least some of the plurality of products in the image based on visual characteristics of the plurality of products”; ¶376: “User interface 2500 may include an augmented reality display area 2502, which may include an image (e.g., from a video stream) of a user's environment and/or visual overlays. For example, a device, such as capturing device 125 or output device 145, may capture one or more images using a camera, and may integrate at least portions of the one or more images into augmented reality display area 2502, such as by displaying an image and placing one or more visual overlays on the image. In some embodiments, augmented reality display area 2502 may include a number of shelves 2504 or other objects in an environment of a user. For example, shelves 2504 may be shelves within a retail environment, and may correspond to shelf indicators 2406”; ¶377: “destination overlay indicator 2506 may be an outline in a shape of a selected product. In some examples, the image in augmented reality display area 2502 may be a display of a live feed captured using an image sensor (for example, an image sensor included in a device displaying user interface 2500), for example together with a display of one or more overlays, such as destination overlay indicator 2506 and/or route overlay indicator 2508. 
In one example, when the image in augmented reality display area 2502 changes (for example, due to movement of a device displaying user interface 2500, due to movement of a camera capturing the image, etc.), the position of the overlays may change according to the changes in the image”; ¶378: “Additionally, or alternatively to user interface 2500, destination overlay indicator 2506 and/or route overlay indicator 2508 may be presented using an augmented reality system (such as augmented reality glasses)”; ¶380: “providing visual navigation assistance in retail stores… any combination of steps of process 2600 may be performed by at least one processor of a device such as a handheld device (e.g., a smartphone, a tablet, a mobile station, a personal digital assistant, a laptop, and more), a wearable device (e.g., smart glasses, a smartwatch, a clip-on camera), and/or server…capturing device 125, server 135”; ¶383: “at least one processor may provide first navigation information, which may be first navigation data provided to a user through a first visual interface. In some embodiments, the first visual interface may include at least one of an aisle identifier, a retail area identifier (e.g., “produce”, “household items”, etc.), a shelf identifier, or a product identifier. For example, the second visual interface may include aspects of navigation assistance map-view user interface 2400. 
Additionally or alternatively, the first visual interface may include an image of a product, which may be a product that a user selected at a user interface of a mobile device”; ¶501: “recording audio in a retail location, recording a video of a customer interaction, real-time video and/or audio analysis capabilities”; ¶517: “As used herein, an assignment in a retail store may refer to a task wherein a user enters a store …purchase an item … or may refer to a task wherein the user visits a store…such assignments may include entering the retail store… removing products from a shelf or a display … capturing images and/or videos from the retail store …scanning barcodes (or other visual codes) in the retail store”) Kaehler further discloses, transmit the video to a second XR device; and receive, from the second XR device, a communication based on the video associated with the view of the user, wherein the communication is associated with the items to be purchased at the physical retail store (¶20: “The AR system 100 may comprise a control system or server 104 that is configured to be in communication with one or more augmented reality devices (ARD). The one or more AR devices may comprise one or more ARDs 102a worn by one or more users or shoppers at a retail location and optionally one or more ARDs 102b worn by one or more merchant sales associates at that retail location. The control system or server 104 may be a remote server and may comprise one or more databases stored in one or more machine-readable memories and one or more computer processors that facilitate communication of information/data between the one or more databases and between the one or more databases and one or more ARDs”) Bronicki discloses an image processing unit for analyzing images (video streams) captured by one or more capturing devices and providing information to different system components. 
Bronicki further discloses that when the image in augmented reality display area changes (for example, due to movement of a device displaying user interface 2500, due to movement of a camera capturing the image, etc.), the position of the overlays may change according to the changes in the image; and that additionally, or alternatively to user interface, destination overlay indicator and/or route overlay indicator may be presented using an augmented reality system (such as augmented reality glasses). Kaehler discloses an AR system which comprises a control system or server that is configured to be in communication with one or more augmented reality devices (ARD). The one or more AR devices may comprise one or more ARDs worn by one or more users or shoppers at a retail location to wirelessly communicate with the control server to procure data related to a product being considered for purchase, and present this data to the user to aid in their purchase decision. Kaehler and Bronicki are directed to the same endeavor since they are directed to identifying consumer products for purchase in retail stores and providing navigational assistance in a computing environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the augmented reality (AR) system comprising one or more ARDs of Kaehler with the techniques for navigating a walkable environment as taught by Bronicki since it allows for identifying a physical environment corresponding to retail stores, and providing an augmented reality display/view/map and user interface of a user’s environment with visual overlays and/or route overlay indicator using an augmented reality system (Fig 24, Fig 25, ¶81, ¶82, ¶107, ¶108, ¶113, ¶115, ¶371-¶380, ¶383, ¶388, ¶501, ¶517). 
With respect to claim 13, Kaehler discloses, receiving, via an interface of an extended reality (XR) device, an indication of items to be purchased (¶5: “An AR system may comprise a wearable AR device that is configured to capture information associated with an item for sale in a retail location or store. The wearable AR device (ARD) may be configured to monitor the movement and/or location of the particular item as the shopper moves around the store. When one or more predetermined criteria are met, the item may be designated as “carried” or “purchased” …When the user selects a purchase option on the AR device while viewing or carrying the item, information associated with the item may be relayed to a remote server that may process the purchase transaction …a user's ARD may maintain a database that stores information about the items that are carried or purchased by the user, and the information in the database may be used to prompt the user to pay for unpurchased items carried by the user prior to leaving the retail location. An AR system may optionally provide information about purchasing alternatives to a shopper who is about to purchase an item or product (hereafter “the target”) in a physical retail location”; ¶33: “The AR system may also access a shopping list maintained by the user and cross-reference the items on the list with the product inventory of the retailer, and remind the user that they may wish to consider purchasing an item that is on their shopping list at that particular retail location”; ¶65: “FIGS. 5B and 5C depict another way in which an AR system may present an offer to a user. As depicted there, the ARD 500 may display an offer 512 above the shopping cart 510, where the offer 512 may provide suggested purchases based on the user's shopping list. The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. 
As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”) identifying, by the XR device, a theme associated with the items; identifying, by the XR device, one or more recommended items based on the theme (¶5: “An AR system may comprise a wearable AR device that is configured to capture information associated with an item for sale in a retail location or store… …When the user selects a purchase option on the AR device while viewing or carrying the item, information associated with the item may be relayed to a remote server that may process the purchase transaction …a user's ARD may maintain a database that stores information about the items that are carried or purchased by the user… An AR system may optionally provide information about purchasing alternatives to a shopper who is about to purchase an item or product (hereafter “the target”) in a physical retail location”; ¶65: “FIGS. 5B and 5C depict another way in which an AR system may present an offer to a user. As depicted there, the ARD 500 may display an offer 512 above the shopping cart 510, where the offer 512 may provide suggested purchases based on the user's shopping list. The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. 
These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”) providing, via the interface, a list of items that includes the items to be purchased and the one or more recommended items (¶33: “The AR system may also access a shopping list maintained by the user and cross-reference the items on the list with the product inventory of the retailer, and remind the user that they may wish to consider purchasing an item that is on their shopping list at that particular retail location”; ¶65: “FIGS. 5B and 5C depict another way in which an AR system may present an offer to a user. As depicted there, the ARD 500 may display an offer 512 above the shopping cart 510, where the offer 512 may provide suggested purchases based on the user's shopping list. The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”) identifying, by the XR device and based on the theme, one or more physical retail stores that carry items in the list of items; (¶6: “a system for presenting purchase offers to a user may comprise an augmented reality (AR) device configured to identify a target product being considered by a user for purchase and to identify the price of the target product at a retail location, and a remote server”; ¶7: “Target product data may comprise product category data. In some variations, the alternate product may be the same as the target product and/or may be in the same product category as the target product. 
Displaying the purchase offer may comprise displaying a graphic representing the alternate product and the alternate product offer price”; ¶42: “When the user is at a particular retail location, the user may receive offers from the ONS that originate from an EP that is a merchant in competition with the merchant of the particular retail location. Alternatively, or additionally, the user may receive an offer from the ONS that originates from an EP that is affiliated with the merchant at that particular retail location. In some variations, the ONS may display a ranked list of products in the same category as the target product, where the ranking is determined by user reviews aggregated by the ONS or an EP. Optionally, the ONS may have information stored regarding the inventory of the target product and the suggested products, as well as the location of the inventory. For example, the ONS may have inventory data indicating that the target product and/or suggested products are in stock at an alternate location near the user, and may offer to sell the same or similar product at the alternate location”; claim 7: “…wherein target product data comprises product category data”) providing, via the interface, a list of the one or more physical retail stores (¶6: “a system for presenting purchase offers to a user may comprise an augmented reality (AR) device configured to identify a target product being considered by a user for purchase and to identify the price of the target product at a retail location, and a remote server”; ¶41: “the ONS may have information stored regarding the inventory of the target product and the suggested products, as well as the location of the inventory. 
For example, the ONS may have inventory data indicating that the target product and/or suggested products are in stock at an alternate location near the user, and may offer to sell the same or similar product at the alternate location”; ¶42: “When the user is at a particular retail location, the user may receive offers from the ONS that originate from an EP that is a merchant in competition with the merchant of the particular retail location. Alternatively, or additionally, the user may receive an offer from the ONS that originates from an EP that is affiliated with the merchant at that particular retail location. In some variations, the ONS may display a ranked list of products in the same category as the target product, where the ranking is determined by user reviews aggregated by the ONS or an EP. Optionally, the ONS may have information stored regarding the inventory of the target product and the suggested products, as well as the location of the inventory. For example, the ONS may have inventory data indicating that the target product and/or suggested products are in stock at an alternate location near the user, and may offer to sell the same or similar product at the alternate location”) detecting, by the XR device, that the XR device is within the physical retail store based on a geographic location associated with the XR device (¶10: “The ARD may comprise a proximity detector, a motion detector, a position detector, and a computation component in communication with the proximity detector, motion detector and position detector, where the computation component may be configured to determine whether a wearer is in possession of an item using data from the proximity detector and the motion detector and to generate and transmit a signal to the control server, where the signal may indicate the identity of the item and whether the wearer is in possession of the item. The proximity detector may comprise at least one of a RFID reader, a camera, and a scanner. 
The motion detector may comprise a location estimator configured to detect a change of location of the user device. The position detector may comprise a global positioning system or a wireless based location determining system”; ¶33: “the AR system may detect that the user is geographically co-located with the physical address of a retail location, and that the user has slowed or stopped their walking pace.”; ¶36: “An ARD worn by a shopper may detect (e.g., based on data from the motion detector and/or location sensor) when the shopper enters a retail location having a plurality of products for purchase. The products may be items or objects that are physically located in the retail location or store, and/or may be services that may be purchased and utilized at a time specified by the user (e.g., gift certificates, massage, salon, automobile services and the like)”; ¶47: “A user ARD may transmit information to the MIS, for example, information regarding the location and identity of the user ARD (which may correspond to the location and identity of the user), a list of items associated with the user, and if the user is currently at a retail location, a store ID (or merchant ID) corresponding to the merchant at that retail location”; ¶65: “As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”; ¶66: “an AR system may be able to present suggested products and services for purchase by a user as they stop or slow down near to the retail location. 
That is, the user need not enter the retail location before the AR system, but certain cues (e.g., slowing pace, gazing towards the storefront, pointing to the store, whether the time of day is approaching a meal time, etc.) may indicate that the user is considering whether to purchase the goods or services sold at that venue”) detect, by a camera of the XR device, a store aisle of the one or more store aisles (¶10: “the system comprising a wearable augmented reality device (ARD) having wireless communication capability and a control server in wireless communication with the ARD. The ARD may comprise a proximity detector, a motion detector, a position detector, and a computation component in communication with the proximity detector, motion detector and position detector, where the computation component may be configured to determine whether a wearer is in possession of an item using data from the proximity detector and the motion detector and to generate and transmit a signal to the control server, where the signal may indicate the identity of the item and whether the wearer is in possession of the item. The proximity detector may comprise at least one of a RFID reader, a camera, and a scanner”; ¶107: “the capturing device may include a handheld device (e.g., a smartphone, a tablet, a mobile station, a personal digital assistant, a laptop, and more) or a wearable device (e.g., smart glasses, a smartwatch, a clip-on camera)”; ¶141: “FIG. 4A illustrates how an aisle 400 of retail store 105 may be imaged using a plurality of capturing devices 125 fixedly connected to store shelves. FIG. 4B illustrates how aisle 400 of retail store 105 may be imaged using a handheld communication device. FIG. 
4C illustrates how aisle 400 of retail store 105 may be imaged by robotic devices equipped with cameras”) determining, using object recognition employed by the camera of the XR device, a current location of the XR device within the physical retail store based on the store aisle (¶10: “the system comprising a wearable augmented reality device (ARD) having wireless communication capability and a control server in wireless communication with the ARD. The ARD may comprise a proximity detector, a motion detector, a position detector, and a computation component in communication with the proximity detector, motion detector and position detector, where the computation component may be configured to determine whether a wearer is in possession of an item using data from the proximity detector and the motion detector and to generate and transmit a signal to the control server, where the signal may indicate the identity of the item and whether the wearer is in possession of the item. The proximity detector may comprise at least one of a RFID reader, a camera, and a scanner”; ¶32: “the ARD 210 may comprise, for example, a bar code reader, a quick response (QR) code reader, a scanner, a camera, and/or any other types of information readers”; ¶33: “the computation component may be programmed with object recognition software that is able identify objects in the user's environment that are available for purchase … the computation component of the ARD may have a first computer-implemented method that recognizes the objects within the user's geographical location that are available for purchase … the AR system may detect that the user is geographically co-located with the physical address of a retail location, and that the user has slowed or stopped their walking pace. 
The AR system may also access a shopping list maintained by the user and cross-reference the items on the list with the product inventory of the retailer, and remind the user that they may wish to consider purchasing an item that is on their shopping list at that particular retail location”; ¶34: “The ARD may be configured to recognize the product that is being viewed by the user. It may do this through the operation of computer vision algorithms, RFID, bar code scanning, or any such object recognition mechanisms, and may do so automatically, or only when an action is initiated by the shopper. In some variations, an ARD (e.g., ARD 210) may detect the presence of an item based on data from a proximity sensor and/or scanner (e.g., RFID scanner) of the ARD, and based on relative movement between the user and the item, the system may determine whether the user is interested in the item”; ¶36: “an ARD may be able to detect and identify a product by image recognition and processing methods”; ¶107: “the capturing device may include a handheld device (e.g., a smartphone, a tablet, a mobile station, a personal digital assistant, a laptop, and more) or a wearable device (e.g., smart glasses, a smartwatch, a clip-on camera)”; ¶113: “image processing unit 130 may use any suitable image analysis technique including, for example, object recognition, object detection, image segmentation, feature extraction, optical character recognition (OCR), object-based image analysis, shape region techniques, edge detection techniques, pixel-based detection, artificial neural networks, convolutional neural networks, etc”; ¶141: “FIG. 4A illustrates how an aisle 400 of retail store 105 may be imaged using a plurality of capturing devices 125 fixedly connected to store shelves. FIG. 4B illustrates how aisle 400 of retail store 105 may be imaged using a handheld communication device. FIG. 
4C illustrates how aisle 400 of retail store 105 may be imaged by robotic devices equipped with cameras”) Applicant’s disclosure teaches at ¶36: “The current location may be associated with the XR device 140 and/or the user wearing (or carrying) the XR device 140. The XR device 140, based on object recognition or other related techniques employed by the camera, may determine that the user is in a particular area of the physical retail store (e.g., an entryway or a particular aisle). The XR device 140 may detect an entryway sign, aisle sign numbers, etc. using object recognition, which may enable the XR device 140 to determine the current location within the physical retail store”. Examiner interprets that applicant’s XR device is determining the current location of the user within the retail store. Further, given the broadest reasonable interpretation of the claim limitation in light of the specification, Examiner interprets at least the wearable augmented reality device (ARD) and/or capturing devices including at least a camera and/or any other types of information readers and computation component programmed with object recognition software for identifying and recognizing objects in the user’s environment/geographical location and detecting user location in a retail environment as taught by Kaehler as teaching applicant’s limitation, “determining, using object recognition employed by a camera of the XR device, a current location of the XR device within the physical retail store”. 
Kaehler discloses all of the above limitations. Kaehler does not distinctly describe the following limitation; Bronicki, however, as shown, discloses, download, from a server and prior to arriving at a physical retail store of the one or more physical retail stores, store mapping information associated with the physical retail store, wherein the store mapping information indicates a map of one or more store aisles of the physical retail store (¶240: “Database 1305 may store information and data for the components of system 1300 (e.g., server 1301, user devices 1302, and/or one or more cameras 1303). In some embodiments, server 1301, user devices 1302, and/or one or more cameras 1303 may be configured to access database 1305, and obtain data stored from and/or upload data to database 1305 via digital communication network 1304. Database 1305 may include a cloud-based database or an on-premises database. Database 1305 may include images captured by one or more cameras 1303, simulated images generated by server 1301 and/or user device 1302, configuration data, expression data, datasets, model data (e.g., model parameters, training criteria, performance metrics, etc.), and/or other data, consistent with disclosed embodiments”) determining, using localized processing of the XR device and based on the items and the current location, a next location (¶19: “after providing the first navigation data, receiving a second indoor location of the user within the retail store; determining that the second indoor location is within a selected area around the target destination, the selected area not including the first indoor location; and in response to the determination that the second indoor location is within the selected area around the target destination, providing second navigation data to the user through a second visual interface, the second visual interface differing from the first visual interface”; Fig 24, Fig 25, ¶373: “a user device may determine its location within 
a retail store by using electromagnetic signals, such as GPS signals, Wi-Fi signals, or signals from an indoor localization system. In some embodiments, map 2404 may also include a route indicator 2410, which may indicate a route (e.g., a walking route) through a portion of the retail store. For example, route indicator 2410 may include one or more lines that visually connect user location indicator 2408 to a destination indicator 2412. In some embodiments, route indicator 2410 may include distance information, such as text that denotes a distance between two points along a user's route (e.g., 5 meters, 50 meters, etc.). A destination indicator 2412 may indicate or correspond to a user's destination, which may be particular shelf and/or product location within the retail store”) provide, via the interface, an in-store navigation path to direct a user of the XR device via overlayed audio-visual cues from the current location to the next location (¶19: “a method for providing visual navigation assistance in retail stores may comprise receiving a first indoor location of a user within a retail store; receiving a target destination within the retail store; providing first navigation data to the user through a first visual interface… in response to the determination that the second indoor location is within the selected area around the target destination, providing second navigation data to the user through a second visual interface, the second visual interface differing from the first visual interface”; Fig 1, Fig 2, ¶119: “The term “output device” is intended to include all possible types of devices capable of outputting information from server 135 to users or other computer systems (e.g., a display screen, a speaker, a desktop computer, a laptop computer, mobile device, tablet, a PDA, etc.)”; ¶125: “audio controller 214 and speaker 222 may facilitate audio output from server 135. In addition to or instead of touch screen 218, I/O system 210 may include a display screen 
(e.g., CRT, LCD, etc.), virtual reality device, augmented reality device, and so forth. Specifically, touch screen controller 212 (or display screen controller) and touch screen 218 (or any of the alternatives mentioned above) may facilitate visual output from server 135. Audio controller 214 may be coupled to a microphone 220 and a speaker 222 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions. Specifically, audio controller 214 and speaker 222 may facilitate audio output from server 135”; Fig 24, Fig 25, ¶373: “a user device may determine its location within a retail store by using electromagnetic signals, such as GPS signals, Wi-Fi signals, or signals from an indoor localization system. In some embodiments, map 2404 may also include a route indicator 2410, which may indicate a route (e.g., a walking route) through a portion of the retail store. For example, route indicator 2410 may include one or more lines that visually connect user location indicator 2408 to a destination indicator 2412. In some embodiments, route indicator 2410 may include distance information, such as text that denotes a distance between two points along a user's route (e.g., 5 meters, 50 meters, etc.). A destination indicator 2412 may indicate or correspond to a user's destination, which may be particular shelf and/or product location within the retail store.”; ¶375: “navigation assistance augmented-reality-view user interface 2500. In some embodiments, capturing device 125, output device 145, or any other device may display user interface 2500”; ¶376: “User interface 2500 may include an augmented reality display area 2502, which may include an image (e.g., from a video stream) of a user's environment and/or visual overlays. 
For example, a device, such as capturing device 125 or output device 145, may capture one or more images using a camera, and may integrate at least portions of the one or more images into augmented reality display area 2502, such as by displaying an image and placing one or more visual overlays on the image”; ¶377: “augmented reality display area 2502 may include a destination overlay indicator 2506, which may be a symbol, icon, outline, color, image distortion, or other visual indicator associated with a user's destination. In some embodiments, destination overlay indicator 2506 may correspond to a shelf and/or product. For example, destination overlay indicator 2506 may correspond to a product selected by a user at a user interface other than user interface 2500, to a product from a shopping list corresponding to the user, to a product associated with a coupon corresponding to the user, to an area (e.g., a shelf) and/or a product corresponding to a task assigned to the user, and so forth… In some embodiments, destination overlay indicator 2506 and/or route overlay indicator 2508 may have a partial degree of transparency, which may allow a user to view a portion of an environment covered by a visual overlay, while still being able to understand information conveyed by the visual overlay. Any number of destination overlay indicators 2506 and/or route overlay indicators 2508, as well as other types of overlay indicators, may be placed within augmented reality display area 2502”; ¶379: “map 2404 may be presented using an augmented reality system (such as augmented reality glasses), for example with a high opacity parameter. Additionally, or alternatively to user interface 2500, destination overlay indicator 2506 and/or route overlay indicator 2508 may be presented using an augmented reality system (such as augmented reality glasses). 
For example, destination overlay indicator 2506 may be presented in a location adjacent to or over the location of a selected item (such as a product, a shelf, a label, etc.), route overlay indicator 2508 may be presented in a location indicative of the route, and so forth. In some examples, when a head of a user wearing augmented reality glasses moves (or other part of a user associated with angling an augmented reality device), the location of the destination overlay indicator 2506 and/or route overlay indicator 2508 moves to maintain the relative location of the overlays to selected items in the environment of the user, for example to maintain the location of destination overlay indicator 2506 in a location adjacent to or over the location of the selected item, to keep the location of route overlay indicator 2508 in the location indicative of the route”; ¶388: “the second visual interface may differ from the first visual interface. For example, the second visual interface may be an augmented reality visual interface, which may provide local navigation assistance information to a user (e.g., navigation assistance augmented-reality-view user interface 2500) and the first visual interface may be a map view of the retail store, which may provide a map-like view of an area to a user (e.g., navigation assistance map-view user interface 2400). In some embodiments, process 2600 may include receiving, from a mobile device of the user, at least one image of a physical environment of the user. The physical environment may correspond to a retail store, such that the image may include an aisle, a shelving unit, a product, or any other store structure or item”; ¶532-¶537; ¶536: “server 135 may select a retail store among the plurality of retail stores based on the indication of the external assignment. 
The indication of the external assignment may include a transportation assignment associated with a particular location, and the selected retail store may be determined based on the particular location, as discussed previously by reference to FIGS. 40A and 40B. For instance, the selected retail store may be determined based on a distance from the particular location to the selected retail store”; ¶537: “Server 135 may analyze the travel route and select the retail store based on a measure of a detour to the selected retail store when traveling from the first location to the second location. Server 135 may iteratively, for each of the plurality of retail stores, determine an optimal route and measure an expected detour time or distance to each of the plurality of retail stores”) Kaehler discloses an augmented reality (AR) system that provides information about purchasing alternatives to a user who is about to purchase an item or product (e.g., a target product) in a physical retail location. Kaehler teaches that the AR system may comprise a control system or server that is configured to be in communication with one or more augmented reality devices (ARD). The one or more AR devices may comprise one or more ARDs worn by one or more users or shoppers at a retail location to wirelessly communicate with the control server to procure data related to a product being considered for purchase, and present this data to the user to aid in their purchase decision. Bronicki teaches systems, methods, and devices for identifying products in retail stores, capturing, collecting, and automatically analyzing images of products via an augmented reality system. Bronicki also teaches a navigation assistance augmented-reality view user interface including a map-like view of an area to a user. 
Kaehler and Bronicki are directed to the same endeavor, as both concern identifying consumer products for purchase in retail stores and providing navigational assistance in a computing environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the augmented reality (AR) system of Kaehler with the techniques for navigating a walkable environment as taught by Bronicki since it allows for identifying a physical environment corresponding to retail stores, and providing an augmented reality display/view/map and user interface of a user’s environment with visual overlays (Fig 1, Fig 2, Fig 24, Fig 25, ¶19, ¶119, ¶125, ¶371-¶379, ¶388, ¶523-¶537). With respect to claim 14, Kaehler and Bronicki disclose all of the above limitations, Kaehler further discloses, further comprising: receiving, via the interface, a set of preferences associated with the user and verifying that the one or more recommended items are aligned with the set of preferences (Fig 3, ¶44: “One example of a method that may be performed by an AR system when a user considering a target for purchase is depicted in the flow diagram of FIG. 3. The AR system may comprise an ONS, which may have a computer or controller that is configured to implement one or more of the steps in the method 300. When the ARD detects a potential purchase event 302, the ARD may initiate the negotiation interval…. 
The ONS may then sort 312 the generated offers based on a variety of parameters, for example, paid sponsorship of a particular OS by a particular EP, by lowest or highest price, by earliest or latest delivery dates, and/or by any user-specified preferences … users may wish to see offers from only a certain type of EP (e.g., eco-friendly EPs, fair-trade EPs, etc.), and/or may wish to block offers from other EPs (e.g., EPs that have spammed them in the past, EPs that the user wishes to boycott for personal reasons, etc.). Users may also wish to see only certain types of offers… offers may be sorted according to value criteria, which may indicate that certain types of offers may be more interesting to the user than other types of offers. The value criteria may be selected by the user … the offers may be sorted and presented to the user according to how the user has preferred to see the offers in the past, and/or their demographic information… After the offer is presented, the user has the option to respond 318 by finally accepting, provisionally accepting and/or rejecting the offer. If the user finally accepts the offer, the ONS may send a notice 320 to the EP to inform them that their offer has been accepted. Optionally, the ONS may ask the user to re-confirm 322 their acceptance of the offer”; ¶54: “Method 350 may further comprise the user ARD transmitting (step 364) data about the item to the AR system control server. Item data may include, for instance, item identification, price, quantity, etc. The user ARD may also transmit the user's ID to the AR system control server so that the user's payment preferences and methods (if any) may be obtained from a database”; ¶67: “shopper data (e.g., demographic data, personal shopping lists, purchase history and patterns, budget and/or financial profile, etc.). 
Such data is cross-indexed so that merchants can collect data analytics (subject to the privacy setting approved by the user) on the types of shoppers they attract and products sold, and shoppers can compare prices across different merchants and select and/or purchase items or services within their desired budget and in accordance with their needs. This information may also be used by the ONS and/or an EP to dynamically formulate offers that may be presented to the user on their ARD as the user peruses the products at a retail location”; ¶70: “Shopping settings and preferences may be set by the user via their ARD which may be resident on the ARD or stored in the ONS”; ¶71: “Various types of information about the user may be shared from their ARD with the ONS and/or the EPs. For example, demographic information (e.g., age, gender, and ethnicity), purchase habits, history and preferences, shopping lists and the like may be shared with the ONS and/or the EPs”) With respect to claim 16, Kaehler and Bronicki disclose all of the above limitations, Kaehler further discloses, determining, based on store information associated with the physical retail store, an additional item that is offered at a discounted price at the physical retail store, wherein the additional item is related to the theme (¶7: “The AR device may be configured to do this by acquiring, using an image sensor of the AR device, an image of the user and executing instructions stored on computer-readable media of the augmented reality device that recognizes visual features in the image that indicate close proximity between the target product and the user. Target product data may comprise product category data. 
In some variations, the alternate product may be the same as the target product and/or may be in the same product category as the target product. Displaying the purchase offer may comprise displaying a graphic representing the alternate product and the alternate product offer price. A method for presenting purchase offers to a user may also comprise transmitting a signal from the AR device to the remote server indicating whether the user has accepted the purchase offer”; ¶9: “Another variation of a method of presenting purchase offers to a user may comprise identifying, using an augmented reality (AR) device, a target product that is being considered for purchase by a user, transmitting target product data from the augmented reality device to a remote server, wherein target product data comprises target product identification data and target product price, and executing a computer-implemented method on the remote server to generate a purchase offer. In some variations, the computer-implemented method may comprise identifying, using the product identification data, a purchase offer data structure having an alternate product and an alternate product offer price, comparing the target product price and the alternate product offer price, transmitting the purchase offer data structure to the augmented reality device if the alternate product offer price is less than the target product price, and displaying the purchase offer from the remote server to the user via the augmented reality device”; ¶33: “The computation component of the ARD may be configured to recognize cues and commands from the user that indicate the beginning and/or end of the negotiation interval. For example, the computation component may be programmed with object recognition software that is able to identify objects in the user's environment that are available for purchase”; ¶65: “FIGS. 5B and 5C depict another way in which an AR system may present an offer to a user. 
As depicted there, the ARD 500 may display an offer 512 above the shopping cart 510, where the offer 512 may provide suggested purchases based on the user's shopping list. The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”) Bronicki further discloses, providing, via the interface, the in-store navigation path to direct the user to a location within the physical retail store at which the one or more recommended items are held (Fig 24, Fig 25, ¶373: “a user device may determine its location within a retail store by using electromagnetic signals, such as GPS signals, Wi-Fi signals, or signals from an indoor localization system. In some embodiments, map 2404 may also include a route indicator 2410, which may indicate a route (e.g., a walking route) through a portion of the retail store. For example, route indicator 2410 may include one or more lines that visually connect user location indicator 2408 to a destination indicator 2412. In some embodiments, route indicator 2410 may include distance information, such as text that denotes a distance between two points along a user's route (e.g., 5 meters, 50 meters, etc.). A destination indicator 2412 may indicate or correspond to a user's destination, which may be particular shelf and/or product location within the retail store.”; ¶375: “navigation assistance augmented-reality-view user interface 2500. 
In some embodiments, capturing device 125, output device 145, or any other device may display user interface 2500”; ¶376: “User interface 2500 may include an augmented reality display area 2502, which may include an image (e.g., from a video stream) of a user's environment and/or visual overlays. For example, a device, such as capturing device 125 or output device 145, may capture one or more images using a camera, and may integrate at least portions of the one or more images into augmented reality display area 2502, such as by displaying an image and placing one or more visual overlays on the image”; ¶377: “augmented reality display area 2502 may include a destination overlay indicator 2506, which may be a symbol, icon, outline, color, image distortion, or other visual indicator associated with a user's destination. In some embodiments, destination overlay indicator 2506 may correspond to a shelf and/or product. For example, destination overlay indicator 2506 may correspond to a product selected by a user at a user interface other than user interface 2500, to a product from a shopping list corresponding to the user, to a product associated with a coupon corresponding to the user, to an area (e.g., a shelf) and/or a product corresponding to a task assigned to the user, and so forth… In some embodiments, destination overlay indicator 2506 and/or route overlay indicator 2508 may have a partial degree of transparency, which may allow a user to view a portion of an environment covered by a visual overlay, while still being able to understand information conveyed by the visual overlay. Any number of destination overlay indicators 2506 and/or route overlay indicators 2508, as well as other types of overlay indicators, may be placed within augmented reality display area 2502”; ¶378: “map 2404 may be presented using an augmented reality system (such as augmented reality glasses), for example with a high opacity parameter. 
Additionally, or alternatively to user interface 2500, destination overlay indicator 2506 and/or route overlay indicator 2508 may be presented using an augmented reality system (such as augmented reality glasses). For example, destination overlay indicator 2506 may be presented in a location adjacent to or over the location of a selected item (such as a product, a shelf, a label, etc.), route overlay indicator 2508 may be presented in a location indicative of the route, and so forth. In some examples, when a head of a user wearing augmented reality glasses moves (or other part of a user associated with angling an augmented reality device), the location of the destination overlay indicator 2506 and/or route overlay indicator 2508 moves to maintain the relative location of the overlays to selected items in the environment of the user, for example to maintain the location of destination overlay indicator 2506 in a location adjacent to or over the location of the selected item, to keep the location of route overlay indicator 2508 in the location indicative of the route”; ¶379: “Additionally or alternatively to user interface 2400, map 2404 may be presented using an augmented reality system (such as augmented reality glasses), for example with a high opacity parameter. Additionally, or alternatively to user interface 2500, destination overlay indicator 2506 and/or route overlay indicator 2508 may be presented using an augmented reality system (such as augmented reality glasses)”; ¶388: “the second visual interface may differ from the first visual interface. For example, the second visual interface may be an augmented reality visual interface, which may provide local navigation assistance information to a user (e.g., navigation assistance augmented-reality-view user interface 2500) and the first visual interface may be a map view of the retail store, which may provide a map-like view of an area to a user (e.g., navigation assistance map-view user interface 2400). 
In some embodiments, process 2600 may include receiving, from a mobile device of the user, at least one image of a physical environment of the user. The physical environment may correspond to a retail store, such that the image may include an aisle, a shelving unit, a product, or any other store structure or item”) Kaehler and Bronicki are directed to the same endeavor, as both concern identifying consumer products for purchase in retail stores and providing navigational assistance in a computing environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the augmented reality (AR) system of Kaehler with the techniques for identifying products in retail stores as taught by Bronicki since it allows for identifying a physical environment corresponding to retail stores, whereby a user’s destination corresponding to a shelf and/or product on a user’s shopping list may be displayed via the navigation assistance augmented-reality-view user interface (Fig 24, Fig 25, ¶373-¶379). With respect to claim 17, Kaehler and Bronicki disclose all of the above limitations, Kaehler further discloses, detecting, via a camera of the XR device, that an item included in the list of items is added to a shopping cart (¶32: “the ARD 210 may comprise, for example, a bar code reader, a quick response (QR) code reader, a scanner, a camera, and/or any other types of information readers”; ¶49: “FIG. 6B is one variation of a database that contains the specific instance(s) of item(s) in the possession of a user (i.e., item is not on the shelf and/or is carried by a user and/or is purchased by a user), the status of the item(s), and the location of the item(s). FIG. 6C is another variation of a database that tracks the specific instance(s) of item(s) in the possession of a user, for example, item instance ID, customer ID, and location data. …The databases of FIGS. 
6B and 6C may each comprise an array or hash table of individual item data structures that are each instantiated when a particular item is in the possession of a user (e.g., picked up from the shelf and put into the user's cart, as detected by the user ARD and communicated to the control server/MIS)”) providing, via the interface, the in-store navigation path to direct the user to a next location within the physical retail store, wherein the next location is associated with a next item in the list of items or a checkout location in the physical retail store (¶49: “FIG. 6B is one variation of a database that contains the specific instance(s) of item(s) in the possession of a user (i.e., item is not on the shelf and/or is carried by a user and/or is purchased by a user), the status of the item(s), and the location of the item(s). FIG. 6C is another variation of a database that tracks the specific instance(s) of item(s) in the possession of a user, for example, item instance ID, customer ID, and location data. …The databases of FIGS. 6B and 6C may each comprise an array or hash table of individual item data structures that are each instantiated when a particular item is in the possession of a user (e.g., picked up from the shelf and put into the user's cart, as detected by the user ARD and communicated to the control server/MIS)”; ¶51: “the list of carried items (their item IDs or just the fact that unpurchased items are identified as “carried”) may be continuously (or periodically) presented on the user ARD as he/she continues to browse the retail location. 
The user ARD may also provide a checkout option to the shopper so that the shopper can complete the purchase of one or more of the carried item at any time while still in the store”; ¶52: “the user ARD may present on a display a purchase screen that allows a user to select the item for purchase, and/or may display representations of all the items that are in the possession of the user that have not yet been purchased. The user may then select and/or authorize the payment of one or more of the items displayed by the ARD. Optionally, the ARD may prompt the user for payment information and transmit the payment information along with other information (e.g., customer ID, item ID, status) associated with the purchase to the AR system control server for processing”; ¶65: “The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”) With respect to claim 18, Kaehler and Bronicki disclose all of the above limitations, Kaehler further discloses, detecting, using the camera of the XR device, a gaze associated with the user (¶7: “identifying the target product using the AR device may comprise detecting that the user is interested in the target product. For example, the AR device may determine, based on the direction of the user's head as measured by a motion sensor or orientation sensor and/or eye-tracking sensors, that the user is looking at a target product (e.g., the duration of user gaze on one product is relatively longer than the gaze on other products). 
Alternatively, or additionally, image sensors on the AR device may detect that the user has physically engaged with the target product, for example, by grasping or holding it in the field-of-view of the AR device image sensors”) identifying, based on an object recognition, an item for sale in the physical retail store that is aligned with the gaze associated with the user (¶7: “The AR device may be configured to do this by acquiring, using an image sensor of the AR device, an image of the user and executing instructions stored on computer-readable media of the augmented reality device that recognizes visual features in the image that indicate close proximity between the target product and the user. Target product data may comprise product category data. In some variations, the alternate product may be the same as the target product and/or may be in the same product category as the target product. Displaying the purchase offer may comprise displaying a graphic representing the alternate product and the alternate product offer price. A method for presenting purchase offers to a user may also comprise transmitting a signal from the AR device to the remote server indicating whether the user has accepted the purchase offer”; ¶9: “Another variation of a method of presenting purchase offers to a user may comprise identifying, using an augmented reality (AR) device, a target product that is being considered for purchase by a user, transmitting target product data from the augmented reality device to a remote server, wherein target product data comprises target product identification data and target product price, and executing a computer-implemented method on the remote server to generate a purchase offer. 
In some variations, the computer-implemented method may comprise identifying, using the product identification data, a purchase offer data structure having an alternate product and an alternate product offer price, comparing the target product price and the alternate product offer price, transmitting the purchase offer data structure to the augmented reality device if the alternate product offer price is less than the target product price, and displaying the purchase offer from the remote server to the user via the augmented reality device”; ¶33: “The computation component of the ARD may be configured to recognize cues and commands from the user that indicate the beginning and/or end of the negotiation interval. For example, the computation component may be programmed with object recognition software that is able to identify objects in the user's environment that are available for purchase”) determining that the item corresponds to the theme associated with the list of items; and providing, via the interface, a suggestion to purchase the item. (¶5: “An AR system may optionally provide information about purchasing alternatives to a shopper who is about to purchase an item or product (hereafter “the target”) in a physical retail location. Information about purchasing alternatives may be provided by the merchant and/or competitors to that merchant. Information provided by the merchant may include recommendations for a similar product that better suits the needs of the shopper and/or related or correlated products that are associated with the target, where the recommended products are sold by the merchant. The recommended products may be sold at the physical retail location, and/or on the merchant's website. 
They may include incentives to purchase from the retailer based on time, location, inventory, or facts about that particular customer”; ¶7: “The AR device may be configured to do this by acquiring, using an image sensor of the AR device, an image of the user and executing instructions stored on computer-readable media of the augmented reality device that recognizes visual features in the image that indicate close proximity between the target product and the user. Target product data may comprise product category data. In some variations, the alternate product may be the same as the target product and/or may be in the same product category as the target product. Displaying the purchase offer may comprise displaying a graphic representing the alternate product and the alternate product offer price. A method for presenting purchase offers to a user may also comprise transmitting a signal from the AR device to the remote server indicating whether the user has accepted the purchase offer”; ¶9: “Another variation of a method of presenting purchase offers to a user may comprise identifying, using an augmented reality (AR) device, a target product that is being considered for purchase by a user, transmitting target product data from the augmented reality device to a remote server, wherein target product data comprises target product identification data and target product price, and executing a computer-implemented method on the remote server to generate a purchase offer. 
In some variations, the computer-implemented method may comprise identifying, using the product identification data, a purchase offer data structure having an alternate product and an alternate product offer price, comparing the target product price and the alternate product offer price, transmitting the purchase offer data structure to the augmented reality device if the alternate product offer price is less than the target product price, and displaying the purchase offer from the remote server to the user via the augmented reality device”; ¶33: “The computation component of the ARD may be configured to recognize cues and commands from the user that indicate the beginning and/or end of the negotiation interval. For example, the computation component may be programmed with object recognition software that is able to identify objects in the user's environment that are available for purchase”) With respect to claim 19, Kaehler and Bronicki disclose all of the above limitations, Kaehler further discloses, detecting, using a sensor of the XR device, an attention-based behavior of the user while the user is in a certain area of the physical retail store (¶7: “the AR device may determine, based on the direction of the user's head as measured by a motion sensor or orientation sensor and/or eye-tracking sensors, that the user is looking at a target product (e.g., the duration of user gaze on one product is relatively longer than the gaze on other products). Alternatively, or additionally, image sensors on the AR device may detect that the user has physically engaged with the target product, for example, by grasping or holding it in the field-of-view of the AR device image sensors. 
The AR device may be configured to do this by acquiring, using an image sensor of the AR device, an image of the user and executing instructions stored on computer-readable media of the augmented reality device that recognizes visual features in the image that indicate close proximity between the target product and the user. Target product data may comprise product category data”; ¶66: “FIGS. 5D and 5E depict another variation of offers that may be presented to a user, for example, at a restaurant 520. The user may select an entrée (FIG. 5D) and the AR system may suggest a wine, appetizer, and/or side dishes that complement the main entrée (FIG. 5E). In some variations, an AR system may be able to present suggested products and services for purchase by a user as they stop or slow down near to the retail location. That is, the user need not enter the retail location before the AR system, but certain cues (e.g., slowing pace, gazing towards the storefront, pointing to the store, whether the time of day is approaching a meal time, etc.) may indicate that the user is considering whether to purchase the goods or services sold at that venue. FIGS. 5F and 5G depict a user approaching the entrance of a restaurant 530”) identifying, based on an object recognition, an item for sale in the certain area of the physical retail store; wherein the item corresponds to the theme associated with the list of items; determining that the item corresponds to a category associated with the items; and providing, via the interface, a suggestion to purchase the item (¶33: “The computation component of the ARD may be configured to recognize cues and commands from the user that indicate the beginning and/or end of the negotiation interval. 
For example, the computation component may be programmed with object recognition software that is able to identify objects in the user's environment that are available for purchase… The beginning of the negotiation interval may also be triggered by an AR system suggesting a product for purchase to the user after the AR system has recognized a pattern of behaviors and conditions that indicate the user is in a retail environment”; ¶65: “FIGS. 5B and 5C depict another way in which an AR system may present an offer to a user. As depicted there, the ARD 500 may display an offer 512 above the shopping cart 510, where the offer 512 may provide suggested purchases based on the user's shopping list. The offer 512 may be presented in addition to any reminders the ARD may provide to the user regarding their shopping list. As the user peruses the retail location (e.g., grocery market) for the items on the shopping list, the ARD may also provide recommendations 514, 516 of various products, prompting the user to direct their attention to those products on the shelves, as shown in FIG. 5C. These offers 514, 516 may provide suggestions to purchase items that are in-stock and available at the retail location”) With respect to claim 20, Kaehler and Bronicki disclose all of the above limitations, Bronicki further discloses, wherein the XR device is a first XR device, and further comprising: capturing video associated with a view of the user when the user is within the physical retail store; (¶81: “FIG. 24 is a visual depiction of an exemplary navigation assistance map-view user interface, consistent with the present disclosure”; ¶82: “FIG. 
25 is a visual depiction of an exemplary navigation assistance augmented-reality-view user interface, consistent with the present disclosure”; ¶107: “the image data may include pixel data streams, digital images, digital video streams, data derived from captured images, and data that may be used to construct a 3D image”; ¶108: “capturing device may include one or more image sensors. The term “image sensor” refers to a device capable of detecting and converting optical signals in the near-infrared, infrared, visible, and ultraviolet spectrums into electrical signals. The electrical signals may be used to form image data (e.g., an image or a video stream) based on the detected signal”; ¶115: “server 135 may be a cloud server that processes images received directly (or indirectly) from one or more capturing device 125 and processes the images to detect and/or identify at least some of the plurality of products in the image based on visual characteristics of the plurality of products”; ¶376: “User interface 2500 may include an augmented reality display area 2502, which may include an image (e.g., from a video stream) of a user's environment and/or visual overlays. For example, a device, such as capturing device 125 or output device 145, may capture one or more images using a camera, and may integrate at least portions of the one or more images into augmented reality display area 2502, such as by displaying an image and placing one or more visual overlays on the image. In some embodiments, augmented reality display area 2502 may include a number of shelves 2504 or other objects in an environment of a user.
For example, shelves 2504 may be shelves within a retail environment, and may correspond to shelf indicators 2406”; ¶501: “recording audio in a retail location, recording a video of a customer interaction, real-time video and/or audio analysis capabilities”) transmitting the video to a second XR device; and receiving, from the second XR device, a communication based on the video associated with the view of the user, wherein the communication is associated with the items to be purchased at the physical retail store. (Fig 1A, ¶20: “The AR system 100 may comprise a control system or server 104 that is configured to be in communication with one or more augmented reality devices (ARD). The one or more AR devices may comprise one or more ARDs 102a worn by one or more users or shoppers at a retail location and optionally one or more ARDs 102b worn by one or more merchant sales associates at that retail location. The control system or server 104 may be a remote server and may comprise one or more databases stored in one or more machine-readable memories and one or more computer processors that facilitate communication of information/data between the one or more databases and between the one or more databases and one or more ARDs”; ¶115: “server 135 may be a cloud server that processes images received directly (or indirectly) from one or more capturing device 125 and processes the images to detect and/or identify at least some of the plurality of products in the image based on visual characteristics of the plurality of products”; ¶379: “Additionally or alternatively to user interface 2400, map 2404 may be presented using an augmented reality system (such as augmented reality glasses), for example with a high opacity parameter. Additionally, or alternatively to user interface 2500, destination overlay indicator 2506 and/or route overlay indicator 2508 may be presented using an augmented reality system (such as augmented reality glasses). 
¶380; ¶383; ¶517: “As used herein, an assignment in a retail store may refer to a task wherein a user enters a store …purchase an item … or may refer to a task wherein the user visits a store…such assignments may include entering the retail store… removing products from a shelf or a display … capturing images and/or videos from the retail store …scanning barcodes (or other visual codes) in the retail store”) Kaehler and Bronicki are directed to the same endeavor since they are directed to identifying consumer products for purchase in retail stores and providing navigational assistance in a computing environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the augmented reality (AR) system of Kaehler with the techniques for navigating a walkable environment as taught by Bronicki since it allows for recording a video of a customer interaction, real-time video and/or audio analysis capability (¶20, ¶115, ¶379, ¶501). With respect to claim 25, Kaehler and Bronicki disclose all of the above limitations, Kaehler further discloses, wherein the one or more components of the XR device are configured to: detect, via the camera of the XR device, that one of the items is picked up by the user; and update a status associated with the items to be purchased (¶5: “An AR system may comprise a wearable AR device that is configured to capture information associated with an item for sale in a retail location or store. The wearable AR device (ARD) may be configured to monitor the movement and/or location of the particular item as the shopper moves around the store. When one or more predetermined criteria are met, the item may be designated as “carried” or “purchased.” For example, when the item is detected to be at a location different from its original location but within the store perimeter, it may be designated as “carried”. 
When the user selects a purchase option on the AR device while viewing or carrying the item, information associated with the item may be relayed to a remote server that may process the purchase transaction. At the completion of the purchase transaction, the item may be designated as “purchased.” The AR systems and methods disclosed herein may facilitate the purchase of an item (e.g., the transfer of funds from the user to the merchant in exchange for one or more items selected by the user at the retail location) by allowing the purchase transaction to occur between a user's ARD and a server”) With respect to claims 26-28, Kaehler and Bronicki disclose all of the above limitations, Kaehler further discloses, wherein the one or more components are further configured to: detect, using an eye-tracking camera of the XR device, a gaze associated with the user; capture, using the camera and based on correlating the gaze, a field of view associated with the user (¶7: “the AR device may determine, based on the direction of the user's head as measured by a motion sensor or orientation sensor and/or eye-tracking sensors, that the user is looking at a target product (e.g., the duration of user gaze on one product is relatively longer than the gaze on other products). Alternatively, or additionally, image sensors on the AR device may detect that the user has physically engaged with the target product, for example, by grasping or holding it in the field-of-view of the AR device image sensors. The AR device may be configured to do this by acquiring, using an image sensor of the AR device, an image of the user and executing instructions stored on computer-readable media of the augmented reality device that recognizes visual features in the image that indicate close proximity between the target product and the user. Target product data may comprise product category data. 
In some variations, the alternate product may be the same as the target product and/or may be in the same product category as the target product”; ¶33: “the AR system may determine that the user is interested in an object by analyzing the gaze of the shopper, recognizing that the shopper has touched, or picked up the product, or by explicit signaling from the shopper using any user interface (UI) idiom or totem available through the ARD or its associated components (e.g. a wired or wirelessly connected accessory). The presence of the items in combination with the shopper's movement or location can determine whether the shopper is interested in purchasing the items”; ¶37: “an ARD may use image recognition methods to determine when an item that is available for purchase is within the field-of-view (e.g., in the center of the field-of-view) of the glasses of the ARD”) determine, using an accelerometer to detect that a period of time of the user standing in the store aisle satisfies a threshold, that the user is interested in an item in the field of view; and provide, via the interface, a suggestion to purchase the item (¶7: “using an augmented reality (AR) device, a target product that is being considered for purchase by a user, transmitting target product data from the augmented reality device to a remote server, where target product data may comprise target product identification data and target product price, and executing a computer-implemented method on the remote server to generate a purchase offer… The target product data may further comprise the geographic location of the target product, and optionally, the geographic location may be represented by GPS coordinates. In some variations, identifying the target product using the AR device may comprise detecting that the user is interested in the target product. 
For example, the AR device may determine, based on the direction of the user's head as measured by a motion sensor or orientation sensor and/or eye-tracking sensors, that the user is looking at a target product (e.g., the duration of user gaze on one product is relatively longer than the gaze on other products). Alternatively, or additionally, image sensors on the AR device may detect that the user has physically engaged with the target product, for example, by grasping or holding it in the field-of-view of the AR device image sensors”; ¶10: “The ARD may comprise a proximity detector, a motion detector, a position detector, and a computation component in communication with the proximity detector, motion detector and position detector… The motion detector may comprise a location estimator configured to detect a change of location of the user device. The position detector may comprise a global positioning system or a wireless based location determining system”; ¶33: “a computer-implemented feature detection method may use visual cues (e.g., signals from the ARD image sensors) indicating that the user has looked at one item longer than surrounding items, and generate a signal indicating user interest in that item. The beginning of the negotiation interval may also be triggered by an AR system suggesting a product for purchase to the user after the AR system has recognized a pattern of behaviors and conditions that indicate the user is in a retail environment. For example, the AR system may detect that the user is geographically co-located with the physical address of a retail location, and that the user has slowed or stopped their walking pace… the AR system may determine that the user is interested in an object by analyzing the gaze of the shopper, recognizing that the shopper has touched, or picked up the product, or by explicit signaling from the shopper using any user interface (UI) idiom or totem available through the ARD or its associated components (e.g. 
a wired or wirelessly connected accessory). The presence of the items in combination with the shopper's movement or location can determine whether the shopper is interested in purchasing the items”; ¶34: “if a motion detector of the ARD senses that the user is moving and the scanner continually senses the presence or proximity of the item, the AR system may prompt the user to initiate a process to purchase the item that is in the possession of the user. In other examples, the orientation of the user's head with respect to the location of the item may indicate that the user is visually interested in the item, even if the user ultimately does not pick up the item.”; ¶51: “the list of carried items (their item IDs or just the fact that unpurchased items are identified as “carried”) may be continuously (or periodically) presented on the user ARD as he/she continues to browse the retail location. The user ARD may also provide a checkout option to the shopper so that the shopper can complete the purchase of one or more of the carried item at any time while still in the store”; ¶54: “the user ARD and/or AR system controller (e.g., MIS) may periodically prompt the user to purchase items that are in their possession (e.g., every 15 minutes, after the user has reached a threshold number of unpurchased items in their possession, if the user is detected as nearing an exit of the retail location, etc.)”) Kaehler discloses that the AR system may comprise a proximity detector, a motion detector, a position detector, and a computation component in communication with the proximity detector, motion detector and position detector. 
Kaehler further discloses that the AR system may determine that the user is interested in an object the user is viewing, or an item within the field of view of the glasses of the ARD, by analyzing the gaze of the shopper via image recognition methods, motion sensors/detectors, and/or eye-tracking sensors, or by explicit signaling from the shopper via a user interface on the ARD or its associated components (wired or wirelessly connected accessory). Kaehler also teaches detecting when a user has slowed or stopped their walking pace via a motion detector, and a negotiation interval, which is used to determine a time interval beginning with the customer's interest in a target (viewing, touching, or picking up a target, and/or when the user explicitly indicates considering a product for purchase). Examiner interprets at least the motion detector comprising a location estimator to detect a change of location of the user device, as taught by Kaehler, as teaching applicant's accelerometer.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Chachek et al., US Patent Application Publication No. US 2020/0302510 A1, “System, Device, and Method of Augmented Reality Based Mapping of a Venue and Navigation within a Venue”, relating to Augmented Reality (AR) based mapping of a venue and its real-time inventory, as well as real-time navigation within such venue.

Marguello, US Patent Application Publication No. US 2020/0286161 A1, “Service Provider System and Method”, relating to a method/system including a plurality of client computing devices and one or more servers so that system users can communicate and store data via downloaded software to buy goods.
Nigul, US Patent Application Publication No. US 2023/0385875 A1, “Smart Shopping Cart with Onboard Computing System Gathering Contextual Data and Displaying Information Related to an Item Based Thereon”, relating to a shopping cart system that detects the initiation of a shopping session within a physical retail store by a customer and displays information related to an item in a physical retail store based on contextual information associated with the shopping cart system.

Perks et al., US Patent Application Publication No. US 2012/0123673 A1, “Generating a Map that Includes Location and Price of Products in a Shopping List”, relating to searching inventories of retail stores in a geographic region of interest to the user and generating a map that includes graphical icons representative of the retail stores, data indicating that product(s) in the shopping list are available at the retail stores, and price data indicating prices of product(s) in the shopping list at the respective retail stores.

Galvin et al., US Patent Application Publication No. US 2009/0234700 A1, “Systems and Methods for Electronic Interaction with Customers in a Retail Establishment”, relating to a mobile communication appliance carried and utilized by a customer within the premises of the physical retail establishment and coupled to the server for communication. Information services regarding products displayed for sale are provided to the customer through the appliance within the physical retail establishment.

Argue et al., US Patent No. 9,082,149 B2, “System and Method for Providing Sales Assistance to a Consumer Wearing an Augmented Reality Device in a Physical Store”, relating to a method/system for receiving, with a processing device of a commerce server, a help request signal from an augmented reality device worn by a current consumer shopping in a retail store.
The method/system also includes the step of selecting, with the processing device, a sales assistant to help the current consumer.

Any inquiry of a general nature or relating to the status of this application or concerning this communication or earlier communications from the Examiner should be directed to Kimberly L. Evans, whose telephone number is 571.270.3929. The Examiner can normally be reached on Monday-Friday, 9:30am-5:00pm. If attempts to reach the Examiner by telephone are unsuccessful, the Examiner's supervisor, Lynda Jasmin, can be reached at 571.272.6782.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://portal.uspto.gov/external/portal/pair <http://pair-direct.uspto.gov>. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866.217.9197 (toll-free).

Any response to this action should be mailed to: Commissioner of Patents and Trademarks, P.O. Box 1450, Alexandria, VA 22313-1450, or faxed to 571-273-8300. Hand-delivered responses should be brought to the United States Patent and Trademark Office Customer Service Window: Randolph Building, 401 Dulany Street, Alexandria, VA 22314.

/KIMBERLY L EVANS/
Examiner, Art Unit 3629

/LYNDA JASMIN/
Supervisory Patent Examiner, Art Unit 3629
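The interest-detection behavior the examiner maps onto the claims (Kaehler ¶7, ¶33: gaze dwell on one product relatively longer than on surrounding products, combined with the user slowing or stopping their walking pace) can be sketched as a simple heuristic. This is an illustrative reconstruction, not code from either reference; the `ProductGaze` structure, function names, and all thresholds are assumptions chosen for the sketch:

```python
from dataclasses import dataclass

@dataclass
class ProductGaze:
    product_id: str
    gaze_seconds: float  # cumulative gaze dwell on this product

def detect_interest(gazes, user_speed_mps, dwell_threshold_s=3.0,
                    stop_speed_mps=0.2, relative_factor=2.0):
    """Return the product the user appears interested in, or None.

    Mirrors the heuristic described in the record: interest is signaled
    when gaze dwell on one product is relatively longer than on the
    surrounding products AND the user has slowed or stopped walking.
    All thresholds here are illustrative assumptions.
    """
    if user_speed_mps > stop_speed_mps:
        return None  # user is still walking; no interest signal
    if not gazes:
        return None
    ranked = sorted(gazes, key=lambda g: g.gaze_seconds, reverse=True)
    top, others = ranked[0], ranked[1:]
    baseline = max((g.gaze_seconds for g in others), default=0.0)
    # Require both an absolute dwell time and a relative margin over
    # the next-most-viewed product before signaling interest.
    if top.gaze_seconds >= dwell_threshold_s and top.gaze_seconds >= relative_factor * baseline:
        return top.product_id
    return None
```

A real AR pipeline would feed this from eye-tracking and motion-sensor streams rather than precomputed dwell totals; the point is only that the claimed determination reduces to comparing dwell times against absolute and relative thresholds while the user is stationary.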

Prosecution Timeline

Aug 31, 2022: Application Filed
Mar 12, 2025: Non-Final Rejection (§101, §103)
May 01, 2025: Interview Requested
May 07, 2025: Applicant Interview (Telephonic)
May 11, 2025: Examiner Interview Summary
May 21, 2025: Response Filed
Aug 22, 2025: Final Rejection (§101, §103)
Sep 24, 2025: Interview Requested
Oct 14, 2025: Response after Non-Final Action
Oct 30, 2025: Request for Continued Examination
Nov 08, 2025: Response after Non-Final Action
Dec 22, 2025: Non-Final Rejection (§101, §103)
Feb 18, 2026: Interview Requested
Mar 04, 2026: Applicant Interview (Telephonic)
Mar 12, 2026: Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602661: SYSTEM FOR SEARCHING AND CORRELATING ONLINE ACTIVITY WITH INDIVIDUAL CLASSIFICATION FACTORS (granted Apr 14, 2026; 2y 5m to grant)
Patent 12277615: DETECTING AND VALIDATING IMPROPER RESIDENCY STATUS THROUGH DATA MINING, NATURAL LANGUAGE PROCESSING, AND MACHINE LEARNING (granted Apr 15, 2025; 2y 5m to grant)
Patent 12118558: ESTIMATING QUANTILE VALUES FOR REDUCED MEMORY AND/OR STORAGE UTILIZATION AND FASTER PROCESSING TIME IN FRAUD DETECTION SYSTEMS (granted Oct 15, 2024; 2y 5m to grant)
Patent 12056745: Machine-Learning Driven Data Analysis and Reminders (granted Aug 06, 2024; 2y 5m to grant)
Patent 11990213: METHODS AND SYSTEMS FOR VISUALIZING PATIENT POPULATION DATA (granted May 21, 2024; 2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 12%
Grant Probability with Interview: 26% (+13.4%)
Median Time to Grant: 7y 0m
PTA Risk: High
Based on 362 resolved cases by this examiner. Grant probability derived from career allow rate.
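The projection figures follow directly from the examiner's career counts (44 grants out of 362 resolved cases, plus a +13.4 percentage-point lift observed in interviewed cases). A minimal sketch of that arithmetic, assuming the tool simply adds the interview lift to the unrounded base allow rate:

```python
def grant_probability(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def with_interview(base_pct: float, lift_pct_points: float) -> float:
    """Assumed model: interview lift is added in percentage points."""
    return base_pct + lift_pct_points

base = grant_probability(44, 362)      # about 12.2, displayed as "12%"
boosted = with_interview(base, 13.4)   # about 25.6, displayed as "26%"
```

The rounding matters: adding 13.4 points to the displayed 12% would give 25%, so the displayed 26% implies the lift is applied to the unrounded rate. The function names are illustrative, not from the tool itself.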
