Prosecution Insights
Last updated: April 19, 2026
Application No. 18/083,411

SYSTEM AND METHOD FOR COLLECTING ITEM LOCATION INFORMATION BASED ON CROWD-SOURCED DATA

Non-Final OA (§101, §103)
Filed
Dec 16, 2022
Examiner
SULLIVAN, THOMAS J
Art Unit
3689
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
NCR Voyix Corporation
OA Round
5 (Non-Final)
28%
Grant Probability
At Risk
5-6
OA Rounds
3y 8m
To Grant
52%
With Interview

Examiner Intelligence

Grants only 28% of cases
28%
Career Allow Rate
36 granted / 127 resolved
-23.7% vs TC avg
Strong +24% interview lift
Interview Lift: +23.9% (resolved cases with interview; chart compares outcomes with vs. without an interview)
Typical timeline
3y 8m
Avg Prosecution
41 currently pending
Career history
168
Total Applications
across all art units

Statute-Specific Performance

§101: 34.4% (-5.6% vs TC avg)
§103: 38.1% (-1.9% vs TC avg)
§102: 11.4% (-28.6% vs TC avg)
§112: 12.8% (-27.2% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 127 resolved cases
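The deltas above appear to be measured against a single Tech Center baseline. If each delta is read as the examiner's post-rejection allowance rate minus the Tech Center average (an assumption about how the chart is built), the implied baseline works out to 40% for every statute. A minimal Python sketch of that arithmetic:

# Assumed reading of the chart: delta = examiner rate - Tech Center average.
examiner_rate = {"§101": 34.4, "§103": 38.1, "§102": 11.4, "§112": 12.8}
delta_vs_tc = {"§101": -5.6, "§103": -1.9, "§102": -28.6, "§112": -27.2}

for statute in examiner_rate:
    implied_tc_avg = examiner_rate[statute] - delta_vs_tc[statute]  # e.g. 34.4 - (-5.6) = 40.0
    print(f"{statute}: {examiner_rate[statute]:.1f}% (implied TC avg {implied_tc_avg:.1f}%)")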

Office Action

§101 §103
Detailed Action

Status of Claims

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This Action is in reply to the Amendment filed on 1/30/2026. Claims 1, 5-6, 8, 10-11, 15-16, 18, 20-22 are currently pending and have been examined. Claims 2-4, 7, 9, 12-14, 17, and 19 stand cancelled. Claims 1, 8, 11, 18 have been amended.

Request for Continued Examination

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 2/11/2026 has been entered.

Claim Rejection - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1, 5-6, 8, 10-11, 15-16, 18, and 20-22 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. First, it is determined whether the claims are directed to a statutory category of invention. In the instant case, claims 1, 5-6, 8, 10, and 21 are directed to a process, and claims 11, 15-16, 18, 20, and 22 are directed to a machine. Therefore, claims 1, 5-6, 8, 10-11, 15-16, 18, and 20-22 are directed to statutory subject matter under Step 1 as described in MPEP 2106 (Step 1: YES). The claims are then analyzed to determine whether the claims are directed to a judicial exception. In determining whether the claims are directed to a judicial exception, the claims are analyzed to evaluate whether the claims recite a judicial exception (Prong One of Step 2A), as well as analyzed to evaluate whether the claims recite additional elements that integrate the judicial exception into a practical application of the judicial exception (Prong Two of Step 2A). Claims 1 and 11 recite at least the following limitations that are believed to recite an abstract idea: selectively requesting a user to locate an item in a retail store in exchange for a coupon for the item, the value of the coupon proportional to an amount of use of location identifying features by the user, the user being one of a plurality of users; directing the user to an expected location of the item via directional objects to facilitate movement in the retail store to the expected location; receiving confirmation from the user that the item has been located; capturing a current location of the item, wherein the current location of the item is derived by a tracker module that: monitors movement as the user traverses a path through the retail store; processes data to determine a position relative to predefined path segments or nodes; and derives the current location of the item as a function of the determined position when the user confirms the item has been located; and updating an item location record to include the captured current location of the item. The above limitations recite the concept of inventory mapping.
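To make the tracker-module limitation quoted above concrete before the eligibility analysis continues, the following is a purely illustrative Python sketch of that kind of behavior: a sensor-derived device position is snapped to predefined store nodes, and the item is tagged with that position at the moment the user confirms it was found. All names, the store map, and the dead-reckoning shortcut here are hypothetical and are not the applicant's disclosed implementation:

import math

STORE_NODES = {                      # predefined path segments/nodes of a hypothetical store map
    "aisle-3/bay-2": (12.0, 4.5),
    "aisle-3/bay-5": (12.0, 9.0),
    "aisle-4/bay-1": (16.0, 2.0),
}

class TrackerModule:
    def __init__(self, start_xy=(0.0, 0.0)):
        self.position = start_xy     # estimated device position in store coordinates

    def on_sensor_update(self, dx, dy):
        """Fold a displacement estimate (e.g. from accelerometer/gyroscope dead
        reckoning or camera-based odometry) into the running device position."""
        self.position = (self.position[0] + dx, self.position[1] + dy)

    def nearest_node(self):
        """Snap the current position estimate to the closest predefined node."""
        return min(STORE_NODES, key=lambda n: math.dist(self.position, STORE_NODES[n]))

    def on_item_confirmed(self, item_id, location_db):
        """When the user confirms the item was found, derive the item location
        from the device position at that moment and update the record."""
        node = self.nearest_node()
        location_db[item_id] = node
        return node

# Example: movement is monitored along the path, then the item is confirmed.
db = {}
tracker = TrackerModule()
tracker.on_sensor_update(11.5, 4.0)
print(tracker.on_item_confirmed("sku-1042", db))   # -> "aisle-3/bay-2"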
The limitations identified above, under their broadest reasonable interpretation, fall within the “Certain Methods of Organizing Human Activity” grouping of abstract ideas, enumerated in MPEP 2106, in that they recite commercial interactions, e.g. sales activities/behaviors, and managing personal behavior or relationships or interactions between people, e.g., following rules or instructions. Accordingly, under Prong One of Step 2A, claims 1, 5-6, 8, 10-11, 15-16, 18, and 20-22 recite an abstract idea (Step 2A, Prong One: YES).

Prong Two of Step 2A is the next step in the eligibility analysis and looks at whether the abstract idea is integrated into a practical application. This requires an additional element or combination of additional elements in the claims to apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the exception. In this instance, the claims recite the additional elements of:
- A mobile shopping application on a mobile device
- An augmented reality visual interface provided by the mobile shopping application in which data is overlaid onto a video signal of the mobile device
- A module operating in conjunction with the application
- Sensor data from at least one of a camera, accelerometer, gyroscope, or depth sensor of the mobile device
- A database
- A system, comprising: a server having a processor and a non-transitory computer-readable storage medium, the server coupled to a mobile shopping application on a mobile device of a user, the non-transitory computer-readable storage medium having executable instructions, which when executed, cause the processor to perform operations

However, these elements do not amount to an improvement in the functioning of a computer or any other technology or technical field; apply the judicial exception with, or by use of, a particular machine; or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort to monopolize the exception. In addition, the recitations are recited at a high level of generality and also do not amount to an improvement in the functioning of a computer or any other technology or technical field; apply the judicial exception with, or by use of, a particular machine; or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort to monopolize the exception. The dependent claims also fail to recite elements which amount to an improvement in the functioning of a computer or any other technology or technical field; apply the judicial exception with, or by use of, a particular machine; or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort to monopolize the exception. For example, claims 8, 10, 18, and 20-22 are directed to the abstract idea itself and do not amount to an integration according to any one of the considerations above. As for claims 5-6, 15-16, these claims are similar to the independent claims except that they recite the further additional elements of capturing a photograph, capturing a barcode.
These additional elements are recited at a high level of generality and also do not amount to an improvement in the functioning of a computer or any other technology or technical field; apply the judicial exception with, or by use of, a particular machine; or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort to monopolize the exception. Therefore, the dependent claims do not create an integration for the same reasons.

Step 2B is the next step in the eligibility analysis and evaluates whether the claims recite additional elements that amount to an inventive concept (i.e., “significantly more”) than the recited judicial exception. According to Office procedure, revised Step 2A overlaps with Step 2B, and thus, many of the considerations need not be re-evaluated in Step 2B because the answer will be the same. In Step 2A, several additional elements were identified as additional limitations:
- A mobile shopping application on a mobile device
- An augmented reality visual interface provided by the mobile shopping application in which data is overlaid onto a video signal of the mobile device
- A module operating in conjunction with the application
- Sensor data from at least one of a camera, accelerometer, gyroscope, or depth sensor of the mobile device
- A database
- A system, comprising: a server having a processor and a non-transitory computer-readable storage medium, the server coupled to a mobile shopping application on a mobile device of a user, the non-transitory computer-readable storage medium having executable instructions, which when executed, cause the processor to perform operations

These additional limitations, including the limitations in the dependent claims, do not amount to an inventive concept because they were already analyzed under Step 2A and did not amount to a practical application of the abstract idea. Therefore, the claims lack one or more limitations which amount to an inventive concept in the claims. For these reasons, the claims are rejected under 35 U.S.C. 101.

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejection – 35 USC § 103

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.

Claims 1, 5-6, 10-11, 15-16, and 20-22 are rejected under 35 U.S.C. 103 as being unpatentable over Taylor et al (US 20160350709 A1), hereinafter Taylor, in view of Chachek et al (US 20200302510 A1), hereinafter Chachek.

Regarding Claim 1, Taylor discloses a method, comprising: selectively requesting a user of a mobile shopping application on a mobile device to locate an item in a retail store via the mobile shopping application in exchange for a coupon for the item, the user being one of a plurality of users (Taylor: “Communication module 114 transmits informational data capture request 124 to mobile computing device 106” [0026] – “Informational data capture request 124 is formulated to request an additional data capture from customer 104 that will provide useful inventory information to inventory management system 110.” [0025] – “receiving data capture 122 that is the scan of … product 180” [0031] – “When customer information request module 116 receives customer response 126 … that includes informational data capture 128 from customer 104, … transmits a customer reward …to mobile computing device …Reward 132 might be a coupon for product 180” [0035-0036] – “mobile computing devices of customers.” [0010]); receiving confirmation from the user that the item has been located via the mobile shopping application (Taylor: “Customer response 126 will include informational data capture 128 when customer 104 responds to informational data capture request 124 by capturing the requested data. In the example embodiment shown in FIG. 2 and FIG. 3, customer … 104 received informational data capture request 124 with mobile computing device 106, looked for, and found, the second store product that a barcode scan was requested for in informational data capture request 124, and scanned the second store product barcode with mobile computing device 106.” [0027]); capturing a current location of the item in conjunction with the mobile shopping application (Taylor: “Customer 104 transmits a customer response 226 that includes photo 140 of the location of product 180.
Inventory module 118 is able to tell that product 180 is on feature by photo 140.” [0048] – “identify a location in photo 140 by recognizing features other than location barcodes, such as aisle signs or other identifying features in photo 140.” [0045]), wherein the current location of the item is derived by a module operating in conjunction with the mobile shopping application that: processes sensor data [photo/image] from at least one of a camera, accelerometer, gyroscope, or depth sensor of the mobile device to determine a position of the mobile device relative to predefined path segments or nodes [location, e.g. shelf, aisle, section] (Taylor: “Customer 104 can use a camera or mobile computing device 106 to capture an image of a barcode, a QR code, a product, or a location in the store, for example. Each time mobile computing device 106 captures data from a store product, location, or other apparatus, mobile computing device 106 sends this data to system 110 of server 102 as data capture 122. ” [0020] – “Customer 104 transmits a customer response 226 that includes photo 140 of the location of product 180. Inventory module 118 is able to tell that product 180 is on feature by photo 140” [0048] – “photo analysis module 136 identifies one of an aisle, a section, or a category in photo 140 … If a location identifier barcode is readable in photo 140, the information in the location identifier barcode can be included in photo inventory data 148. In some embodiments, photo analysis module 136 can identify a location in photo 140 by recognizing features other than location barcodes, such as aisle signs or other identifying features in photo 1” [0045]); and derives the current location of the item as a function of the determined position of the mobile device when the user confirms the item has been located (Taylor: “inventory module 118 checks whether a product shelf location matches a product location indicator in response to receiving photo 140. In some embodiments, inventory module 118 updates a product location indicator in response to determining from photo 140 that a product location indicator does not match the product shelf location for that product in inventory. ” [0049] – “inventory module 118 checks whether a product shelf location matches a product location indicator in response to receiving informational data capture 128. In some embodiments, inventory module 118 updates a product location indicator in response to determining from informational data capture 128 that a product location indicator does not match the product shelf location for that product in inventory. ” [0032]); and updating an item location database to include the captured current location of the item (Taylor: “Updating inventory database 120 …depending on the information received in photo inventory information 148 of customer response 226. … inventory module 118 updates a product location indicator in response to determining from photo 140 that a product location indicator does not match the product shelf location for that product in inventory.” [0048-0049]). 
While Taylor discloses the user receives a coupon for each location identifying task performed [0056], requesting a user to scan a location identifier for a product [0033], tracking users as they move about the store [0012], and capturing a current location of the item using photo data from the mobile shopping application [0048], it does not specifically teach that the value of the coupon is proportional to an amount of use of location identifying features of the mobile shopping application by the user; directing the user to an expected location of the item via an augmented reality visual interface provided by the mobile shopping application in which directional objects are overlaid onto a video signal of the mobile device in order to facilitate movement in the retail store to the expected location; that the current location of the item is derived by a tracker module operating in conjunction with the mobile shopping application that: monitors movement of the mobile device as the user traverses a path through the retail store. However, Chachek teaches a crowd-sourced venue mapping process for a retail store and products sold within (Chachek: Abstract), including that the value of the coupon is proportional to an amount of use of location identifying features of the mobile shopping application by the user (Chachek: “a user that enabled the uploading and/or the sharing of images and/or video frames from his AR-based application to the system's server (and thus has helped to improve or to update the system/s database and/or map …) may be rewarded with 1 credit point per image or per each 10-seconds of video sharing … the collected credit points may be exchanged …for coupons” [0138]); directing the user to an expected location [find a particular product] of the item via an augmented reality visual interface [AR-based navigation instructions] provided by the mobile shopping application in which directional objects [arrows] are overlaid onto a video signal of the mobile device in order to facilitate movement in the retail store to the expected location (Chachek: “AR-based navigation instructions that are generated, displayed and/or conveyed to the user, may include AR -based arrows or indicators that are shown as an overlay on top of an aisle or shelf of products, which guide the user to walk or move or turn to a particular direction in order to find a particular product; such as, the user stands in front of the Soda shelf…he requested to be navigated to Sprite …system generates AR - based content, such as an AR - based arrow that points to the left and has a textual label of “walk 3 meters to your left to see Sprite”, and that content is shown on the visualization of the Pepsi shelf on the user device.
… the system proceeds to perform rapid real-time recognition of the products that are currently imaged in the field - of - view of the imager of the end-user device, … then proceeds to generate and displayed AR-based … emphasis of such sub - set of products [0063] – “collect the input … video of the current surrounding of the user), and to extract form such input (I) an indication of the current location of the user, and/or (II) an indication of the destination that the user wants to reach within the store (or the mapped environment).” [0065] – See also [0139]); and that the current location of the item is derived by a tracker module operating in conjunction with the mobile shopping application that monitors movement of the mobile device as the user traverses a path through the retail store (Chachek: “As the user walks within the store, the system continuous [sic] to monitor his current real-time location” [0064] – “tracking the movements of each user within the venue” [0069]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine these references because the results would be predictable. Specifically, Taylor would continue to teach requesting a user to scan a location identifier for a product in exchange for a coupon and capturing a current location of the item in conjunction with the mobile shopping application, except that now it would also teach that the value of the coupon is proportional to an amount of use of location identifying features of the mobile shopping application by the user, and directing the user to an expected location of the item via an augmented reality visual interface provided by the mobile shopping application in which directional objects are overlaid onto a video signal of the mobile device in order to facilitate movement in the retail store to the expected location; and that the current location of the item is derived by a tracker module operating in conjunction with the mobile shopping application that: monitors movement of the mobile device as the user traverses a path through the retail store, according to the teachings of Chachek. This is a predictable result of the combination. In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine these references because it would result in an improved accuracy in identifying a product and its in-store location (Chachek: [0011]).

Regarding Claim 5, Taylor/Chachek teach the method of claim 1, wherein the user confirms a location of the item by capturing a photograph of the item via the mobile shopping application (Taylor: “customer response 226 … includes a photo 140 of the inside of the retail store, which is the photo capture requested by informational data capture request 224. Photo 140 of the inside of the retail store can contain useful photo inventory information 148.” [0041] – “recognize photo 140, analyze photo 140, and extract photo inventory information 148 from photo 140. Photo inventory information 148 can be any type of information that is useful to inventory module 118 and can be identified from photo 140, such as products that are recognizable in photo 140, where products are located” [0045]).
Regarding Claim 6, Taylor/Chachek teach the method of claim 1, wherein the user confirms a location of the item by capturing a barcode on the item via the mobile shopping application (Taylor: “customer … 104 received informational data capture request 124 with mobile computing device 106, looked for, and found, the second store product that a barcode scan was requested for in informational data capture request 124, and scanned the second store product barcode with mobile computing device 106.” [0027]).

Regarding Claim 10, Taylor/Chachek teach the method of claim 1, further comprising receiving a notification from the user of a location of a potentially hazardous condition within the retail store (Chachek: “the store map has recently been updated … to indicate that there is a liquid spill on the floor there … based on user-submitted reports (e.g., a shopper and/or an employee have reported a hazard in Aisle 3).” [0074]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Taylor with Chachek for the reasons identified above with respect to claim 1.

Regarding Claims 11, 15-16, and 20, the limitations of claims 11, 15-16, and 20 are closely parallel to the limitations of claims 1, 5-6, and 10, with the additional limitations of a system, comprising: a server having a processor and a non-transitory computer-readable storage medium, the server coupled to a mobile shopping application on a mobile device of a user, the non-transitory computer-readable storage medium having executable instructions, which when executed, cause the processor to perform operations (Taylor: [0015-0018]), and are rejected on the same basis.

Regarding Claim 21, Taylor/Chachek teach the method of claim 1, wherein the item is an upsell item (Taylor: “Informational data capture request 124, in an example embodiment, is a request for a barcode scan of a second store product that inventory management system 110 thinks should be nearby product 180.” [0025] – “Providing customer 104 with a reward for providing informational data capture 128 incentivizes customer 104 to participate in gathering inventory data.” [0035] – “Reward 132 can be a coupon” [0036]).

Regarding Claim 22, the limitations of claim 22 are closely parallel to the limitations of claim 21, and are rejected on the same basis.

Claims 8 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Taylor, in view of Chachek, and further in view of Richardson (US 20140129378 A1), hereinafter Richardson.

Regarding Claim 8, Taylor/Chachek teach the method of claim 1, but do not specifically teach that the item location database is updated based on assigning greater weight to more recent location identifications by users. However, Richardson teaches crowdsourcing of intra-store product locations (Richardson: Abstract), including that the item location database is updated based on assigning greater weight to more recent location identifications by users (Richardson: “if a retail chain or other party operating store location 102A relocates a product, Such as first item 306, to a different aisle and/or shelf, the shopping assistant system may collect and update the intra-store location information for first item 306 (e.g., via … user input entered at mobile devices 116). In examples, the shopping assistant system may utilize date- and/or time-stamps in determining which information is current, thus providing up to-date product-related information to users of mobile devices 116 in real time.
For instance, the shopping assistant system may execute one or more algorithms that use date and/or time-stamps as parameters. The executed algorithm(s) may employ one or both of voting-based and authority-based techniques, which in turn may … take into account recent tendencies in location reporting, cross-check incoming data for statistical consistency, etc.” [0053]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine these references because the results would be predictable. Specifically, Taylor/Chachek would continue to teach updating an item location database to include the captured current location of the item, except that now it would also teach that the item location database is updated based on assigning greater weight to more recent location identifications by users, according to the teachings of Richardson. This is a predictable result of the combination. In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine these references because it would result in an improved ability to present consumers with accurate, up-to-date product location information (Richardson: [0012]).

Regarding Claim 18, the limitations of claim 18 are closely parallel to the limitations of claim 8, and are rejected on the same basis.

Response to Arguments

Applicant's arguments filed 1/30/2026 have been fully considered but they are not persuasive.

Claim Rejection – 35 USC §101

Applicant argues that “claims 1 and 11 now expressly require directing a user via an augmented reality visual interface in which directional objects are overlaid onto a live video signal of a mobile device, monitoring physical movement of the mobile device as the user traverses a path through a retail store, processing sensor data from one or more device sensors to determine a position of the mobile device relative to predefined path segments or nodes, and deriving a current location of an item as a function of the determined device position at the time the user confirms the item has been located. These limitations define a concrete technical process for computing spatial information in a physical environment using sensor data, not a mental process, business method, or abstract organizational concept.” Examiner disagrees. With reference to the rejection above, the claims recite a process for requesting a user to navigate a store following directions to determine the presence of products within the store as part of a collaborative inventory mapping procedure. These limitations fall within the “Certain Methods of Organizing Human Activity” grouping of abstract ideas, enumerated in MPEP 2106, in that they recite commercial interactions, e.g. sales activities/behaviors, and managing personal behavior or relationships or interactions between people, e.g., following rules or instructions. Limitations related to providing the user with navigation instructions and determining, based on user-provided data, the user’s location and the item’s location within the store upon confirmation of its being located, are part of the abstract idea, except for the recitation of high-level computer-related additional elements as addressed in subsequent steps of the 101 analysis.
Applicant further argues that “the claims recite a specific application of sensor processing and augmented reality technology to solve a real-world technical problem, namely the difficulty of accurately maintaining item location data in large, dynamic retail environments. The claimed invention further improves existing mobile shopping and navigation systems by enabling those systems to derive and refine item location data automatically based on tracked user movement, rather than relying on static maps or manual audits. This constitutes an improvement to the functioning of augmented reality navigation systems themselves, not merely an improvement to an abstract business practice.” Examiner disagrees. The alleged improvement in being able to “accurately maintaining item location data in large, dynamic retail environments” is at best a business improvement stemming solely from the abstract idea itself; the additional elements offer only the improved speed or efficiency inherent to a generic computer, which does not integrate the abstract idea into a practical application [MPEP 2106.05(a)]. These elements are recited at a high level of generality, and are invoked as mere instructions to apply the abstract idea to a technological environment, creating only a general linking between the abstract idea and computer technology [MPEP 2106.05(f)].

Applicant further argues that “the claims recite significantly more than any purported abstract idea. The Examiner has not established, and does not cite evidence demonstrating, that the claimed combination of augmented reality navigation, sensor-based tracking of device movement relative to predefined path segments, and derivation of item location from tracked movement at the time of confirmation was well-understood, routine, or conventional.” Examiner disagrees. As addressed above, the additional elements do not amount to significantly more than the abstract idea, and are invoked as mere instructions to apply the abstract idea to a technological environment, creating only a general linking between the abstract idea and computer technology [MPEP 2106.05(f)]. At best, the additional elements provide only the improved speed or efficiency inherent to a generic computer, which does not integrate the abstract idea into a practical application [MPEP 2106.05(a)].

Claim Rejection – 35 USC §103

Applicant argues that neither Taylor nor Chachek teaches or suggests “deriving a current location of an item by a tracker module that monitors movement of the mobile device as the user traverses a path through the retail store, processes sensor data from at least one of a camera, accelerometer, gyroscope, or depth sensor to determine a position of the mobile device relative to predefined path segments or nodes, and derives the current location of the item as a function of the determined device position when the user confirms the item has been located.” Applicant further argues that modifying Taylor to include those claim limitations not taught by the reference “would require a fundamental redesign of Taylor’s architecture and operating principles, not a routine substitution or combination.” Examiner disagrees. With reference to the rejection above, Taylor teaches a method for in-store shopping, including providing a data capture request to a user’s device [0026] that requests the user to scan a product to register its location in-store [0031], in exchange for a reward such as a coupon [0035-0036].
The user looks for and finds the requested product and scans a barcode or otherwise captures its position [0027]. The system captures the current location of the item; the barcode can be a location barcode tied to a particular position within the store, or the location can be determined by recognizing other markings such as aisle signs in the user’s photo [0045]. The user uses a camera of their device to capture the barcode or an image to register the location in the store [0020], and sends the photo to the system, which recognizes the location of the product as a particular position or node within the store map, such as an aisle or section [0045]. The system determines if the current, detected position is different than the registered position in the database, and takes action based on the determination [0032], and updates the position of the product within the database [0049]. While Taylor discloses the user receives a coupon for each location identifying task performed [0056], requesting a user to scan a location identifier for a product [0033], tracking users as they move about the store [0012], and capturing a current location of the item using photo data from the mobile shopping application [0048], it does not specifically teach that the value of the coupon is proportional to an amount of use of location identifying features of the mobile shopping application by the user; directing the user to an expected location of the item via an augmented reality visual interface provided by the mobile shopping application in which directional objects are overlaid onto a video signal of the mobile device in order to facilitate movement in the retail store to the expected location; that the current location of the item is derived by a tracker module operating in conjunction with the mobile shopping application that: monitors movement of the mobile device as the user traverses a path through the retail store. However, Chachek teaches a crowd-sourced venue mapping process for a retail store and products sold within (Chachek: Abstract), including that the user may be given rewards per image uploaded, proportional to the total amount of images provided, including points redeemable for a coupon [0138]. Chachek further teaches that the in-store shopping system can provide navigation support, by providing AR-based navigation instructions to navigate the user with AR elements along a path to a position to find a particular product [0063], and captures images through the camera of the user device to extract the user’s current location [0065]. The system continually monitors the user’s position as they navigate the store [0064], tracking their movements [0069]. In other words, Taylor teaches capturing a current location of the item, by processing camera/sensor data from the user’s device to determine a position of the device, and the product depicted, within the store, by recognizing visual cues or a location barcode, corresponding to a specific node or position, such as an aisle or coordinates, within the store map. Taylor further teaches requesting the user to go to the location of the item to capture this information. Chachek teaches that the user can be provided with directions to reach the location, and that their movements can be tracked as they navigate.
Rather than requiring “a fundamental redesign of Taylor’s architecture,” providing the system of Taylor, which instructs a user to go to a location and captures image data at that location to determine the positioning of items within a store, with the ability to navigate a user within the store to a location and to monitor their movements, would have been obvious for the reasons addressed in the rejection above. Both references relate to the navigating of users within a store and capturing of images to determine item locations within the store; Chachek merely teaches that this functionality can further include AR navigation to assist with navigating to the target location.

Applicant argues that “the office action does not explain why a person of ordinary skill in the art would have been motivated to modify” Taylor to include the teachings of Chachek, “nor does it explain how such a modification would be implemented with a reasonable expectation of success.” Applicant further argues that the claims “require that the item location be derived as a function of the tracked device position at the moment the user confirms that the item has been located, after the user has been directed to an expected location via an augmented reality visual interface,” and that neither reference “suggests this sequence of operations or the functional interdependence between AR-guided navigation, sensor-based path tracking, user confirmation, and item location derivation.” Examiner disagrees, and notes that the claims do not recite “sensor-based path tracking.” As addressed above, Taylor teaches the ability to request that a user find the location of an item in-store, and uses a photograph to confirm its position in that location, which allows the system to confirm the updated/current location of the product within the store, as it corresponds to a particular point or position/node on the store map. Chachek teaches that the user can be navigated to a particular location using AR, and that their movements can be tracked.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine these references because the results would be predictable. Specifically, Taylor would continue to teach requesting a user to scan a location identifier for a product in exchange for a coupon and capturing a current location of the item in conjunction with the mobile shopping application, except that now it would also teach that the value of the coupon is proportional to an amount of use of location identifying features of the mobile shopping application by the user, and directing the user to an expected location of the item via an augmented reality visual interface provided by the mobile shopping application in which directional objects are overlaid onto a video signal of the mobile device in order to facilitate movement in the retail store to the expected location; and that the current location of the item is derived by a tracker module operating in conjunction with the mobile shopping application that: monitors movement of the mobile device as the user traverses a path through the retail store, according to the teachings of Chachek. This is a predictable result of the combination. In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine these references because it would result in an improved accuracy in identifying a product and its in-store location (Chachek: [0011]).
Applicant further argues that Richardson “does not, however, disclose the aspects of claims 1 and 11 recited above missing from Taylor and Chachek,” and that “claims 8 and 18 are not unpatentable as obvious over such references.” Examiner disagrees for the reasons addressed in the rejection and response to arguments above.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure:
- Publicover et al (US 20160253710 A1) teaches the ability to crowdsource a map of a store’s inventory, updating items to new locations as new logs are made, and weighting recent logging of item locations in updating the predicted location of items.
- Holman et al (US 20180144356 A1) teaches systems for crowdsourcing real-world data, including mapping the location of items on shelves within a retailer and incentivizing the collection of data by observers with coupons.
- Jones et al (US 20160342939 A1) teaches systems for determining the locations/placement of items within a store using reports generated by shoppers, including gamification to present incentive in exchange for shoppers’ collection of data.
- Adato et al (US 20210398198 A1) teaches crowdsourced, incentive-based in-store product location collection, including determining shelf location of items by user reports made by mobile device.
- Reference U (NPL – see attached) discusses geo-locating specific items using crowdsourcing techniques to create a searchable map of products.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to THOMAS J SULLIVAN whose telephone number is (571)272-9736. The examiner can normally be reached Mon - Fri 8-5 PT. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Marissa Thein can be reached on (571) 272-6764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/T.J.S./
Examiner, Art Unit 3689

/MARISSA THEIN/
Supervisory Patent Examiner, Art Unit 3689
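As background for the recency-weighting limitation of claims 8 and 18 discussed in the office action above, one common way to give greater weight to more recent location identifications is an exponentially decayed vote per report. The sketch below is an assumed illustration of that general technique, not Richardson's disclosed algorithm; the half-life value and report format are arbitrary:

from collections import defaultdict

def most_likely_location(reports, now, half_life_days=14.0):
    """Weighted vote over crowd-sourced reports of (location, timestamp_days);
    newer reports count more via exponential decay with the given half-life."""
    scores = defaultdict(float)
    for location, t in reports:
        age = now - t
        scores[location] += 0.5 ** (age / half_life_days)
    return max(scores, key=scores.get)

reports = [
    ("aisle-7", 0.0),    # older sightings before the product was moved
    ("aisle-7", 3.0),
    ("aisle-2", 28.0),   # recent sightings at the new shelf location
    ("aisle-2", 30.0),
]
print(most_likely_location(reports, now=31.0))   # -> "aisle-2"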

Prosecution Timeline

Dec 16, 2022
Application Filed
Oct 29, 2024
Non-Final Rejection — §101, §103
Feb 05, 2025
Response Filed
Feb 20, 2025
Final Rejection — §101, §103
May 15, 2025
Response after Non-Final Action
Jun 25, 2025
Request for Continued Examination
Jun 27, 2025
Response after Non-Final Action
Jul 08, 2025
Non-Final Rejection — §101, §103
Oct 15, 2025
Response Filed
Oct 27, 2025
Final Rejection — §101, §103
Jan 30, 2026
Response after Non-Final Action
Feb 11, 2026
Request for Continued Examination
Mar 02, 2026
Response after Non-Final Action
Mar 16, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12475505
SYSTEM AND METHOD FOR INTRODUCTION OF A TRANSACTION MECHANISM TO AN E-COMMERCE WEBSITE WITHOUT NECESSITATION OF MULTIPARTY SYSTEMS INTEGRATION
2y 5m to grant Granted Nov 18, 2025
Patent 12475444
SYSTEM AND METHOD FOR INTRODUCTION OF A TRANSACTION MECHANISM TO AN E-COMMERCE WEBSITE WITHOUT NECESSITATION OF MULTIPARTY SYSTEMS INTEGRATION
2y 5m to grant Granted Nov 18, 2025
Patent 12380303
IMAGE PROCESSING APPARATUS, IMAGE PROCESSING SYSTEM, AND METHOD TO PREVENT DUPLICATE ORDER FOR SUPPLIES
2y 5m to grant Granted Aug 05, 2025
Patent 12321977
METHOD AND APPARATUS FOR ONE-TAP MOBILE CHECK-IN
2y 5m to grant Granted Jun 03, 2025
Patent 12260441
VISUAL CABLE BUILDER
2y 5m to grant Granted Mar 25, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
28%
Grant Probability
52%
With Interview (+23.9%)
3y 8m
Median Time to Grant
High
PTA Risk
Based on 127 resolved cases by this examiner. Grant probability derived from career allow rate.
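The projected figures appear to follow from simple arithmetic on the examiner's career numbers. A minimal sketch, assuming the grant probability is the raw career allow rate and the interview lift is additive in percentage points:

granted, resolved = 36, 127          # from the Examiner Intelligence card
interview_lift = 23.9                # percentage points, from the same card

career_allow_rate = 100 * granted / resolved          # ~28.3%, shown as 28%
with_interview = career_allow_rate + interview_lift   # ~52.2%, shown as 52%

print(f"Grant probability: {career_allow_rate:.0f}%")   # -> 28%
print(f"With interview: {with_interview:.0f}%")          # -> 52%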
