Prosecution Insights
Last updated: April 18, 2026
Application No. 18/402,158

MOBILE DEVICE WITH IN-PERSON ASSISTANCE

Non-Final OA (§101, §103)
Filed
Jan 02, 2024
Examiner
GILKEY, CARRIE STRODER
Art Unit
3626
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Capital One Services LLC
OA Round
3 (Non-Final)
Grant Probability: 16% (At Risk)
OA Rounds: 3-4
To Grant: 5y 8m
With Interview: 50%

Examiner Intelligence

Grants only 16% of cases.
Career Allow Rate: 16% (79 granted / 489 resolved; -35.8% vs TC avg)
Interview Lift: +33.6% (strong; based on resolved cases with interview)
Avg Prosecution: 5y 8m (typical timeline)
Currently Pending: 37
Total Applications: 526 (career history, across all art units)
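As a quick consistency check, the headline figures above follow from the underlying counts. This is a minimal sketch using only the numbers shown on the dashboard; the variable names are ours, and the with-interview rate is read from the panel rather than recomputed:

```python
# Figures reported on the dashboard above.
granted, resolved = 79, 489
with_interview = 50.0   # allowance rate when an interview was held (%)
lift = 33.6             # reported interview lift (percentage points)

# The career allow rate follows from the resolved-case counts:
# 79 / 489 ~ 16.2%, displayed as the 16% headline.
allow_rate = 100 * granted / resolved
assert round(allow_rate) == 16

# The lift figure implies a without-interview baseline of about 16.4%
# (50.0 - 33.6), slightly above the overall career rate.
without_interview = round(with_interview - lift, 1)
```

Note that the lift baseline (~16.4%) differs slightly from the 16% career rate, since the lift is measured against the without-interview subset of resolved cases.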

Statute-Specific Performance

§101: 29.0% (-11.0% vs TC avg)
§103: 34.9% (-5.1% vs TC avg)
§102: 12.4% (-27.6% vs TC avg)
§112: 21.9% (-18.1% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 489 resolved cases
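A small sanity check on these pairs (a sketch only, using just the four rates and deltas shown): subtracting each delta from its statute rate recovers the baseline it was measured against, and all four imply the same ~40.0% figure, consistent with the single black-line Tech Center average estimate described above.

```python
# Statute-specific allowance rates (%) and deltas vs. the Tech Center
# average (percentage points), as reported above.
stats = {
    "101": (29.0, -11.0),
    "103": (34.9, -5.1),
    "102": (12.4, -27.6),
    "112": (21.9, -18.1),
}

# Back-solving rate - delta recovers the baseline behind each comparison;
# every statute implies the same ~40.0% TC average estimate.
implied = {k: round(rate - delta, 1) for k, (rate, delta) in stats.items()}
```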

Office Action

§101 §103
DETAILED ACTION

This is in response to the applicant’s communication filed on 3/9/26 wherein: Claims 1-20 are currently pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1: Claim 1 recites a method and therefore falls into a statutory category. Similar independent claims 12 and 20 recite a system and a computer readable device, and therefore also fall into a statutory category.

Step 2A – Prong 1 (Is a Judicial Exception Recited?): The underlined limitations of receiving, within a cloud-based platform and from a mobile device, an identification and a geographic location of the mobile device and image data, wherein the image data includes an image of a physical object and object information associated with the physical object, the object information comprising any of: a depth of field, a position of the physical object relative to another physical object, or three-dimensional contour information; determining, within the cloud-based platform, the location of the mobile device is within a virtual geo-fence associated with a digital record; identifying, within the cloud-based platform, the physical object is selected by a user of the mobile device, wherein the physical object corresponds to an inventory item and one or more augmented objects stored in the digital record, and wherein the identifying is performed by: computing a relative proximity of the mobile device to the selected physical object, based on the
image data; determining the relative proximity is within a threshold range; and verifying the selected physical object corresponds to the inventory item by performing an image comparison between the image of the physical object and a previous image previously stored in the digital record, wherein the image comparison utilizes the object information; retrieving, within the cloud-based platform, the one or more augmented objects, based on the relative proximity and the location of the mobile device; transmitting, from the cloud-based platform to a user interface (UI) of the mobile device, based on the identifying the physical object is selected, the one or more augmented objects comprising at least a description of the inventory item; receiving, within the cloud-based platform and from the mobile device, a request for in-person assistance with the inventory item; generating user information, within the cloud-based platform, based on the identification of the mobile device, the location of the mobile device, and the inventory item of the digital record; and transmitting, from the cloud-based platform to a computer-device corresponding to the geo-fence, the request for in-person assistance along with the user information are processes that, under their broadest reasonable interpretation, are considered certain methods of organizing human activity – commercial or legal interactions (including agreements in the form of contracts and marketing or sales activities or behaviors) and/or managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions). The Specification indicates that the invention is directed to, during a user’s shopping experience, requesting, in person sales assistance from on-premises salespersons. Specification ¶15. This indicates that the invention is directed to sales activities, which falls into the “certain methods of organizing human activity” grouping. 
Accordingly, the claim recites an abstract idea.

Step 2A – Prong 2 (Is the Exception Integrated into a Practical Application?): This judicial exception is not integrated into a practical application. In particular, the claim recites the following additional elements:

Claims 1, 12, and 20: a mobile device, a cloud-based platform, a user interface (UI), a computer-device corresponding to the geo-fence
Claim 12 additionally includes: a memory; and one or more processors
Claim 20 additionally includes: a non-transitory computer-readable device

The computer is recited at a high level of generality (i.e., as a generic processing device performing generic computer functions), such that it amounts to no more than mere instructions to apply the exception using a generic computer component. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea when considered both individually and as a whole. The claim is directed to an abstract idea. Even when viewed in combination, these additional elements do not integrate the recited judicial exception into a practical application, and the claim is directed to the judicial exception.

Step 2B (Does the claim recite additional elements that amount to Significantly More than the Judicial Exception?): The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using a computer to perform the claimed steps identified above amounts to no more than mere instructions to apply the exception using a generic computer component.
Further, the claims simply append well-understood, routine, and conventional (WURC) activities previously known to the industry, specified at a high level of generality, to the judicial exception, in the form of extra-solution activity. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. The claim is not patent eligible, as, when viewed individually and as a whole, nothing in the claim adds significantly more to the abstract idea.

Dependent claims 4, 7, 9, 10, 15, 18, and 19 merely recite further embellishments of the abstract idea of independent claims 1 and 12 as discussed above with respect to integration of the abstract idea into a practical application, and these features only serve to further limit the abstract idea of independent claims 1 and 12; however, none of the dependent claims recites an improvement to a technology or technical field or provides any meaningful limits.

Claims 2, 3, 11, 13, and 14 further define the additional element of an augmented reality (AR) interface. The interface is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic component. Even in combination, this additional element does not integrate the abstract idea into a practical application and does not amount to significantly more than the abstract idea itself. The claims are ineligible.

Claims 5, 6, 16, and 17 further define the additional element of the cloud-based platform as a vehicle dealer platform or a vehicle dealer lead generation system. The platform/system is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic component. Even in combination, this additional element does not integrate the abstract idea into a practical application and does not amount to significantly more than the abstract idea itself. The claims are ineligible.
Claim 8 further defines the additional element of the mobile device as being any of: a smartphone, tablet, or wearable computer. The device is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic component. Even in combination, this additional element does not integrate the abstract idea into a practical application and does not amount to significantly more than the abstract idea itself. The claims are ineligible.

In light of the detailed explanation and evidence provided above, the Examiner asserts that the claimed invention, when the limitations are considered individually and as a whole, is directed towards an abstract idea.

Notice

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C.
103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 4-10, 12, and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over Breed et al. (US 20110225047), in view of Henry (US 20150317713), in view of Laracey et al. (US 20170098210), in view of Pettyjohn et al. (US 20150170256), and further in view of Elliott et al. (US 20200293778).

Referring to claim 1: Breed discloses a computer-based method comprising: receiving, within a [cloud-based] platform and from a mobile device, an identification and a geographic location of the mobile device {Breed [0073][0084]-[0086]; the client device (e.g., mobile phone, PDA, etc.) [0073] and the IIC server 108 extracts identifying information (step 208). The identifying information may comprise information identifying the requesting client device 102 (such as phone number, email address, etc.) [0084] and the dealer may be selected and/or forwarded information indicating the geographic location of the user at the time of the request (such as by using e.g., global positioning system (GPS) coordinates of the requesting device 102) [0086] where Breed does not disclose the portion of the claim in brackets, i.e., that the platform is a cloud-based platform, and this is addressed further below} and image data, wherein the image data includes an image of a physical object {Breed [0053]; use a camera function of the client device 102 to take a picture of the VIN number . . .
the image is sent to the server [0053]}; generating user information, within the [cloud-based] platform, based on the identification of the mobile device, the location of the mobile device, and the inventory item of the digital record {Breed [0073][0086][0094][0095] [0112][0149]; generation of sales lead information [0073] and the dealer may be selected and/or forwarded information indicating the geographic location of the user at the time of the request (such as by using e.g., global positioning system (GPS) coordinates of the requesting device 102) [0086] and Identifying information is then extracted from the user request and/or the compiled information at step 224. As noted above, the extracted information may comprise without limitation contact information for the requesting device 102 (such as a phone number, email address, etc.), information regarding the user (such as demographic/psychographic information), and/or information identifying or relating to the subject vehicle [0094] where Breed does not disclose the portion of the claim in brackets, i.e., that the platform is a cloud-based platform, and this is addressed further below}. Breed discloses a system for generating and utilizing sales leads and related information (abstract). Breed does not disclose wherein the platform is a cloud-based platform; receiving, within the cloud-based platform and from the mobile device, a request for in-person assistance with the inventory item; transmitting, from the cloud-based platform to a computer-device [corresponding to the geo-fence], the request for in-person assistance along with the user information. However, Henry discloses a similar system for vehicle information delivery and management (abstract). 
Henry discloses wherein the platform is a cloud-based platform {Henry [0038]; Service provider system 320 and third party system 330 may comprise any type of local and/or remote data processing system (e.g., locally and/or remotely located (e.g., cloud-based systems) computing platforms, such as server 140/150 and/or clients 110/120) [0038]}; receiving, within the cloud-based platform and from the mobile device, a request for in-person assistance with the inventory item {Henry [0038] [0051][0063]-[0068]; The contact a salesperson 612 icon and request a manager 614 icon, when selected, causes ID application 420 to communicate with system 320 to contact a respective salesperson or manager of the dealership. This contact request may be in the form of a telephone contact, text, email, chat dialog, or may cause a signal/alert/notice to be given to a respective salesperson/manager that an end user (customer) onsite at the dealership would like an in-person visit with the respective salesperson/manager [0051]}; transmitting, from the cloud-based platform to a computer-device [corresponding to the geo-fence], the request for in-person assistance along with the user information {Henry [0051][0068]; Service management module 738 may then notify the selected salesperson/manager (e.g., via a telephone call, text, or other type of alert). When notifying a particular salesperson/manager of the contact request, service management module 738 may also forward various information to the client 720 of the selected salesperson/manager (e.g., customer data 746, vehicle ID data 744 of the vehicle 310 being viewed, location data 742, etc.) [0068] where Henry does not disclose the portion of the claim in brackets, and this is addressed below}. 
It would have been obvious for a person of ordinary skill in the art (PHOSITA) before the effective filing date of the claimed invention to modify the system disclosed in Breed to incorporate a cloud-based platform and receiving and transmitting a request for in-person assistance as taught by Henry because this would provide a manner for allowing a user to contact a salesperson (Henry [0051]), thus aiding the user by providing assistance when needed. Breed, as modified by Henry, discloses a system for generating and utilizing sales leads and related information (Breed abstract). Breed, as modified by Henry, does not disclose determining, within the cloud-based platform, the location of the mobile device is within a virtual geo-fence associated with a digital record; and a computer-device corresponding to the geo-fence. However, Laracey discloses a similar system for geo-fencing an area corresponding to a merchant’s location where a user may visit to purchase an item (abstract). Laracey discloses determining, within the cloud-based platform, the location of the mobile device is within a virtual geo-fence associated with a digital record {Laracey [0015]-[0017][0039]; the payment provider may receive the user's geo-location and use a map having geo-fenced areas for merchants to determine a merchant for the user's geo-location by determining if the user's geo-location matches (e.g., is found within) a geo-fenced area for a merchant. The merchant may further break up a geo-fenced area into smaller geo-fenced areas each corresponding to a particular item offered by the merchant [0015] and access or receive geo-fencing information for merchants, for example, from merchant data. 
For example, a geo-location may be detected for the user and found within a geo-fenced area for a merchant (e.g., on a map of merchants each having geo-fenced areas) [0039]}; and a computer-device corresponding to the geo-fence {Laracey [0015]-[0017][0039]; the user may be associated with the merchant location and one or more processes of the communication device and/or a payment provider server may provide user information to a merchant device for the merchant location or retrieve merchant information for use with the communication device [0015] where the merchant device for the merchant location is a computer device corresponding to the geo-fence}. It would have been obvious for a person of ordinary skill in the art (PHOSITA) before the effective filing date of the claimed invention to modify the system disclosed in Breed and Henry to incorporate determining a mobile device is within a geo-fence and provide a computer device corresponding to the geo-fence as taught by Laracey because this would provide a manner for allowing a user at a location to be associated with the merchant for the geo-fenced area (Laracey [0017]), thus aiding the user by facilitating merchant interaction. Breed, as modified by Henry and Laracey, discloses a system for generating and utilizing sales leads and related information (Breed abstract). 
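For orientation, the geo-fence determination that Laracey is cited against (a device's geo-location found within a geo-fenced area for a merchant) can be sketched as a simple radius test. This is an illustrative sketch only: the circular fence, the coordinates, and the function names are our assumptions, not an implementation taught by the application or the cited references.

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two (lat, lon) points,
    # using a mean Earth radius of 6,371 km.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def within_geofence(device, fence_center, radius_m):
    # True if the device's reported location falls inside a circular
    # geo-fence centered on the merchant location (hypothetical model).
    return haversine_m(*device, *fence_center) <= radius_m
```

With a hypothetical 200 m fence around a dealership at (38.9, -77.0), a device at (38.9005, -77.0) (roughly 55 m north) is inside the fence, while one a tenth of a degree away is not.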
Breed, as modified by Henry and Laracey, does not disclose identifying, within the cloud-based platform, the physical object is selected by a user of the mobile device, wherein the physical object corresponds to an inventory item and one or more augmented objects stored in the digital record; and wherein the identifying is performed by: computing a relative proximity of the mobile device to the selected physical object, based on the image data; and determining the relative proximity is within a threshold range; and transmitting, from the cloud-based platform to a user interface (UI) of the mobile device, based on the identifying the physical object is selected, the one or more augmented objects comprising at least a description of the inventory item. However, Pettyjohn discloses a similar system for using augmented reality in a retail environment (abstract).

Pettyjohn discloses identifying, within the cloud-based platform, the physical object is selected by a user of the mobile device, wherein the physical object corresponds to an inventory item and one or more augmented objects stored in the digital record {Pettyjohn [0028][0115][0118][0120][0123][0124]; When the software application is launched on the device (700), the end-user device (700) gathers/generates image and orientation data (721) about the environment. This data is compared (723) to the area data in the area description, and a matching dataset, or one or more candidate matching datasets, in the area data is identified (725). This may be done, for example, using best fit algorithms, statistical comparison, and other techniques known in the art.
When the match is identified (725), locational coordinates associated with the matching data are also identified (727) and used to determine the location of the user in the retail space [0115] and information may be overlayed over the real-time camera image in the display (701), such as coupons and advertising, based on user proximity to the coupon [0118] Thus, from the user experience perspective, when the user pans the camera over certain products (705), the display (701) presents additional information (706) and (735) about the products (705) [0123] where Pettyjohn does not disclose the portion of the claim in brackets, i.e., that the platform is a cloud-based platform, and this is addressed further below} and wherein the identifying is performed by: computing a relative proximity of the mobile device to the selected physical object, based on the image data; and determining the relative proximity is within a threshold range {Pettyjohn [0028][0067][0068][0079][0115][0118][0120][0123][0124]; due to the short range of a beacon, the mere fact that a consumer device has been detected or can communicate with the beacon at all may be sufficient to identify relevant messaging [0067] where the beacon’s range is interpreted as a threshold}; and transmitting, from the cloud-based platform to a user interface (UI) of the mobile device, based on the identifying the physical object is selected, the one or more augmented objects comprising at least a description of the inventory item {Pettyjohn [0123]; Thus, from the user experience perspective, when the user pans the camera over certain products (705), the display (701) presents additional information (706) and (735) about the products (705). This information (706) and (735) may include messaging, such as marketing messages. Marketing messages include, without limitation: sales, deals, bargains, promotions, offers, discounts, coupons, incentives to purchase, or other such messaging. 
Alternatively, information (706) may be displayed (735) about products (705) based on a characteristic of the product (705). By way of example and not limitation, all gluten-free products may be highlighted, circled, or otherwise indicated in the display. Other characteristics may include product family, manufacturer, on-sale, discounted, age appropriateness, and/or clearance status [0123]}. It would have been obvious for a person of ordinary skill in the art (PHOSITA) before the effective filing date of the claimed invention to modify the system disclosed in Breed, Henry, and Laracey to incorporate identifying the physical object and transmitting the augmented object as taught by Pettyjohn because this would provide a manner for presenting additional information about products of interest to a customer (Pettyjohn [0123]), thus aiding the customer by providing desired information. Breed, as modified by Henry, Laracey, and Pettyjohn, discloses a system for generating and utilizing sales leads and related information (Breed abstract). Breed, as modified by Henry, Laracey, and Pettyjohn, does not disclose object information associated with the physical object, the object information comprising any of: a depth of field, a position of the physical object relative to another physical object, or three-dimensional contour information; and verifying the selected physical object corresponds to the inventory item by performing an image comparison between the image of the physical object and a previous image previously stored in the digital record, wherein the image comparison utilizes the object information. However, Elliott discloses a similar system for an object recognition process (abstract). 
Elliott discloses object information associated with the physical object, the object information comprising any of: a depth of field, a position of the physical object relative to another physical object, or three-dimensional contour information {Elliott [0038]; Such datasets may include, for example, comparison pictures (e.g., pictures of vehicle wheels, vehicle lift points, and other vehicle components), pattern matching algorithms (e.g., a process that may help identify an object within a picture based upon color patterns, edge recognition, or other characteristics), environmental matching algorithms (e.g., a process that may help identify an object based upon an environment or location in which the image is captured) [0038]}; and verifying the selected physical object corresponds to the inventory item by performing an image comparison between the image of the physical object and a previous image previously stored in the digital record, wherein the image comparison utilizes the object information {Elliott [0037][0038]; in some implementations the EAS (102) may receive object recognition (108) datasets from a remote source from time to time, then may use those datasets as inputs to an object recognition (108) process, along with a captured image, to identify objects or other attributes present within that image. Such datasets may include, for example, comparison pictures (e.g., pictures of vehicle wheels, vehicle lift points, and other vehicle components), pattern matching algorithms (e.g., a process that may help identify an object within a picture based upon color patterns, edge recognition, or other characteristics), environmental matching algorithms (e.g., a process that may help identify an object based upon an environment or location in which the image is captured) [0038]}. 
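The identifying and verifying steps that Pettyjohn and Elliott are mapped against can be illustrated with toy logic: a pinhole-camera proximity estimate gated by a threshold range, and an image-comparison verdict that also checks one piece of object information (depth of field here). Everything below is a hypothetical sketch; the function names, tolerances, and formulas are our assumptions and do not come from the application or the cited references.

```python
def estimate_proximity_m(object_width_m, focal_length_px, width_in_image_px):
    # Pinhole-camera approximation: distance = real-world width *
    # focal length (in pixels) / apparent width (in pixels).
    return object_width_m * focal_length_px / width_in_image_px


def within_threshold_range(distance_m, lo_m, hi_m):
    # The claim's "threshold range" modeled as a simple interval test.
    return lo_m <= distance_m <= hi_m


def verify_object(image_similarity, stored_depth_m, observed_depth_m,
                  sim_threshold=0.8, depth_tol_m=0.5):
    # Toy verification: require a high similarity score against the
    # previously stored image AND consistency of the accompanying
    # object information (depth of field, within a tolerance).
    return (image_similarity >= sim_threshold
            and abs(stored_depth_m - observed_depth_m) <= depth_tol_m)
```

For example, a car roughly 1.8 m wide imaged at 360 px with a 1000 px focal length yields a 5 m proximity estimate, which a hypothetical 0-10 m threshold range would accept.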
It would have been obvious for a person of ordinary skill in the art (PHOSITA) before the effective filing date of the claimed invention to modify the system disclosed in Breed, Henry, Laracey, and Pettyjohn to incorporate object information and verifying the object through image comparison as taught by Elliott because this would provide a manner for helping identify an object within a picture (Elliott [0038]), thus aiding the customer by providing more reliable object recognition.

Referring to claim 4: Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, discloses wherein the physical object selected by the user is a vehicle {Breed [0086]; a user sends a request for information regarding e.g., a black BMW 330i [0086]}.

Referring to claim 5: Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, discloses wherein the cloud-based platform {Henry [0038]; Service provider system 320 and third party system 330 may comprise any type of local and/or remote data processing system (e.g., locally and/or remotely located (e.g., cloud-based systems) computing platforms, such as server 140/150 and/or clients 110/120) [0038]} is a vehicle dealer platform {Henry [0048][0081]; the request comprises sending an SMS text message to the IIC server 108 via the SMS server 106. This embodiment is desirable, for example, when the user is physically located at the dealership or seller's premises [0081]}.

Referring to claim 6: Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, discloses wherein the vehicle dealer platform is a vehicle dealer lead generation system {Breed [0048][0073][0081][0112]; generation of sales lead information [0073]}.
Referring to claim 7: Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, discloses wherein the location of the mobile device is proximate to a specific vehicle located at a vehicle dealer storage lot {Breed [0081][0082][0086]; This embodiment is desirable, for example, when the user is physically located at the dealership or seller's premises [0081] and The client request comprises an SMS text (or other) message identifying the vehicle for which information is sought. For example, the message may include the VIN of the subject vehicle, a shortened VIN (e.g., the last 6-8 digits), or other substantially or totally identifier [0082]}.

Referring to claim 8: Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, discloses wherein the mobile device is any of: a smartphone, tablet, or wearable computer {Breed [0022][0039]; In one variant, the device comprises a mobile device (e.g., smartphone or PDA) [0022]}.

Referring to claim 9: Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, discloses providing, to the mobile device, financing information at least partially based on the physical object selected by the user {Breed [0063][0118]; The related products/services sources 116 use the information to generate highly targeted advertisements for products and/or services which are related to the customer's interest in the subject vehicle. For example, the related products/services sources 116 may comprise vehicle warranty providers, insurance providers, financing providers, accessory or aftermarket parts providers, etc. [0063]}.

Referring to claim 10: Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, discloses based on the financing information, completing a purchase transaction with the mobile device {Breed [0063] [0118][0144][0145]; For example, the related products/services sources 116 may comprise vehicle warranty providers, insurance providers, financing providers, accessory or aftermarket parts providers, etc.
[0063] and The advertiser is only charged if the customer makes an indication that he/she is interested in the presented information. A customer indication may include, inter alia, the customer clicking on an embedded link, requesting additional information regarding the advertised product/service, and/or actual purchase thereof [0144]}.

Referring to claim 12: Claim 12 is rejected on a similar basis to claim 1, with the following additions: Breed discloses a system, comprising: a memory; and one or more processors configured {Breed [0051]; the system 100 generally comprises a plurality of client devices 102 in communication with an item information collection (IIC) server 108 [0051]}.

Referring to claim 15: Claim 15 is rejected on a similar basis to claim 4.
Referring to claim 16: Claim 16 is rejected on a similar basis to claim 5.
Referring to claim 17: Claim 17 is rejected on a similar basis to claim 6.
Referring to claim 18: Claim 18 is rejected on a similar basis to claim 9.
Referring to claim 19: Claim 19 is rejected on a similar basis to claim 10.

Referring to claim 20: Claim 20 is rejected on a similar basis to claim 1, with the following additions: Breed discloses a non-transitory computer-readable device having instructions stored thereon that, when executed by at least one computing device, cause the at least one computing device to perform operations {Breed [0025][0051][0071]; the IIC server 108 further comprises a digital processor 124 configured to run one or more applications stored thereon. The digital processor (e.g., RISC, CISC, DSP, etc.) 124 may run a first computer program 128 configured to facilitate information collection and report generation [0071]}.

Claims 2, 3, 11, 13, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Breed et al. (US 20110225047), in view of Henry (US 20150317713), in view of Laracey et al. (US 20170098210), in view of Pettyjohn et al. (US 20150170256), in view of Elliott et al.
(US 20200293778), and further in view of Spivack (US 20150302517).

Referring to claim 2: Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, discloses a system for generating and utilizing sales leads and related information (abstract). Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, does not disclose receiving user interactions with the physical object selected, based on an augmented reality (AR) interface. However, Spivack discloses a related system for facilitating electronic commerce in an augmented reality environment. Spivack discloses receiving user interactions with the physical object selected, based on an augmented reality (AR) interface {Spivack [0023][0024][0047][0061]; enabling selection of a physical product or a real life service in an augmented reality platform via a mobile device [0023]}. It would have been obvious for a person of ordinary skill in the art (PHOSITA) before the effective filing date of the claimed invention to modify the system disclosed in Breed, Henry, Laracey, Pettyjohn, and Elliott to incorporate receiving user interactions based on an AR interface as taught by Spivack because this would provide a manner for facilitating transactions of a physical product (Spivack [0023]), thus aiding the user by enabling purchases.
Referring to claim 3: Breed, as modified by Henry, Laracey, Pettyjohn, Elliott, and Spivack, discloses wherein the augmented reality (AR) interface identifies one or more features of the physical object selected by the user {Spivack [0061]; the host server 324 can access relevant information including, for example, purchase information of the product, price from the vendor of the exact product that the user is viewing in the augmented reality, price from another vendor of the same or similar product, availability of the product, any metadata or tags of the product, annotations or reviews of the product added by another user of the augmented reality environment, images or video clips that are shared by other users or the merchant, and/or sales contact information [0061]}.

Referring to claim 11: Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, discloses a system for generating and utilizing sales leads and related information (abstract). Breed, as modified by Henry, Laracey, Pettyjohn, and Elliott, does not disclose interacting with the physical object selected by the user using the UI that includes an augmented reality (AR) interface to interact with the physical object selected by the user. However, Spivack discloses a related system for facilitating electronic commerce in an augmented reality environment. Spivack discloses interacting with the physical object selected by the user using the UI that includes an augmented reality (AR) interface to interact with the physical object selected by the user {Spivack [0023][0024][0047][0061][0106]; enabling selection of a physical product or a real life service in an augmented reality platform via a mobile device [0023] and According to some embodiments, the detection of targets in the augmented reality environment can be performed by pointing the device 402 at the target(s). In some embodiments, the detection of targets can be performed by moving a pointer or a select area on the display of the device 402 to point at or frame the object in a reticle or a circular or rectangular frame (e.g., select area 520, 525, described below with respect to FIG. 5A) [0106]}. It would have been obvious for a person of ordinary skill in the art (PHOSITA) before the effective filing date of the claimed invention to modify the system disclosed in Breed, Henry, Laracey, Pettyjohn, and Elliott, to incorporate receiving user interactions based on an AR interface as taught by Spivack because this would provide a manner for facilitating transactions of a physical product (Spivack [0023]), thus aiding the user by enabling purchases.

Referring to claim 13: Claim 13 is rejected on a similar basis to claim 2.

Referring to claim 14: Claim 14 is rejected on a similar basis to claim 3.

Response to Arguments

Summary of the Interview

Examiner has no comment on Applicant’s summary.

Rejection under 35 USC 101

Step 2A, Prong 1: Applicant argues, with respect to Step 2A, Prong 1, that the claims do not recite a method of organizing human activity and “each recites features for improving the functioning of a computer and/or improving a technological process of computer software execution” and therefore are not directed to an abstract idea. Remarks 10. Applicant recites several limitations from claim 1 and identifies the support from the amended limitations in the Specification, which are labelled (a)-(c). Examiner respectfully disagrees with Applicant that the claims do not recite a method of organizing human activity and notes that Applicant has not provided evidence that any recited features improve the functioning of a computer or provide an improvement to any other technology or technical field. Applicant then references Ex Parte Stocker, a PTAB case. However, this case is not precedential.
Further, in Stocker, the Examiner identified every limitation of claim 1 other than “at least one tagging unit” as reciting an abstract idea. This is distinguishable from the instant case, in which more than a single limitation has been identified as an additional element.

Applicant then argues that the 2025 Memo distinguishes between claims that recite abstract ideas and those that merely involve an abstract idea, stating that the instant claims do not set forth any abstract idea, “but instead recite a cloud-based platform that uses image data, location data, and a digital record associated with a geo-fence.” Remarks 11. Examiner respectfully disagrees. Examiner has identified the abstract idea in the rejection (see above). Further, the Specification indicates that the invention is directed to, during a user’s shopping experience, requesting in-person sales assistance from on-premises salespersons. Specification ¶15. This indicates that the invention is directed to sales activities, which falls into the “certain methods of organizing human activity” grouping.

Step 2A, Prong 2: Applicant then argues that the features identified by Applicant as (a)-(c) improve the functioning of a computer at least concerning “[e]xisting mechanisms [] that typically focus on hardware implementations such as buttons or switches.” Remarks 11. Applicant further argues that the claims “improve the efficiency of item scanning and detection because the burden of such scanning and detection is shifted from the unpredictable/volatile environment conditions and deficient hardware implementations to the reliable, digital features, such as features (a)-(c).” Remarks 12. Examiner respectfully disagrees that improving efficiency is a technical improvement. Further, shifting from one technology with drawbacks (such as the vulnerability to environment conditions) to another which has different drawbacks (such as the vulnerability of image recognition to changes in lighting, angle, etc., and the possible requirement for the user to download an app or go to a website to operate the invention) does not necessarily provide a technical improvement. Applicant indicates there is a removal of maintenance requirements inherent to hardware mechanisms, but the maintenance seems merely to be shifted to the customer. Further, it is unclear how the customer will know how to use the invention. It would seem that the customer would need to be provided with signage (hardware) instructing them to download an app or go to a website in order to send the image data.

Applicant then argues that the claimed features are “distinguished from conventional systems via geo-fencing-based location determining, computing relative proximity using image data, and image comparison of physical object information and combining these features to retrieve relevant augmented objects.” Remarks 13. Examiner respectfully disagrees. Each of these features is conventional and known in the computing arts. There is no evidence that these features are non-conventional.

Applicant further argues that the claims, similar to the claims in Desjardins, integrate the abstract idea into a practical application by including improvements identified in the Specification, particularly (1) scanning hardware mechanism deficiencies, (2) unreliable image scanning and detections caused by volatile and unpredictable environment conditions, and (3) verification of physical object selection. Examiner respectfully disagrees that these are technical improvements. The Supreme Court and Federal Circuit have identified a number of considerations as relevant to the evaluation of whether the claimed additional elements demonstrate that a claim is directed to patent-eligible subject matter.
MPEP 2106.04(d). Limitations the courts have found indicative that an additional element (or combination of elements) may have integrated the exception into a practical application include: an improvement in the functioning of a computer, or an improvement to other technology or technical field. Here, Applicant does not identify the additional element present in the claims which is allegedly improved (the additional elements are identified in the rejection, supra, and, as to claim 1, include: a mobile device, a cloud-based platform, a user interface (UI), and a computer-device corresponding to the geo-fence). Nor does Examiner find any evidence that any of the additional elements of the claims are technically improved.

Step 2B: Applicant argues that the combination of the claimed steps is not well-understood, routine, or conventional, and “do not operate in a non-conventional and non-generic way for implementing object selection in a real-time view.” Remarks 14. Examiner believes Applicant is arguing that the combination of the steps is other than what is well-understood, routine, and conventional. Examiner respectfully disagrees. If the additional element (or combination of elements) is a specific limitation other than what is well-understood, routine and conventional in the field, for instance because it is an unconventional step that confines the claim to a particular useful application of the judicial exception, then this consideration favors eligibility. MPEP 2106.05(d). If, however, the additional element (or combination of elements) is no more than well-understood, routine, conventional activities previously known to the industry, which is recited at a high level of generality, then this consideration does not favor eligibility. MPEP 2106.05(d). In this case, as explained above, the additional elements are identified in the rejection, supra, and, as to claim 1, include: a mobile device, a cloud-based platform, a user interface (UI), and a computer-device corresponding to the geo-fence. There is no evidence to show that these elements, either alone or in combination, are other than what is well-understood, routine and conventional in the field.

Rejections under 35 USC 103

Applicant argues that Pettyjohn does not disclose the receiving and identifying limitations, especially in view of the object information comprising depth, position, or contour information, as recited in the amended claim. Remarks 17. However, additional prior art is included which addresses these limitations.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CARRIE S GILKEY whose telephone number is (571) 270-7119. The examiner can normally be reached Monday-Thursday 7:30-4:30 CT and Friday 7:30-12 CT.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jessica Lemieux, can be reached at 571-270-3445. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CARRIE S GILKEY/
Primary Examiner, Art Unit 3626

Prosecution Timeline

Jan 02, 2024
Application Filed
Jun 07, 2025
Non-Final Rejection — §101, §103
Aug 27, 2025
Examiner Interview Summary
Aug 27, 2025
Applicant Interview (Telephonic)
Sep 10, 2025
Response Filed
Dec 05, 2025
Final Rejection — §101, §103
Jan 20, 2026
Applicant Interview (Telephonic)
Jan 20, 2026
Examiner Interview Summary
Mar 09, 2026
Request for Continued Examination
Mar 24, 2026
Response after Non-Final Action
Apr 02, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12532819
METHOD FOR TAPPING YIELD POTENTIAL OF SACCHARUM OFFICINARUM BY CONTROLLING TIME OF PLASTIC MULCHING
2y 5m to grant Granted Jan 27, 2026
Patent 12524744
ARTIFICIAL INTELLIGENCE BASED DETERMINATION OF DAMAGE TO PHYSICAL STRUCTURES VIA VIDEO
2y 5m to grant Granted Jan 13, 2026
Patent 12488320
SYSTEMS AND METHODS FOR WASTE MANAGEMENT
2y 5m to grant Granted Dec 02, 2025
Patent 12333556
ENTERPRISE REPUTATION EVALUATION
2y 5m to grant Granted Jun 17, 2025
Patent 12314993
METHODS AND SYSTEMS FOR IDENTIFYING UNDERUSED PROPERTIES AND UTILIZING UNDERUSED PROPERTIES BY LEVERAGING MOBILE UNITS
2y 5m to grant Granted May 27, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
16%
Grant Probability
50%
With Interview (+33.6%)
5y 8m
Median Time to Grant
High
PTA Risk
Based on 489 resolved cases by this examiner. Grant probability derived from career allow rate.
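
The displayed figures are internally consistent: 79 grants out of 489 resolved cases gives the 16% career allow rate, and adding the +33.6% interview lift yields the 50% with-interview projection. A minimal sketch of that arithmetic, assuming (hypothetically) that the dashboard simply adds the observed interview lift to the career allow rate and caps at 100%:

```python
# Hypothetical reconstruction of the projection arithmetic shown above.
# The inputs come from the report; the additive model is an assumption.

def with_interview_probability(allow_rate: float, interview_lift: float) -> float:
    """Add the observed interview lift to the career allow rate, capped at 1.0."""
    return min(allow_rate + interview_lift, 1.0)

career_allow_rate = 79 / 489   # 79 granted of 489 resolved cases
interview_lift = 0.336         # +33.6% lift among resolved cases with interview

baseline = career_allow_rate
boosted = with_interview_probability(career_allow_rate, interview_lift)

print(f"Baseline grant probability: {baseline:.0%}")  # ≈ 16%
print(f"With interview:             {boosted:.0%}")   # ≈ 50%
```

Under this assumed model the numbers round to exactly the 16% and 50% the dashboard reports, which suggests the projection is a straight sum rather than, say, a multiplicative or Bayesian adjustment.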
