DETAILED ACTION
This action is responsive to the Request for Continued Examination filed on November 25, 2025.
The amendments filed on October 29, 2025 have been acknowledged and considered.
Claims 1, 10 and 15 have been amended. Claim 22 is new. Claim 16 has been canceled.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Applicant's Remarks, filed October 29, 2025, have been fully considered and entered.
Accordingly, Claims 1, 10 and 15 have been amended. Claim 16 has been canceled. Claim 6 was previously canceled. Claim 22 is new. Claims 1, 10 and 15 are independent claims. Claims 1-5, 7-15 and 17-22 are pending in this application.
Response to Arguments
Applicant’s arguments, see pages 9-14, filed October 29, 2025, with respect to the rejection of claims 1, 10 and 15 have been fully considered, but they are not persuasive.
Argument 1: Applicant argues on page 10 of the Applicant Arguments and Remarks: “Applicant submits that the asserted references of record do not disclose, teach, or suggest the subject matter of this amendment. For example, the combination of references fails to teach or suggest features to ‘receive input data that includes a digital image with a request to complete a data transaction between the computing device and a service provider from a non-transacting remote device that does not participate in the data transaction, the digital image captured by the non-transacting remote device at a location associated with the data transaction,’ as recited in amended claim 1.”
Response to Argument 1: Examiner respectfully disagrees. Makhdumi still teaches the amended claim limitation as argued. Makhdumi, at paragraphs [0028] and [0036] and fig. 1b, discloses that at a merchant store, a user using a user device may capture an image of a QR code generated by a POS terminal (or, e.g., presented on paper such as a dining bill), and use it to present a QR code to other users using other devices, where the other users make direct payments to the merchant.
See Makhdumi [0028] "The user may capture an image of the QR code generated by the POS terminal using a user device [e.g. computing device, non-transacting remote device], such as a smartphone... the user device may utilize the product and merchant information extracted from the QR code, and financial payment information from the virtual wallet, to create a purchase transaction request, and submit the request to a payment network (e.g., credit card processing network). [Thus, a request to complete a data transaction between the computing device and a service provider]" See also Makhdumi [0036-0037], fig. 1b “one of the users 131 a may obtain a snapshot [e.g. digital image captured by a non-transacting remote device], e.g., 132, of a QR pay code, e.g., 134, generated at a POS terminal (or, e.g., presented on paper such as a dining bill) [Thus, a digital image captured by the non-transacting remote device at a location associated with the data transaction/service provider], e.g., 133. The user may in turn generate a QR split pay code, embodying information on the amounts that the tender has been split into. The user 131 a may present the split tender QR code 135 to the other users 131 b-c [Thus, receive input data that includes a digital image], who may obtain snapshots of the split tender QR code, e.g., 136… the users 131 b-c may be making direct payments via the split tender QR code to the merchant (e.g., when the user 131 a took a snapshot of the merchant's QR code, no payment processing occurred immediately)… Via the separate communication sessions that POS terminal may transmit the product and/or merchant data required by the users' devices to generate individual purchase transaction processing requests.”
[Image: media_image1.png (greyscale)]
Examiner notes that based on the Specification paragraphs [0012, 0020] “Techniques for managing data transaction location information are described and are implementable to associate accurate and informative location information with data transactions, e.g., payment transactions… a payment transaction represents a data transaction”, the term “data transaction” is interpreted as a payment transaction.
Thus, in the group mobile payment, the user 131a device (e.g. the non-transacting remote device) captures an image of the QR code generated by the POS terminal and presents the QR code (e.g. 136) to users 131 b-c (thus, the computing devices) for them to make direct payments to the merchant. Thus, the user 131a device is a non-transacting remote device because it does not participate in the data/payment transaction (e.g. 136) of users 131 b-c.
See rejection below.
Therefore, the Examiner has determined that this argument is not persuasive.
Argument 2: Applicant argues on pages 13-14 of the Applicant Arguments and Remarks: “The combination of references also fails to teach or suggest features to ‘generate, responsive to completion of the data transaction by the computing device with the service provider, an association between the location of the non-transacting remote device and the data transaction that indicates that a transaction location of the data transaction is the location of the non-transacting remote device’ and ‘store the association between the location and the data transaction in a transaction record at a storage device.’… This is because, as discussed above, Makhdumi does not describe a non-transacting remote device that captured the image and therefore cannot teach or suggest ‘an association between the location of the non-transacting remote device and the data transaction.’”
Response to Argument 2: Examiner respectfully disagrees. See Response to Argument 1 above regarding the non-transacting remote device that captured the image. See rejection below.
Therefore, the Examiner has determined that this argument is not persuasive.
Claim Objections
Claims 1-14 are objected to because of the following informalities:
In Claims 1 and 10, “a content control module implemented at least partially in hardware and configured to:” should read “a content control module implemented in hardware and configured to:”, because the term “at least partially” can be interpreted as meaning that, at some point, the content control module may run as software only.
Dependent claims 2-9 and 11-14 do not overcome the deficiency of the base claim and, therefore, are objected to for the same reasons as the base claim.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 7-15 and 17-22 are rejected under 35 U.S.C. 103 as being unpatentable over Makhdumi (US Patent Application Publication No. US 20150248664 A1), hereinafter Reference1, in view of O’Regan (US Patent Application Publication No. US 20160350742 A1), hereinafter Reference2.
Regarding claim 1, Reference1 teaches a computing device, comprising: a content control module implemented at least partially in hardware and configured to: receive input data that includes a digital image with a request to complete a data transaction between the computing device and a service provider from a non-transacting remote device that does not participate in the data transaction, the digital image captured by the non-transacting remote device at a location associated with the data transaction; (See Reference1 Abstract “The SNAP MOBILE PAYMENT APPARATUSES, METHODS AND SYSTEMS (“SNAP”) transform real-time-generated merchant-product Quick Response codes via SNAP components into virtual wallet card-based transaction purchase notifications. Payment information and VAS data can also be provided based on location. A request for payment information can be received [Thus, detecting a request to initiate a data transaction]. A location can be determined, and a merchant associated with the location can also be determined. Payment information and/or VAS data can be selected based on the merchant and/or location, and can be provided for a payment transaction.” See also Reference1 [0028] “The user may capture an image of the QR code generated by the POS terminal using a user device [e.g. computing device, non-transacting remote device], such as a smartphone... the user device may utilize the product and merchant information extracted from the QR code, and financial payment information from the virtual wallet, to create a purchase transaction request, and submit the request to a payment network (e.g., credit card processing network). [Thus, a request to complete a data transaction between the computing device and a service provider]” See also Reference1 [0036-0037], fig. 1b “one of the users 131 a may obtain a snapshot [e.g. 
digital image captured by a non-transacting remote device], e.g., 132, of a QR pay code, e.g., 134, generated at a POS terminal (or, e.g., presented on paper such as a dining bill) [Thus, a digital image captured by the non-transacting remote device at a location associated with the data transaction/service provider], e.g., 133. The user may in turn generate a QR split pay code, embodying information on the amounts that the tender has been split into. The user 131 a may present the split tender QR code 135 to the other users 131 b-c [Thus, receive input data that includes a digital image], who may obtain snapshots of the split tender QR code, e.g., 136… the users 131 b-c may be making direct payments via the split tender QR code to the merchant (e.g., when the user 131 a took a snapshot of the merchant's QR code, no payment processing occurred immediately)… Via the separate communication sessions that POS terminal may transmit the product and/or merchant data required by the users' devices to generate individual purchase transaction processing requests.”
[Image: media_image1.png (greyscale)]
Thus, in the group mobile payment, the user 131a device (e.g. the non-transacting remote device) captures an image of the QR code generated by the POS terminal and presents the QR code (e.g. 136) to users 131 b-c (thus, the computing devices) for them to make direct payments to the merchant. Thus, the user 131a device is a non-transacting remote device that does not participate in the data/payment transaction.
Examiner notes that based on the Specification paragraphs [0012, 0020] “Techniques for managing data transaction location information are described and are implementable to associate accurate and informative location information with data transactions, e.g., payment transactions… a payment transaction represents a data transaction”, the term “data transaction” is interpreted as a payment transaction.
identify a location of the non-transacting remote device by extracting the location from metadata of the digital image; (See Reference1 Abstract “Payment information and VAS data can also be provided based on location. A request for payment information can be received. A location can be determined, and a merchant associated with the location can also be determined [Thus, identified]. Payment information and/or VAS data can be selected based on the merchant and/or location, and can be provided for a payment transaction.” See also Reference1 [0007] “a snap payment computer-implemented system and method can include determining, by the mobile device, a location of a consumer” See also Reference1 [0119-0120] “FIGS. 11A-F show user interface diagrams illustrating example features of virtual wallet applications in a snap mode… A user may use his or her mobile phone to take a picture [e.g. digital image/input data obtained by the non-transacting remote device] of a QR code… the virtual wallet application may optionally apply [e.g. embed] a Global Positioning System tag [i.e. location within metadata] (see 1118) to the QR code before storing it, or utilizing it in a transaction” See also Reference1 [0188-0191] “mobile devices 1510 [e.g. non-transacting remote device/remote client device, user device] may include any device capable of accessing the Internet, such as a personal computer, portable computers, cellular phones, personal digital assistants (PDAs), tablet PCs… the geolocation module 1514 may determine a set of coordinates or an address associated with the position of the mobile device 1510 [Thus, identify a location of the non-transacting remote device/remote client device, user device]… the geolocation module 1514 may be able to determine if the consumer and mobile device 1510 [e.g. 
the location of the remote client device associated with the service provider and is different from a location of the user device] are within or near a merchant 1520 location [Thus, identify a location of the non-transacting remote device]. For example, the geolocation module 1514 may store information about one or more merchant 1520 locations, such as one or more sets of coordinates or addresses associated with one or more merchant 1520 locations. The geolocation module 1514 may determine the position of the mobile device 1510, and then compare the position of the mobile device 1510 with the information about one or more merchant 1520 locations.” See also Reference1 [0103] “With reference to FIG. 8G, in another embodiment, the local proximity option 819 may include a store map and a real time map features among others [e.g. during execution of the data transaction]. For example, upon selecting the Walgreens store the user may launch an aisle map 819 l which displays a map 819 m showing the organization of the store and the position of the user… the user [e.g. remote device] may easily configure the map to add one or more other users (e.g., user's kids) to share each other's location [e.g. different location of the user device] within the store.”
Thus, by determining a GPS tag/coordinates before storing the captured image of the QR code (e.g. by the non-transacting remote device), it is identifying a location of the non-transacting remote device.)
However, Reference2 teaches extracting the location from metadata of the digital image in more detail. (See Reference2 [0038-0039] “A method and system are described in which transaction information… are transferred in an electronic file [i.e. digital image]… Electronic files may include… image files” See also Reference2 [0044-0046] “the image may capture information relating to the transaction. Examples may include: a code such as a barcode or QR code relating to the product or merchant [Thus, a request to complete a data transaction with a service provider]… a location at which they are being purchased” See also Reference2 [0057] “The electronic file may be… a file received from another entity” See also Reference2 [0010-0013] “the method performed on a computing device [e.g. non-transacting remote device] and including the steps of: accessing transaction information to be transmitted; selecting an electronic file; editing metadata stored in the electronic file to insert the transaction information into one or more fields of the metadata to provide modified metadata [e.g. location of the remote device] of the electronic file; and transmitting the electronic file [e.g. digital image] with the modified metadata to a receiving entity [e.g. computing device] for processing of the transaction information [Thus, receive input data from a non-transacting remote device that includes a digital image with a request to complete a data transaction]... One or more existing fields of the metadata stored in the electronic file may also be kept in the modified metadata and used in the transaction. The one or more existing fields of the metadata include one or more of: time and date information, and location information [e.g. 
identify a location of the device, current location within metadata of the digital image]… selecting an electronic file includes capturing as an image file an image relating to a product or a party to the transaction in respect of which a user wishes to make a financial transaction. The image may be an image of… a quick response (QR) code” See also Reference2 [0103] “It should be noted that the computing device (110) may also be a receiving entity (160) and the receiving entity (160) may also include the functionality for transmitting transaction information as described in the computing device (110).” See also Reference2 [0022, 0054] “receiving transaction information at a receiving entity comprising: a communication component for receiving an electronic file with modified metadata [e.g. a location of the non-transacting remote device]; an extracting component for extracting transaction information from one or more fields of the modified metadata stored in the electronic file [Thus, extracting the location from metadata of the digital image]; and a transaction processing component for using the transaction information to process a transaction… The receiving entity (160) may be a remote server” See also Reference2 [0064, 0093-0095] “In a first step (501), the user uses the camera [e.g. image capture device] of his or her feature phone to capture, as a JPEG image file (465), an image of the QR code (470). Metadata (480) [e.g. a location of the device] associated with the image file (465)… is automatically created by the mobile device at the time of capturing the image… some of the existing metadata of the electronic file may be kept such as GPS coordinates showing the current location, a time and a date of the electronic file…the existing metadata of the image may provide further verification of the current location and that the user was at the location at the time of capture of the image. 
[Thus, identify a location of the non-transacting remote device by extracting the location from metadata of the digital image]”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Reference1 to incorporate the teachings of Reference2 to extract metadata stored in image files for processing of transaction information.
One would be motivated to do so to effectively obtain and provide potentially required transaction information to payment systems (Reference2 [0042]).
Reference1 further in view of Reference2 [hereinafter Reference1-Reference2] additionally disclose generate, responsive to completion of the data transaction by the computing device with the service provider, an association between the location of the non-transacting remote device and the data transaction that indicates that a transaction location of the data transaction is the location of the non-transacting remote device; and store the association between the location and the data transaction in a transaction record at a storage device. (See Reference1 Abstract “Payment information and VAS data can also be provided based on location. A request for payment information can be received [e.g. data transaction]. A location can be determined [Thus, identified], and a merchant associated with the location can also be determined [e.g. location of the computing device (Thus, an association between the location of the non-transacting remote device and the data transaction)]. Payment information and/or VAS data can be selected based on the merchant and/or location, and can be provided for a payment transaction.” See also Reference1 [0190-0191] “the geolocation module 1514 may determine a set of coordinates or an address associated with the position of the mobile device 1510. the geolocation module 1514 may be able to determine if the consumer and mobile device 1510 are within or near [Thus, an association between the location of the non-transacting remote device and the data transaction that indicates a transaction location] a merchant 1520 location [Thus, indicates a location of the non-transacting remote device]. For example, the geolocation module 1514 may store information about one or more merchant 1520 locations, such as one or more sets of coordinates or addresses associated with one or more merchant 1520 locations. 
The geolocation module 1514 may determine the position of the mobile device 1510, and then compare the position of the mobile device 1510 with the information about one or more merchant 1520 locations.” See also Reference1 [0007], fig. 11c “a snap payment computer-implemented system and method can include determining, by the mobile device [e.g. computing device], a location of a consumer [i.e. location associated with the data transaction]” See also Reference1 [0032] “Upon completion of the purchase transaction, the payment network [Thus, responsive to completion of the data transaction by the computing device with the service provider] may provide a purchase receipt directly to the user mobile device, the POS terminal in the store [e.g. transaction location (Thus, the location of the non-transacting remote device)]… as confirmation of completion of transaction processing.”
[Image: media_image2.png (greyscale)]
Thus, Fig. 11C discloses an example of a generated receipt for a completed transaction, which includes an address 1136 [e.g. current/transaction location] and the details of the transaction, and thus indicates that a transaction location of the data transaction is the location of the merchant POS terminal (e.g. non-transacting remote device).
See also Reference1 [0120] “the user may initiate code capture using the mobile device camera (see 1120)… the virtual wallet application may optionally apply a Global Positioning System tag [i.e. metadata] (see 1118) to the QR code before storing it, or utilizing it in a transaction” See also Reference1 [0087-0089] “the pay network server may generate a transaction data record, e.g., 523, from the authorization request and/or authorization response, and store, e.g., 524, the details of the transaction [e.g. location of the remote device] and authorization relating to the transaction in a transactions database [Thus stored to a storage device]… The pay network server may parse the batch payment request, and extract the transaction data for each transaction… The pay network server may store the transaction data, e.g., 543-544, [e.g. transaction record] for each transaction in a database, e.g., pay network database. For each extracted transaction, the pay network server may query, e.g., 545-546, a database, e.g., pay network database, for an address of an issuer server. [Thus, accessing the storage device to obtain a plurality of associations between locations and data transactions]”)
Regarding claim 2, Reference1-Reference2 teaches all limitations and motivations of claim 1, wherein the input data includes a text message and the digital image depicts a transaction link to initiate the data transaction. (See Reference1 [0029] “the user device may utilize methods alternative to capture of a QR code [e.g. digital image depicts a transaction link to initiate the data transaction] to obtain information from the POS terminal. For example, the POS terminal may communicate the information required for submitting a purchase transaction request to a payment network to user device via Bluetooth™, Wi-Fi, SMS, text message, electronic mail, and/or other communication methods.” See also Reference1 [0133] “the SNAP may send electronic mail message, text (SMS) messages”
Examiner notes that based on the Specification paragraph [0044] “the input data 210 includes a digital image of a transaction link such as a QR code”, the transaction link can be a QR code.)
Reference2 also teaches wherein the input data includes a text message and the digital image depicts a transaction link to initiate the data transaction. (See Reference2 [0066] “the image file [i.e. digital image depicts a transaction link] with the modified metadata (490) is transmitted to the payment authorization server (450) [Thus, to initiate the data transaction] over a normal mobile communication network by means of a multimedia messaging service (MMS) message [i.e. text message]”)
Regarding claim 3, Reference1-Reference2 teaches all limitations and motivations of claim 1, wherein the content control module is further configured to display a map in a user interface of the computing device that depicts the transaction location as the location of the non-transacting remote device. (See Reference1 [0096-0103] “FIGS. 8A-G show user interface diagrams illustrating example features of virtual wallet applications in a shopping mode, in some embodiments of the SNAP… the mobile application may further identify when the user in a store based on the user's location… With reference to FIG. 8G, in another embodiment, the local proximity option 819 may include a store map and a real time map features among others. For example, upon selecting the Walgreens store [e.g. transaction location as the location of the non-transacting remote device], the user may launch an aisle map 819 l which displays a map 819 m showing the organization of the store and the position of the user (indicated by a yellow circle)… the user may manipulate the orientation of the map using the navigation tool 819 [Thus, display a map in a user interface of the computing device that depicts the location associated with the data transaction]”)
Regarding claim 4, Reference1-Reference2 teaches all limitations and motivations of claim 1, wherein the content control module is further configured to generate one or more data transaction insights for display in a user interface of the computing device based in part on the association. (See Reference1 [0115-0117] “FIG. 10 shows a user interface diagram illustrating example features of virtual wallet applications, in a history mode, in some embodiments of the SNAP. In one embodiment, a user may select the history mode 1010 to view a history of prior purchases and perform various actions on those prior purchases [Thus, based in part on the association]… The wallet application may query the storage areas in the mobile device or elsewhere (e.g., one or more databases and/or tables remote from the mobile device) for transactions matching the search keywords. The user interface may then display the results of the query such as transaction 1015. The user interface may also identify the date 1012 of the transaction, the merchants and items 1013 relating to the transaction… The history mode, in another embodiment, may offer facilities for obtaining and displaying [Thus, in a user interface of the computing device based in part on the association] ratings 1019 of the items in the transaction [e.g. data transaction insights]. The source of the ratings may be the user, the user's friends (e.g., from social channels, contacts, etc.), reviews aggregated from the web, and/or the like.”)
Regarding claim 7, Reference1-Reference2 teaches all limitations and motivations of claim 1, wherein identification of the location includes extraction of the location from exchangeable image file format ("EXIF") metadata of the digital image. (See Reference2 [0041] “An example of such a metadata image file format is exchangeable image file (Exif) format, which forms part of a Joint Photographic Experts Group (JPEG) image file. Metadata files are typically automatically created by cameras and may include, but are not limited to, information such as: the date, time and global positioning system (GPS) coordinates at which the picture was taken [Thus location from EXIF metadata of the digital image]” See also Reference2 [0077] “The GPS coordinates are extracted directly from the metadata in a standard field.”)
Regarding claim 8, Reference1-Reference2 teaches all limitations and motivations of claim 1, wherein identification of the location includes validation of the location against an address of a party to the data transaction. (See Reference1 [0191] “the geolocation module 1514 may be able to determine if the consumer and mobile device 1510 are within or near a merchant 1520 location. For example, the geolocation module 1514 may store information about one or more merchant 1520 locations, such as one or more sets of coordinates or addresses associated with one or more merchant 1520 locations. The geolocation module 1514 may determine the position of the mobile device 1510 [e.g. location], and then compare [Thus, validate] the position of the mobile device 1510 with the information about one or more merchant 1520 locations [e.g. against an address associated with a party to the data transaction].”)
Regarding claim 9, Reference1-Reference2 teaches all limitations and motivations of claim 1, wherein the association differentiates between the transaction location and an additional location of the computing device during completion of the data transaction. (See Reference1 Abstract “Payment information and VAS data can also be provided based on location. A request for payment information can be received. A location can be determined [Thus, identified], and a merchant associated with the location can also be determined [e.g. location of the computing device]. Payment information and/or VAS data can be selected based on the merchant and/or location, and can be provided for a payment transaction.” See also Reference1 [0007] “a snap payment computer-implemented system and method can include determining, by the mobile device, a location of a consumer” See also Reference1 [0191] “the geolocation module 1514 may be able to determine if the consumer and mobile device [e.g. computing device] 1510 are within or near [Thus, the association differentiates between the transaction location and an additional location of the computing device during completion of the data transaction] a merchant 1520 location. For example, the geolocation module 1514 may store information about one or more merchant 1520 locations, such as one or more sets of coordinates or addresses associated with one or more merchant 1520 locations. The geolocation module 1514 may determine the position [e.g. additional location of the computing device] of the mobile device 1510, and then compare the position of the mobile device 1510 with the information about one or more merchant 1520 locations.”)
Regarding claim 10, Reference1-Reference2 teaches all of the elements of claim 1. The supporting rationale of the rejection to claim 1 applies equally as well to those elements of claim 10.
Reference1-Reference2 additionally disclose detect that the computing device is unable to participate in the data transaction with the service provider; (See Reference1 [0036], fig. 1b “one of the users 131 a may obtain a snapshot [e.g. digital image captured by a computing device unable to participate in the data transaction], e.g., 132, of a QR pay code, e.g., 134, generated at a POS terminal (or, e.g., presented on paper such as a dining bill) [Thus, a digital image captured by the non-transacting remote device at a location associated with the service provider], e.g., 133. The user may in turn generate a QR split pay code, embodying information on the amounts that the tender has been split into. The user 131 a may present the split tender QR code 135 to the other users 131 b-c [Thus, communicate the digital image associated with the current location to an additional computing device that is identified to participate in the data transaction], who may obtain snapshots of the split tender QR code, e.g., 136… the users 131 b-c may be making direct payments via the split tender QR code to the merchant (e.g., when the user 131 a took a snapshot of the merchant's QR code, no payment processing occurred immediately).”
[Image: media_image1.png, greyscale]
Examiner notes that, based on the Specification paragraphs [0013-0014] “For instance, Meera is an adolescent and does not have a personal payment account associated with the P2P payment system and thus is personally unable to complete the transaction. Accordingly, Meera utilizes the first client device to capture a digital image that includes a request to complete the payment transaction, e.g., a digital image of a QR code, to send to a second client device associated with her father, Ajay. The second client device receives the digital image of the QR code and completes the transaction via engagement with the QR code.”, the broadest reasonable interpretation of “detect that the computing device/remote client device is unable to participate in the data transaction with the service provider” is simply a determination made by a user that he or she will not complete the transaction himself or herself.
Thus, in Reference1, by obtaining a snapshot of a QR code generated at a POS terminal in order to split the bill with other users by generating a QR split code, the user has determined/detected that he or she is unable to participate in the data transaction with the service provider corresponding to the QR code.)
associate, responsive to detection that the digital image includes the request to complete the data transaction and that the computing device is unable to participate in the data transaction, a current location of the computing device with the digital image; (See Reference1 [0119-0120] “FIGS. 11A-F show user interface diagrams illustrating example features of virtual wallet applications in a snap mode, in some embodiments of the SNAP… A user [e.g. unable to participate in the data transaction] may use his or her mobile phone [e.g. computing device] to take a picture [e.g. digital image] of a QR code [e.g. digital image]… the virtual wallet application may optionally apply a Global Positioning System tag [e.g. current location] (see 1118) to the QR code [Thus, associating a current location of the computing device with the digital image] before storing it, or utilizing it in a transaction” Thus, Reference1 teaches to associate, responsive to detection that the digital image includes the request to complete the data transaction and that the computing device is unable to participate in the data transaction, a current location of the computing device with the digital image.)
Regarding claim 11, Reference1-Reference2 teaches all of the elements of claim 1. The supporting rationale of the rejection to claim 1 applies equally as well to those elements of claim 11.
Regarding claim 12, Reference1-Reference2 teaches all of the elements of claim 1. The supporting rationale of the rejection to claim 1 applies equally as well to those elements of claim 12.
Regarding claim 13, Reference1-Reference2 teaches all limitations and motivations of claim 10, wherein the content control module is further configured to receive an indication that the data transaction has been completed by the additional computing device; and generate a transaction receipt that includes an association between the current location and the data transaction. (See Reference1 [0033, 0088] “Upon completion of the transaction, the payment network may provide transaction notification receipts to the users who are parties to the transaction… the merchant server may also generate a purchase receipt [Thus, generate a transaction receipt], e.g., 532, and provide the purchase receipt to the client. [Thus, receive an indication that the data transaction has been completed by the additional computing device]” See also Reference1 Fig. 11C, disclosing an example receipt for a completed transaction that includes an address 1136 [e.g. current location] and the details of the transaction.
[Image: media_image2.png, greyscale]
Thus, Reference1 teaches a transaction receipt that includes an association between the current location and the data transaction.)
Regarding claim 14, Reference1-Reference2 teaches all limitations and motivations of claim 10, wherein the content control module is configured to validate the current location of the computing device against an address associated with a party to the data transaction. (See Reference1 [0191] “the geolocation module 1514 may be able to determine if the consumer and mobile device 1510 are within or near a merchant 1520 location. For example, the geolocation module 1514 may store information about one or more merchant 1520 locations, such as one or more sets of coordinates or addresses associated with one or more merchant 1520 locations. The geolocation module 1514 may determine the position of the mobile device 1510 [e.g. the current location of the computing device], and then compare [Thus, validate] the position of the mobile device 1510 with the information about one or more merchant 1520 locations [e.g. against an address associated with a party to the data transaction].”)
Regarding claim 15, Reference1-Reference2 teaches all of the elements of claim 10. The supporting rationale of the rejection to claim 10 applies equally as well to those elements of claim 15.
Regarding claim 17, Reference1-Reference2 teaches all limitations and motivations of claim 15, wherein the input data includes a text message and detecting the request includes detecting one or more key words or phrases from the text message that indicate the request to initiate the data transaction. (See Reference1 [0039, 0066] “the SNAP may facilitate P2P transactions via pre-filled, modifiable QR payment codes, e.g., 150… the QR code and messages [e.g. from the text message] sent to/from the QR-code capturing device may include the source ID (e.g., identifier of the device generating the QR code), session ID, merchant ID, item ID (e.g., model number), the charge amount, and/or transacting device ID (e.g., the user's smartphone device). [e.g. one or more key words that indicate the request to initiate the data transaction]” See also Reference1 [0029] “the user device may utilize methods alternative to capture of a QR code to obtain information from the POS terminal. For example, the POS terminal may communicate the information required for submitting a purchase transaction request to a payment network to user device… text message, electronic mail, and/or other communication methods.”)
Regarding claim 18, Reference1-Reference2 teaches all of the elements of claims 1 and 3. The supporting rationale of the rejection to claims 1 and 3 applies equally as well to those elements of claim 18.
Regarding claim 19, Reference1-Reference2 teaches all of the elements of claim 4. The supporting rationale of the rejection to claim 4 applies equally as well to those elements of claim 19.
Regarding claim 20, Reference1-Reference2 teaches all limitations and motivations of claim 19, wherein the one or more data transaction insights are generated based in part on the association, the location of the remote client device, and a location of the user device during execution of the data transaction. (See Reference1 Abstract “Payment information and VAS data can also be provided based on location. A request for payment information can be received. A location can be determined, and a merchant associated with the location can also be determined [e.g. location of the computing device]. Payment information and/or VAS data can be selected based on the merchant and/or location, and can be provided for a payment transaction [Thus, during execution of the data transaction].” See also Reference1 [0115-0117] “FIG. 10 shows a user interface diagram illustrating example features of virtual wallet applications, in a history mode, in some embodiments of the SNAP. In one embodiment, a user may select the history mode 1010 to view a history of prior purchases [Thus, based in part on the association ] and perform various actions on those prior purchases… The wallet application may query the storage areas in the mobile device or elsewhere (e.g., one or more databases and/or tables remote from the mobile device) for transactions matching the search keywords. The user interface may then display the results of the query such as transaction 1015. The user interface may also identify the date 1012 of the transaction, the merchants [e.g. a merchant associated with the location] and items 1013 relating to the transaction [Thus, based in part on the association, the location of the remote device, and a location of the computing device during execution of the data transaction]… The history mode, in another embodiment, may offer facilities for obtaining and displaying ratings 1019 of the items in the transaction [e.g. data transaction insights]. 
The source of the ratings may be the user, the user's friends (e.g., from social channels, contacts, etc.), reviews aggregated from the web, and/or the like.”)
Regarding claim 21, Reference1-Reference2 teaches all limitations and motivations of claim 1, wherein the content control module is further configured to generate a data transaction insight for presentation by the computing device based on the association that visually represents a relationship between the location of the non-transacting remote device and a location of the computing device during execution of the data transaction. (See Reference1 [0102-0103] “the mobile application may further identify when the user in a store based on the user's location. For example, position icon 819 d may be displayed next to a store (e.g., Walgreens) when the user is in close proximity to the store [Thus, visually represents a relationship between the location of the non-transacting remote device and a location of the computing device]. In one implementation, the mobile application may refresh its location periodically in case the user moved away from the store (e.g., Walgreens). In a further implementation, the user may navigate the offerings of the selected Walgreens store through the mobile application… With reference to FIG. 8G, in another embodiment, the local proximity option 819 may include a store map and a real time map features among others [e.g. during execution of the data transaction]. For example, upon selecting the Walgreens store the user may launch an aisle map 819 l which displays a map 819 m showing the organization of the store and the position of the user (indicated by a yellow circle) [Thus, generate a data transaction insight].”)
Regarding claim 22, Reference1-Reference2 teaches all limitations and motivations of claim 1, wherein the digital image is generated based on text instructions to complete the data transaction. (See Reference1 [0056-0058] “the merchant server may generate a QR code embodying the product information, as well as merchant information required by a payment network to process the purchase transaction. In some implementations, the QR code may include at least information required [e.g. text instructions] by the user device capturing the QR code to generate a purchase transaction processing request, such as a merchant identifier (e.g., a merchant ID number, merchant name, store ID, etc.) [Thus, the digital image is generated based on text instructions to complete the data transaction]”)
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Reference1-Reference2, in view of McGuire (US Patent Application Publication No. US 20210383440 A1).
Regarding claim 5, Reference1-Reference2 teaches all limitations and motivations of claim 4.
Reference1-Reference2 does not explicitly disclose wherein the one or more data transaction insights include one or more of a data transaction resource usage summary, or a data transaction location heatmap.
However, McGuire teaches generate data transaction insights that include a data transaction location heatmap. (See McGuire [0094] “venue management system 102 may communicate the message to user device 106 of the user based on the user conducting the payment transaction during the event, where the message include transaction data associated with the payment transaction.” See also McGuire claim 8 “receive a user request from the mobile device of the user, wherein the user request includes data associated with goods or services provided by a merchant attending the event… generate at least one message based on the current user location and the biometric data associated with the biometric of the user, the at least one message comprising… communicate heat map data associated with a heat map of a queue of a location of the merchant in the venue of the event to the mobile device of the user to cause the mobile device to display the heat map [Thus, generate data transaction insights that include a data transaction location heatmap]”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Reference1-Reference2 to incorporate the teachings of McGuire of providing a heat map of a queue of a location of a merchant to a requesting user.
One would be motivated to do so to provide an instant overview of key information for the user, enabling the user to make data-driven decisions.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to OSCAR WEHOVZ whose telephone number is (571)272-3362. The examiner can normally be reached 8:00am - 5:00pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, APU M MOFIZ can be reached at (571) 272-4080. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/OSCAR WEHOVZ/Examiner, Art Unit 2161