Prosecution Insights
Last updated: April 19, 2026
Application No. 17/431,920

Method for processing a payment transaction, and corresponding device, system and programs

Final Rejection (§101, §103)

Filed: Aug 18, 2021
Examiner: SHARON, AYAL I
Art Unit: 3695
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: BANKS AND ACQUIRERS INTERNATIONAL HOLDING
OA Round: 6 (Final)

Grant Probability: 43% (Moderate)
OA Rounds: 7-8
To Grant: 3y 8m
With Interview: 72%

Examiner Intelligence

Grants 43% of resolved cases.

Career Allow Rate: 43% (88 granted / 203 resolved; -8.7% vs TC avg)
Interview Lift: +28.4% on resolved cases with interview (strong lift)
Avg Prosecution: 3y 8m (typical timeline)
Total Applications: 246 across all art units (career history; 43 currently pending)

Statute-Specific Performance

§101: 35.2% (-4.8% vs TC avg)
§103: 30.7% (-9.3% vs TC avg)
§102: 10.6% (-29.4% vs TC avg)
§112: 14.7% (-25.3% vs TC avg)

Deltas are relative to the Tech Center average estimate. Based on career data from 203 resolved cases.
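As a quick arithmetic check on the career figures above, the allow rate and the implied Tech Center average can be recomputed from the reported counts. This is an editorial sketch, not part of the analytics source; the TC average is back-solved from the reported -8.7-point delta rather than taken from independent data:

```python
# Recompute the examiner's career allow rate from the reported counts.
granted, resolved = 88, 203
allow_rate = 100 * granted / resolved  # percent

# The Tech Center average is not reported directly; it is implied by the
# reported delta (allow rate is -8.7 points vs the TC average estimate).
delta_vs_tc = -8.7
implied_tc_avg = allow_rate - delta_vs_tc

print(f"Career allow rate: {allow_rate:.1f}%")       # ≈ 43.3%
print(f"Implied TC average: {implied_tc_avg:.1f}%")  # ≈ 52.0%
```

The rounded 43.3% matches the "43%" shown in the dashboard cards above.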

Office Action

Statutes addressed: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, 17/431,920, filed 08/18/2021, is a National Stage entry of PCT/EP2020/054184, International Filing Date: 02/18/2020, and claims foreign priority to FR 1901669, filed 02/19/2019. The effective filing date is after the AIA date of March 16, 2013, and so the application is being examined under the “first inventor to file” provisions of the AIA.

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Priority

Acknowledgment is made of applicant's claim for foreign priority, based on French Application FR 1901669, filed 02/19/2019. On 08/18/2021, the USPTO electronically retrieved the certified priority document from WIPO. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Status of the Application

This Final Office Action is in response to Applicant’s communication of Dec. 19, 2025. Claims 1 and 4-11 are pending, of which claims 1, 10, and 11 are independent. Claims 1, 4, 5, 10, and 11 are currently amended. Previously, claims 2 and 3 were cancelled. All pending claims have been examined on the merits.

Claim Rejections - 35 USC § 103

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 5-8, 10, and 11 are rejected under 35 U.S.C. 103 as being unpatentable over US 2014/0201080 A1 to Just (“Just”; eff. filed Jan. 14, 2013; published July 17, 2014) in view of US 2020/0244788 A1 to Adams (“Adams”; filed Jan. 28, 2019; published Jul. 30, 2020), and further in view of the Google Patents English-language translation of CN 105493112 A to Hewlett Packard (“Hewlett Packard”; filed Aug. 20, 2013; published Apr. 13, 2016).
In regards to claim 1, the “Just” reference teaches:

1. (Currently Amended) A method for processing a purchase order of goods or services, said method being implemented within an electronic voice control device comprising at least one capturing component for capturing voice orders and a sound emission component, wherein the method comprises:

obtaining, using the capturing component, at least one data item representative of a voice-based purchase order, said purchase order emanating from a voice of a user and relating to the purchase of at least one good or service;

(See Just, para. [0029]: “FIG. 3 is a flow diagram 300 of an exemplary method for processing a customer purchase transaction using biometric data, consistent with disclosed embodiments. In step 310, processing entity 150 may receive biometric data associated with a purchase transaction. The biometric data may be received, for example, from merchant 130 with a request to process a purchase transaction. In other embodiments, processing entity 150 may received the biometric data from another source and/or entity (for example, directly from customer 140 through client device 160). The biometric data may reflect measurable characteristics unique to each person that remain constant over time. A more detailed discussion is provided below regarding the receiving biometric data (with respect to FIG. 4).”)

(See Just, para. [0034]: “Server 250 may also determine which biometric data to request from customer 140 when making a purchase transaction (step 420). The biometric data may include data regarding customer characteristics identified from voice recognition, iris eye scan, fingerprint, palm print, walking gait, facial recognition, DNA swab, or the like. Server 250 may request one or more biometric data characteristics for use in processing purchase transactions. For example, server 250 may select three characteristics to request from each customer 140 when making a purchase transaction. The characteristics may be randomly selected before or during the purchase transaction to prevent fraudulent transactions. In other embodiments, financial service provider 120, customer 140, or any other component of system 100 may select which characteristics to provide server 250.”)

(See Just, para. [0036]: “In step 430, server 250 may receive the requested biometric data associated with a purchase transaction. For example, server 250 may include devices capable of receiving and analyzing a customer's voice, iris eye scan, fingerprint, palm print, walking gait, facial recognition, DNA swab, or any other biometric data capable of being associated with customer 140. In exemplary embodiments, a payment terminal associated with processing entity 150 may be capable of receiving and/or analyzing the biometric data. For example, server 250 may be communicatively associated with a payment terminal having a video device capable of scanning an iris and/or capturing a voice recording of customer 140. Server 250 may further process this biometric data to determine recognizable features unique to that customer (e.g., iris pattern, syllable pronunciation, etc.).”)

(See Just, para. [0037]: “Furthermore, server 250 may receive transaction data associated with the purchase transaction by customer 140 (step 440). The transaction data may include, for example, the purchase price, time and data of the transaction, product/service identification (e.g., SKU number), and merchant identification (e.g., merchant identification number). Server 250 may receive the transaction data substantially simultaneously as server 250 receives the biometric data. In other embodiments, server 250 may receive the transaction data and biometric data separately, by different means, and/or at different times.”)

authenticating at least one voiceprint representative of said user based on said at least one data item representative of the purchase order;

(See Just, paras. [0036]-[0037], quoted above.)

The Examiner interprets that Just’s “captured voice recording of customer 140” reads upon the claimed “at least one data item representative of a voice-based purchase order”, or in the alternative, Just’s “syllable pronunciation” feature of the captured voice recording reads upon the claimed “at least one data item representative of a voice-based purchase order”.

determining whether said at least one voiceprint corresponds to a user authorized to make purchases using said electronic voice control device; and

(See Just, para. [0031]: “In some aspects, server 250 may authorize the purchase transaction based on the correlation of biometric data to a financial service account of customer 140 (step 330). For example, server 250 may compare transaction data with the customer account associated with financial service provider 120 and verify the customer account contains adequate funds to complete the transaction. Additionally, server 250 may verify the purchase transaction is not fraudulent based on the received biometric data. A more detailed discussion is provided below regarding authorizing the purchase transaction (see FIG. 6).”)

(See Just, para. [0039]: “Server 250 may compare the received biometric data from customer 140 with stored biometric data, as shown in step 520. The stored biometric data may represent previously received biometric data for customers of financial service provider 120. Such stored biometric data may include one or more of the biometric characteristics (e.g., voice recognition, iris eye scan, fingerprint, palm print, walking gait, facial recognition, DNA swab, and the like). Server 250 may compile the stored biometric data into searchable databases. The stored biometric data may be linked to one or more customer accounts associated with financial service provider 120.”)

when said at least one voiceprint representative of said user corresponds to a user authorized to make purchases using said electronic voice control device, transmitting, to an electronic processing device to which said electronic voice control device is connected, a request to obtain a purchase authorization, said request comprising at least one data item representative of the payment transaction,

(See Just, para. [0032]: “Server 250, in step 340, may send transaction information associated with the purchase transaction to financial service provider 120. Specifically, server 250 may send customer account information, transaction data, and authorization information to financial service provider 120. Financial service provider 120 may use the received transaction information in order to, for example, update customer account balances or provide additional fraud detection. A more detailed discussion is provided below regarding sending transaction information to financial service provider 120 (see FIG. 7).”)

(See Just, para. [0033]: “FIG. 4 depicts a flowchart of an exemplary method for receiving biometric data associated with a purchase transaction consistent with disclosed embodiments. As shown in FIG. 4, processing entity 150 may create a partnership with merchant 150. For example, the two entities may agree to allow customer 140 to purchase service/products from merchant 130 through processing entity 150 (step 410). In some embodiments, server 250 may receive an authorization from financial service provider to process purchase transactions associated with customer 140.”)

The Examiner interprets that “In some embodiments, server 250 may receive an authorization from financial service provider to process purchase transactions associated with customer 140” in para. [0033] means that the “Server 250, in step 340, may send transaction information associated with the purchase transaction to financial service provider 120” in para. [0032] is “transmitting, to an electronic processing device to which said electronic voice control device is connected, a request to obtain a purchase authorization”. In regards to “to an electronic processing device to which said electronic voice control device is connected”, the Examiner interprets that all of the items connected to a network are connected to one another.
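As an editorial aid for readers of this report (not part of the Office Action), the claim-1 flow recited above can be sketched in code: capture a voice order, authenticate a voiceprint, check that the speaker is authorized, then build and transmit a purchase-authorization request to the connected device. Every name and the matching logic below are illustrative placeholders, not the applicant's or any cited reference's implementation:

```python
# Hypothetical sketch of the claim-1 flow; all names are placeholders.

def extract_voiceprint(audio: str) -> str:
    """Stand-in for voiceprint extraction; a real system would compute
    acoustic features from the captured audio."""
    return "vp:" + audio.split()[0]

class PairedTerminal:
    """Stand-in for the user's previously paired communication terminal."""
    def __init__(self):
        self.inbox = []

    def send(self, request: dict) -> None:
        self.inbox.append(request)

def process_voice_purchase(audio: str, authorized_voiceprints: set,
                           terminal: PairedTerminal):
    order = audio                           # data item representative of the voice-based order
    voiceprint = extract_voiceprint(audio)  # authenticate a voiceprint from that data item
    if voiceprint not in authorized_voiceprints:
        return None                         # speaker not authorized to purchase on this device
    # build the purchase-authorization request and transmit it to the terminal
    request = {"order": order, "voiceprint": voiceprint}
    terminal.send(request)
    return request
```

For example, `process_voice_purchase("alice orders coffee", {"vp:alice"}, PairedTerminal())` returns the built request, while an unrecognized voiceprint returns `None` and nothing is transmitted.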
However, under a conservative interpretation of the “Just” reference, it could be argued that the “Just” reference does not explicitly teach the italicized features below, which are taught by the “Adams” reference:

wherein: the electronic processing device is a communication terminal of the user with which said electronic voice control device has been previously paired; and

(See Adams, para. [0014]: “The user interface devices 10 in the network can include smart speakers, smartphones and other wireless communication devices, home automation control systems, smart televisions and other entertainment devices, and the like. User interface devices 10 may be provided in any suitable environment; for example, while not shown in FIG. 1, a user interface device may be provided in a motor vehicle. The user interface devices 10 may operate in a standalone manner, not part of a local area network or mesh network; or they may operate in a local network. For example, FIG. 1 illustrates a smart speaker wirelessly paired (e.g., using the Bluetooth® protocol) with a personal computer 20. Alternatively, the personal computer 20 may be used to control the smart speaker. A smart speaker may also be paired with or controlled by a mobile wireless communication device, as discussed below. A user interface device 10 that operates as a home automation control system may be joined in a mesh network with one or more smart appliances, such as light fixtures 30, or a heating or cooling system 40. Each of these user interface devices 10 may provide a voice interface (e.g., a microphone array, speaker or audio line out, and associated signal processing components) for a local user to interact with the intelligent automated assistant service provided by the system 150 as described above.”)

(See Adams, para. [0029]: “It should be noted that a call may be placed from a smart speaker device 200 a to another communication device, such as a mobile wireless communications device operating on a cellular network, or vice versa. In the case where a call is placed to a mobile device 195 on a cellular network 160, call data may be routed from the call management infrastructure 190 to the cellular network 160, and thence to the mobile device 195, and vice versa. Optionally, the mobile device 195 may be paired with a smart speaker 200 b. The smart speaker 200 b can operate as a microphone and speaker for the paired mobile device 195, thus providing hands-free operation to the user of the mobile device 195. The smart speaker 200 b can still be in communication with the intelligent automated assistant service 150 via the network 100. Further, call sessions may include more than two parties; in this description, only two parties are used in these examples for the sake of simplicity. Thus, one or all users on a call may be using a smart speaker, but some number of users on the call may be using a mobile device or other communication device.”)

(See Adams, para. [0037]: “A current state of the user's respective smart speaker device may include the state of any power or privacy settings (e.g., whether a do-not-disturb or mute mode is enabled); whether the smart speaker device is currently paired with an identified mobile device associated with the user (e.g., whether the user's smartphone is paired with the smart speaker device); whether the smart speaker device detects the presence of a mobile device, even if unpaired, over short-range wireless communication, such as Bluetooth; or whether an identified computing or mobile device associated with the user is connected to the same Wi-Fi network. These conditions may tend to indicate that the user is present with the smart speaker device. Other conditions may tend to indicate that the user may be present, but if so, is not alone. For example, detection of an unknown mobile device or multiple mobile devices available for pairing via Bluetooth may suggest that another person is nearby.”)

The Examiner interprets that if a “smart speaker device is currently paired with an identified mobile device associated with the user”, then an obvious variation is that it was also “previously paired” in the immediate past.

It would have been obvious to a person having ordinary skill in the art (PHOSITA), before the effective filing date of the claimed invention, to include in the “Systems and methods for processing customer purchase transactions using biometric data”, as taught by the “Just” reference, with “Securing of Internet of Things Devices Based on Monitoring of Information Concerning Device Purchases”, as further taught by the “Adams” reference above, because both references are in the same art of processing customer purchase transactions using voice recognition, and because “These conditions may tend to indicate that the user is present with the smart speaker device” (see Adams, para. [0037], quoted above).

However, under a conservative interpretation of Just in view of Adams, it could be argued that Just in view of Adams does not explicitly teach the following features, which are taught by Hewlett Packard:

wherein: … transmitting the request comprises: building the request to obtain the purchase authorization; and emitting a sound according to the request to obtain the purchase authorization using the sound emission component.

(Hewlett Packard: “Buyer device 115 can make for the various Trading Authorization initiated via buyer interface 152 by the buyer 114. Such as, buyer device 115 (such as, via the network as internet) can receive the transaction authorization request of combining gateway 118 from associating with the payment services 104 of the buyer 114. Buyer device 115 can be pointed out customer transaction to ratify a motion and just be waited for the approval of the buyer. This prompting can [be] via the proprietary application that audio call, text/SMS message, buyer device run or via the network interface on equipment. Buyer device 115 can allow the buyer 114 mutual with buyer device, to check transaction authorization request and authorize or Cancel Transaction. Buyer device 115 can send authorization response to associating gateway 118. Authorization response can generate to ratify to conclude the business alternately in response to the buyer and buyer device 115.”)

It would have been obvious to a person having ordinary skill in the art (PHOSITA), at the effective filing date of the Application, to further include in the method for “systems and methods for processing customer purchase transactions using biometric data”, as taught by the “Just” reference, with purchase authentication via audio call prompt, as further taught by Hewlett Packard, because both references are in the similar art of voice-based purchase orders, and Hewlett Packard further describes the process of the user’s purchase authentication in response to an audio call prompt.

In regards to claim 5,

5. (Currently Amended) The method for processing a payment transaction, according to claim 1, wherein the method further comprises, after transmitting the request, receiving a payment transaction acceptance response.

(See Just, paras. [0032]-[0033], quoted above.)

The Examiner interprets that “In some embodiments, server 250 may receive an authorization from financial service provider to process purchase transactions associated with customer 140” in para. [0033] means that the “Server 250, in step 340, may send transaction information associated with the purchase transaction to financial service provider 120” in para. [0032] is “transmitting, to an electronic processing device to which said electronic voice control device is connected, a request to obtain a purchase authorization”.

In regards to claim 6,

6. (Currently Amended) The method for processing a payment transaction, according to claim 5, wherein the method further comprises, after receiving a payment transaction acceptance response, transmitting a data structure representative of the payment transaction to a transaction server.

(See Just, para. [0033]: “FIG. 4 depicts a flowchart of an exemplary method for receiving biometric data associated with a purchase transaction consistent with disclosed embodiments. As shown in FIG. 4, processing entity 150 may create a partnership with merchant 150. For example, the two entities may agree to allow customer 140 to purchase service/products from merchant 130 through processing entity 150 (step 410). In some embodiments, server 250 may receive an authorization from financial service provider to process purchase transactions associated with customer 140. For instance, processing entity 150 may provide payment options or terminals to merchant 150 that allows customer 140 to request purchase transactions. The payment terminals may be accessible to customer 140 at a physical location or through internet 110. For example, customer 140 may access the payment terminals through the internet using client device 160.”)

(See Just, para. [0037], quoted above.)

(See Just, para. [0048]: “FIG. 8 is a flow diagram 800 of an exemplary method for receiving a customer purchase transaction using biometric data, consistent with disclosed embodiments. In step 810, financial service provider 120 may receive purchase transaction information from one or more components of system 100. Such information may include, for example, customer account information, transaction data, and authorization information. In step 820, financial service provider 120 may process the purchase transaction. For example, financial service provider 120 may locate the customer account associated with the customer account information, deduct the purchase amount from the customer account, and notify the customer of this deduction. A more detailed discussion is provided below (with respect to FIG. 10) regarding sending purchase transaction information to financial service provider 120.”)

(See Just, para. [0051]: “FIG. 10 depicts a flowchart of an exemplary method processing the purchase transaction consistent with disclosed embodiments. As shown in FIG. 10, financial service provider 120 may process the purchase transaction. Financial service provider 120 may locate the customer account associated with the received customer account information (step 1010). The customer account may include a financial service account including, for example, credit card accounts, checking accounts, savings accounts, loans, investment accounts. Financial service provider 120 may additionally deduct the purchase price from the customer account, as shown in step 1020. In some embodiments, as shown in step 1030, financial service provider may further notify customer 140 of the deduction. For example, financial service provider 120 may provide a notification in the form of an electronic message or document (e.g., email, link to a website, SMS message, business software mechanisms (ERP, CRM, etc.)). In some embodiments, financial service provider may provide the notification in the form of a bank statement.”)

The Examiner interprets that “Financial service provider 120 may locate the customer account associated with the received customer account information (step 1010)” reads upon the claimed feature.

In regards to claim 7,

7. (Currently Amended) The method for processing a payment transaction, according to claim 6, wherein the data structure representative of the payment transaction comprises at least one data item representative of a current voiceprint.

(See Just, paras. [0036]-[0037], quoted above.)

In regards to claim 8,

8. (Currently Amended) The method for processing a payment transaction, according to claim 7, wherein said at least one data item representative of a current voiceprint is used to replace at least one payment data item of a payment card of said user.

(See Just, para. [0051], quoted above.)

In regards to independent claim 10, it is rejected on the same grounds as independent claim 1. In regards to independent claim 11, it is rejected on the same grounds as independent claim 1.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Just in view of Adams and Hewlett Packard as applied to claim 7 above, and further in view of US 2018/0349900 A1 to Anderson et al. (“Anderson”; eff. filed Mar. 13, 2003; published Dec. 6, 2018).

In regards to claim 9, under a conservative interpretation of Just in view of Adams and Hewlett Packard, it could be argued that Just in view of Adams and Hewlett Packard does not explicitly teach the following features, which are taught by Anderson:

9. (Currently Amended) The method for processing a payment transaction, according to claim 7, wherein said at least one data item representative of a current voiceprint is used to build a payment token using at least one payment data item of a payment card of said user.

(See Anderson, para. [0053]: “At block 212, the authorization server 106 may provide the payment credentials to the user device 102. The payment credentials may include a bank account number, a credit card number, or a financial services account name or number. In one embodiment, the authorization server 106 may also include an indication of the authorization provided by the authorization service. The indication may include an encrypted key or token that is shared between the authorization service and the merchants.
For example, the merchants may require that a purchase includes the key or token in order to finalize the transaction. The key or token may represent that the authorization service has authenticated the user and that the purchase is less likely to be fraudulent.”)

It would have been obvious to a person having ordinary skill in the art (PHOSITA), at the effective filing date of the Application, to include in the method for “systems and methods for processing customer purchase transactions using biometric data”, as taught by Just above, in the combination of Just in view of Adams and Hewlett Packard, with “systems and methods for tokenizing financial information”, as further taught by Anderson, because “the merchants may require that a purchase includes the key or token in order to finalize the transaction. The key or token may represent that the authorization service has authenticated the user and that the purchase is less likely to be fraudulent” (see Anderson para. [0053]).

Response to Arguments

Re: Claim Rejections - 35 USC § 101

The 35 USC § 101 “abstract idea” rejection of claims 1 and 4-11 was previously withdrawn. The Examiner holds that the amendments to independent claim 1 to recite “said method being implemented within an electronic voice control device comprising at least one capturing component for capturing voice orders, and a sound emission component” and “emitting a sound according to the request to obtain the purchase authorization using the sound emission component” overcome the 35 USC § 101 “abstract idea” rejection.

The Examiner holds that the amendments to independent claim 10 to recite “said electronic voice control device”, “at least one capturing component for capturing voice orders”, “a sound emission component”, and “emitting a sound according to the request to obtain the purchase authorization[,] using the sound emission component” overcome the 35 USC § 101 “abstract idea” rejection.
The Examiner holds that the amendments to independent claim 10 to recite “said electronic voice control device”, “at least one capturing component for capturing voice orders”, “a sound emission component”, and “wherein: … the transmission of the request comprises: … generating a sound according to the request to obtain the purchase[,] using the sound emission component” overcome the 35 USC § 101 “abstract idea” rejection.

The Examiner holds that the amendments to independent claim 11 to recite “when the instructions are executed on a processor of an electronic voice control device comprising at least one capturing component for capturing voice orders, and a sound emission component” and “wherein the method comprises: … emitting a sound according to the request to obtain the purchase authorization[,] using the sound emission component” overcome the 35 USC § 101 “abstract idea” rejection.

Re: Claim Rejections - 35 USC § 103

The 35 USC § 103 rejection has been amended, as necessitated by Applicant’s amendments to the claims.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US-2019/0149541-A1 to Valenti et al. (“Valenti”. Eff. Filed on Nov. 13, 2017. Published on May 16, 2019)

(See Valenti, para. [0015]: “Once a biometric registration (or “lock”) is established between a consumer (i.e., the consumer's device and biometrics) and a merchant, the consumer can perform a transaction facilitated by the secure network using biometric authentication. For example, a consumer may establish biometric registration, based on a non-biometric enhanced authentication, with a particular mobile app, e.g., Uber. The user can then perform an authenticated payment transaction with the Uber app using biometrics, e.g., a fingerprint or selfie, depending on what the consumer's device supports.
In disclosed embodiments, the software used by the merchant may be adaptable so that a consumer can choose how to authenticate themselves based on the capabilities of the consumer's mobile device. In disclosed embodiments, the technology used to confirm the identity of the consumer on the consumer's mobile device may be provided by the payment network operator who supplies the consumer with a digital wallet. The technology may be the same technology that a payment network operator provides for issuers to use in their mobile banking apps.”)

(See Valenti, para. [0018]: “FIG. 1 is a diagram depicting a system 100 for providing biometric authentication with a secure network 110 (e.g., a payment network). In disclosed embodiments, a client/user, e.g., a consumer, may use a mobile device 120 to connect to a provider 130 (e.g., a merchant) via a communication network 140 (e.g., the internet) to make an online purchase of goods and/or services. To pay for the purchase, the consumer may use a payment card provided by an issuer 150 (e.g., a bank or other type financial institution).”)

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a).

Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications should be directed to Examiner Ayal Sharon, whose telephone number is (571) 272-5614 and whose fax number is (571) 273-1794. The Examiner can normally be reached Monday to Friday between 9 AM and 6 PM. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Christine M. Behncke, can be reached at (571) 272-8103. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

Sincerely,
/Ayal I. Sharon/
Examiner, Art Unit 3695
March 20, 2026

Prosecution Timeline

Aug 18, 2021
Application Filed
Sep 28, 2023
Non-Final Rejection — §101, §103
Jan 04, 2024
Response Filed
Mar 01, 2024
Final Rejection — §101, §103
May 07, 2024
Response after Non-Final Action
May 13, 2024
Response after Non-Final Action
Aug 01, 2024
Request for Continued Examination
Aug 02, 2024
Response after Non-Final Action
Dec 23, 2024
Non-Final Rejection — §101, §103
Mar 27, 2025
Interview Requested
Mar 27, 2025
Response Filed
Apr 03, 2025
Non-Final Rejection — §101, §103
Apr 08, 2025
Applicant Interview (Telephonic)
Apr 08, 2025
Examiner Interview Summary
Jun 27, 2025
Response Filed
Sep 04, 2025
Non-Final Rejection — §101, §103
Dec 09, 2025
Interview Requested
Dec 18, 2025
Applicant Interview (Telephonic)
Dec 18, 2025
Examiner Interview Summary
Dec 19, 2025
Response Filed
Mar 20, 2026
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597002
Method, System & Computer Program Product for Collateralizing Non-Fungible Tokens
2y 5m to grant Granted Apr 07, 2026
Patent 12586078
MANAGING COST DATA BASED ON COMMUNITY SUPPLIER AND COMMODITY INFORMATION
2y 5m to grant Granted Mar 24, 2026
Patent 12586046
SYSTEMS AND METHODS FOR EXECUTING REAL-TIME ELECTRONIC TRANSACTIONS BY A DYNAMICALLY DETERMINED TRANSFER EXECUTION DATE
2y 5m to grant Granted Mar 24, 2026
Patent 12561740
Method, System & Computer Program Product for Requesting Finance from Multiple Exchange and Digital Finance Systems
2y 5m to grant Granted Feb 24, 2026
Patent 12547795
METHOD AND DEVICE FOR DETERMINING THE FRACTURE SAFETY OF A TREE AND ASSOCIATED COMPUTER PROGRAM PRODUCT
2y 5m to grant Granted Feb 10, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

7-8
Expected OA Rounds
43%
Grant Probability
72%
With Interview (+28.4%)
3y 8m
Median Time to Grant
High
PTA Risk
Based on 203 resolved cases by this examiner. Grant probability derived from career allow rate.
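The projections above are simple arithmetic on the examiner's career data: 88 grants out of 203 resolved cases gives the 43% base rate, and the interview lift is added in percentage points to reach 72%. A minimal sketch of that calculation (assuming the dashboard computes it this way; the variable names are illustrative, not the product's actual API):

```python
# Career allow rate: granted cases / resolved cases
granted, resolved = 88, 203
allow_rate = granted / resolved                 # ~0.433, displayed as 43%

# Interview lift is reported in percentage points (+28.4%), so the
# "with interview" figure is assumed to be base rate + lift.
interview_lift = 0.284
with_interview = allow_rate + interview_lift    # ~0.717, displayed as 72%

print(f"Grant probability: {allow_rate:.0%}")
print(f"With interview:    {with_interview:.0%}")
```

Note that adding the lift directly to the base rate is the simplest reading of the dashboard's numbers; the underlying model may instead compare interview and non-interview cohorts separately.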
