DETAILED ACTION
Status of Claims
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
This action is in reply to the remarks/arguments for Application 18/756,855 filed on 12 December 2025.
Claims 1-12 of Group I have been elected in the response.
Claims 13-18 of Group II have been withdrawn/canceled.
Claims 19-20 of Group III have been withdrawn/canceled.
Claims 1-20 are currently pending and have been examined.
Response to Arguments
A. Restriction Requirement:
Claims 1-20 stand subject to a restriction and/or election requirement.
1. Applicant argues that the three groups to which the claims belong are not distinct inventions but are obvious variants of each other.
Applicant’s admission that the claims are obvious variants of each other has been considered and is persuasive. Accordingly, the restriction requirement is withdrawn in view of Applicant’s admission.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
In the instant case, representative method claim 1 is directed towards facilitating biometric multi-factor authentication associated with a commercial and/or financial activity. Claim 1 is directed to the abstract idea of using rules and/or instructions to facilitate a transaction in an automatic manner, comprising the steps of merely collecting (“receiving a registration request”), computing (“generating a facial signature” … and … “hash value”), comparing (“verifying … facial signature matches”), and relaying (“sending a message”) data/information associated with a commercial and/or financial transaction. This concept is grouped under certain methods of organizing human activity – fundamental economic principles, practices, or concepts; sales activities; commercial interactions (business relations); and managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions) – as well as mathematical concepts – mathematical relationships – inasmuch as the claimed method as a whole is directed towards utilizing a mathematical model to perform calculations (e.g., a hash value) utilizing an algorithm, but for the recitation of computer-related components.
Other than the mere nominal recitation of computer-related devices, nothing in the claim elements precludes the steps from falling within the organizing human interactions and mathematical concepts groupings under Prong One of Step 2A. Accordingly, for these reasons, the claim recites an abstract idea.
Claim 1 recites:
“receiving, from a user-operated device of a user, a registration request, wherein the registration request including images depicting a face of the user and at least two actions being performed by the user;
generating a facial signature for the face depicted in the images and at least one hash value based on the at least two actions depicted in the images;
receiving, from a terminal, second images of the user for a transaction;
generating a candidate facial signature and at least one candidate hash value from the second images which depict the face and at least two candidate actions performed by the user;
verifying the candidate facial signature matches the facial signature and the at least one candidate hash value matches the at least one hash value; and
sending a message to the terminal based on the verifying, wherein the message including an authentication successful message or an authentication failed message.”
Based on the claim elements emphasized above, abstract ideas and/or concepts are identified.
Accordingly, the claim recites an abstract idea.
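Purely as an illustration of the collecting, computing, comparing, and relaying characterization above (and not as a representation of Applicant's actual implementation or the cited art), the recited steps can be sketched as follows. All function names, the use of SHA-256, and the data shapes are hypothetical assumptions; in practice, biometric matching is a similarity test rather than an exact hash comparison.

```python
import hashlib

def facial_signature(images):
    # Hypothetical stand-in for a facial-signature extractor; a real system
    # would derive this from image data with a biometric model, and matching
    # would be similarity-based rather than exact.
    return hashlib.sha256("|".join(images).encode()).hexdigest()

def action_hash(actions):
    # Hash over the sequence of at least two actions (e.g., "blink", "smile")
    # depicted in the images.
    return hashlib.sha256("|".join(actions).encode()).hexdigest()

def register(images, actions):
    # "Collecting" and "computing": store the reference signature and hash.
    return {"sig": facial_signature(images), "hash": action_hash(actions)}

def verify(record, second_images, candidate_actions):
    # "Comparing": the candidate signature and candidate hash must both match.
    ok = (facial_signature(second_images) == record["sig"]
          and action_hash(candidate_actions) == record["hash"])
    # "Relaying": send a success or failure message to the terminal.
    return "authentication successful" if ok else "authentication failed"
```

On this simplified sketch, swapping the order of the two actions changes the action hash, so the verification step reports failure.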
This judicial exception is not integrated into a practical application because, when analyzed under Prong Two of Step 2A, the additional elements of the claim, such as a “user-operated device” and a “terminal”, represent the use of a computer as a tool to perform an abstract idea and/or do no more than generally apply the abstract idea to a particular field of use. Therefore, the additional elements do not integrate the abstract idea into a practical application, as they do no more than represent a computer performing functions that correspond to (i.e., automate or implement) the acts of using rules and/or instructions to facilitate a transaction in an automatic manner comprising the steps of merely collecting (“receiving a registration request”), computing (“generating a facial signature” … and … “hash value”), transmitting (“receiving … images”), comparing (“verifying … facial signature matches”), and relaying (“sending a message”) data/information associated with a commercial and/or financial transaction.
When analyzed under Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception itself. Viewed as a whole, the combination of elements recited in the claims merely describes the concept of using rules and/or instructions to facilitate a transaction in an automatic manner comprising the steps of merely collecting (“receiving a registration request”), computing (“generating a facial signature” … and … “hash value”), transmitting (“receiving … images”), comparing (“verifying … facial signature matches”), and relaying (“sending a message”) data/information associated with a commercial and/or financial transaction using computer-related technology and/or devices that merely perform as designed to function. Therefore, the use of these additional elements does no more than employ a computer as a tool to automate and/or implement the abstract idea, which cannot provide significantly more than the abstract idea itself (MPEP 2106.05(f) & (h)). Hence, claim 1 is not patent eligible.
Additionally, based upon consideration of all of the relevant factors with respect to the claim as a whole, the independent claim is held to claim an abstract idea, and is therefore rejected as ineligible subject matter under 35 U.S.C. § 101.
As such, the claim is non-statutory because the body of the claim does not contain any limitations indicating the specific structure of a device and does not recite any machine or transformation (the recitation of a machine or transformation, either express or inherent, is insufficient) – no specific computer or processing device is recited to carry out the claimed steps or execute computer code instructions. See Interim Guidance for Determining Subject Matter Eligibility for Process Claims in View of Bilski v. Kappos (Federal Register / Vol. 75, No. 143 / Tuesday, July 27, 2010 / Notices).
Independent claim 12 recites substantially the same limitations and/or subject matter as claim 1 above and is ineligible for the same reasons. The subject matter of claim 12 corresponds to the subject matter of claim 1 in terms of a method (e.g., a process). Therefore, the reasoning provided for claim 1 applies to claim 12 accordingly.
Independent claim 18 recites substantially the same limitations and/or subject matter as claim 1 above and is ineligible for the same reasons. The subject matter of claim 18 corresponds to the subject matter of claim 1 in terms of a system (e.g., a machine). Therefore, the reasoning provided for claim 1 applies to claim 18 accordingly.
Dependent claims 2-11, 13-17 and 19-20 add further details and contain limitations that narrow the scope of the invention. However, these details do not result in significantly more than the abstract idea itself. As explained in the December 16, 2014 Interim Eligibility Guidance from the USPTO (in reference to the BuySAFE, Inc. v. Google, Inc. decision), further narrowing the details of an abstract idea does not change the § 101 analysis since a more narrow abstract idea does not make it any less abstract.
Viewed individually and in combination, these additional elements do not provide meaningful limitations to transform the abstract idea such that the claims amount to significantly more than the abstraction itself.
Accordingly, the present pending claims are not patent eligible and are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter.
Claim Rejections – 35 USC § 103
The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office Action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Beigi, US 2015/0347734 A1 (“Beigi”), in view of Nagalla, US 10,346,675 B1 (“Nagalla”).
Re Claim 1: (Original) Beigi discloses a method, comprising:
receiving, from a user-operated device of a user, a registration request, wherein the registration request including images depicting a face of the user and at least two actions being performed by the user; (2.1 The Enrollment and/or Registration Stage; ¶[0154]: “When the phone is registered (or at some later time), the owner of the device does a biometric enrollment and the model/models is/are built and stored on the device. These models are generally representations of the features of the specific biometric of interest.”; ¶[0019]: “For the second factor (knowledge of a fact), as an example, a challenge in the form of a traditional passcode may be requested, in which case it is usually typed in, or depending on the available input devices, preselected or predefined facial expressions (for cameras), natural language understanding or a repeated phrase, through a speech recognizer for a microphone input, a handwritten signature such as described by [4] used with a touchpad or a pen may be used along with other methods …”; ¶[0020]: “For the third factor (something one is), biometric techniques are used. Many different biometric methods may be used, such as those listed in Section 1.3. Some such techniques are Speaker Recognition, Image-Based or Audio Based Ear Recognition, Face Recognition, Fingerprint Recognition, Palm Recognition, Hand-Geometry Recognition, Iris Recognition, Retinal Scan, Thermographic Image Recognition, Vein Recognition, Signature Verification …”)
generating a facial signature for the face depicted in the images and at least one hash value based on the at least two actions depicted in the images; (¶[0025]: “FIG.1 describes the process of registering the essential authentication data such as the digest (hash) of the subscriber ID, biometric models, and the binary code of the software being run with certificate authorities. This is the process that takes place once, either when the user has first activated the device or once for each new security level and access credential which is added.”; ¶[0183]: “FIG. 11 shows the validation process for the reference data. At the time of each transaction where authentication is necessary, this process of validation takes place. The data is retrieved from the persistent memory of the device and is decrypted to get the signed hash values of the different reference data. Then the original reference data is retrieved by the authentication application from the persistent memory of the device and is hashed in the same manner as it was done in the hashing step of the registration defined in Section 2.4. These two sets of hash values are then compared as prescribed by FIG. 11 to see if they match.”)
With regard to the limitation comprising:
receiving, from a terminal, second images of the user for a transaction;
Nagalla, however, makes these teachings in a related endeavor (Abstract: “The method prompts the user via a facial gesture cue to make a facial gesture, captures a second facial image of the user, and compares the second image with stored facial gesture credentials. The user is authorized to perform a transaction in the event the first facial image matches a facial recognition credential for an authorized account, and the second facial image matches a facial gesture credential associated with the authorized account.”; FIG. 2: “210 Capture by imaging sensor of user terminal, second imaging data including second facial image of user following the display of facial gesture cue”; C2 L44-47: “The method prompts the user via a facial gesture cue to make a facial gesture, captures a second facial image of the user, and compares the second image with stored facial gesture credentials.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Nagalla with the invention of Beigi as described above for the motivation of facilitating authentication of individuals engaging in transactions in a secure manner.
Beigi further discloses:
generating a candidate facial signature and at least one candidate hash value from the second images which depict the face and at least two candidate actions performed by the user; (¶[0025]: “FIG.1 describes the process of registering the essential authentication data such as the digest (hash) of the subscriber ID, biometric models, and the binary code of the software being run with certificate authorities. This is the process that takes place once, either when the user has first activated the device or once for each new security level and access credential which is added.”; ¶[0183]: “FIG. 11 shows the validation process for the reference data. At the time of each transaction where authentication is necessary, this process of validation takes place. The data is retrieved from the persistent memory of the device and is decrypted to get the signed hash values of the different reference data. Then the original reference data is retrieved by the authentication application from the persistent memory of the device and is hashed in the same manner as it was done in the hashing step of the registration defined in Section 2.4. These two sets of hash values are then compared as prescribed by FIG. 11 to see if they match.”)
verifying the candidate facial signature matches the facial signature and the at least one candidate hash value matches the at least one hash value; (¶[0183]: “These two sets of hash values are then compare as prescribed by FIG. 11 to see if they match. If they match …”)
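The validation process Beigi describes in ¶[0183], as quoted, amounts to re-hashing retrieved reference data in the same manner as at registration and comparing the result against the stored hash values. A minimal sketch of that comparison logic follows; the helper names and the use of SHA-256 are assumptions for illustration, not Beigi's disclosure (Beigi's signed hashes and decryption steps are omitted):

```python
import hashlib

def hash_values(reference_data):
    # Hash each reference datum "in the same manner as it was done in the
    # hashing step of the registration" (Beigi ¶[0183]).
    return [hashlib.sha256(d.encode()).hexdigest() for d in reference_data]

def validate(stored_hashes, retrieved_reference_data):
    # Re-hash the retrieved reference data and compare the two sets of
    # hash values to see if they match, as prescribed by Beigi's FIG. 11.
    return hash_values(retrieved_reference_data) == stored_hashes
```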
sending a message to the terminal based on the verifying, wherein the message including an authentication successful message or an authentication failed message. (¶[0071]: “FIG. 33 shows a scenario in which the user has forgotten to enter the required PIN. An error message is displayed to alert the user to make sure a valid PIN is entered.”; ¶[0167]: “The PDA owner receives notification for the transaction plus the challenge information.”; ¶[0238]: “Each time there is a successful authentication, a counter (Figure component 87) is incremented by the authentication software. Once the minimum number of matches (Figure component 88) has been achieved, before getting to the maximum number of tests which are defined by the administration, access is granted to the group of people who have authenticated.”)
Re Claim 2: (Original) Beigi in view of Nagalla discloses the method of claim 1. With regard to the limitation comprising:
utilizing the message as a front-end security layer for a third-party payment service, wherein the third-party payment service performs automatic payment transaction based on a verified facial signature of the user.
Nagalla, however, makes these teachings in a related endeavor (C8 L20-24: “An authorized transaction can include one or more banking transaction including withdrawing cash, depositing money, making a payment, effecting a money transfer, and providing account information for the authorized user's account.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Nagalla with the invention of Beigi as described above for the motivation of facilitating authentication of individuals engaging in transactions in a secure manner.
Re Claim 3: (Original) Beigi in view of Nagalla discloses the method of claim 1. Beigi further discloses:
identifying a sequence with which the user performs the at least two actions from the images and enforcing the sequence during the verifying. (¶[0019]: “For the second factor (knowledge of a fact), as an example, a challenge in the form of a traditional passcode may be requested, in which case it is usually typed in, or depending on the available input devices, preselected or predefined facial expressions (for cameras), natural language understanding or a repeated phrase, through a speech recognizer for a microphone input, a handwritten signature such as described by [4] used with a touchpad or a pen may be used along with other methods …”; ¶[0020]: “For the third factor (something one is), biometric techniques are used. Many different biometric methods may be used, such as those listed in Section 1.3. Some such techniques are Speaker Recognition, Image-Based or Audio Based Ear Recognition, Face Recognition, Fingerprint Recognition, Palm Recognition, Hand-Geometry Recognition, Iris Recognition, Retinal Scan, Thermographic Image Recognition, Vein Recognition, Signature Verification …”)
Re Claim 4: (Original) Beigi in view of Nagalla discloses the method of claim 1. With regard to the limitation comprising:
receiving a modification request from the user-operated device comprising additional images depicting the user performing the at least two actions in a different sequence or performing different actions; and
processing the generating of the facial signature and the at least one hash value to update one or more of the facial signature and the at least one hash value.
Nagalla, however, makes these teachings in a related endeavor (C11 L3-11: “In another example, API's may be used by authorized users of the financial institution to update previously established facial recognition credential records 142, facial gesture credential records 144, and secondary biometric credential records 146. Authorized users may set up user credential records that serve as user-supplied information for authenticating access to a user's account; and users may update user credential records much as customers of financial institutions may update passwords.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Nagalla with the invention of Beigi as described above for the motivation of facilitating authentication of individuals engaging in transactions in a secure manner.
Re Claim 5: (Original) Beigi in view of Nagalla discloses the method of claim 1. Beigi further discloses:
receiving a deletion request from the user-operated device; and
deleting the facial signature and the at least one hash value to deregister the user from multiple factor biometric authentication services provided by the method.
(¶[0077]: “In addition, every database record has a Deactivate binary input. When it is set to "No", then the record is active, hence it is usable in authentication and other references. If it is set to "Yes", then the record is deactivated and it is treated as a deleted record.”)
Re Claim 6: (Original) Beigi in view of Nagalla discloses the method of claim 1. Beigi further discloses:
wherein receiving the registration request further includes identifying the at least two actions depicted in the images as facial expressions, user hand gestures, or a combination thereof. (¶[0019]: “For the second factor (knowledge of a fact), as an example, a challenge in the form of a traditional passcode may be requested, in which case it is usually typed in, or depending on the available input devices, preselected or predefined facial expressions (for cameras), natural language understanding or a repeated phrase, through a speech recognizer for a microphone input, a handwritten signature such as described by [4] used with a touchpad or a pen may be used along with other methods …”; ¶[0020]: “For the third factor (something one is), biometric techniques are used. Many different biometric methods may be used, such as those listed in Section 1.3. Some such techniques are Speaker Recognition, Image-Based or Audio Based Ear Recognition, Face Recognition, Fingerprint Recognition, Palm Recognition, Hand-Geometry Recognition, Iris Recognition, Retinal Scan, Thermographic Image Recognition, Vein Recognition, Signature Verification …”)
Re Claim 7: (Original) Beigi in view of Nagalla discloses the method of claim 1. Beigi further discloses:
wherein verifying further includes enforcing a time frame within which the at least one candidate hash value has to be matched to the at least one hash value and if not matched providing the authentication failed message to the sending. (¶[0104]: “preselected facial gestures have been chosen by the user which need to be enacted at the test or verification time, in order for the person to be authenticated.”)
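The time-frame limitation of claim 7 can be illustrated with a short sketch. The 30-second window and all names below are hypothetical; neither the claim nor the cited art specifies a particular duration:

```python
import time

WINDOW_SECONDS = 30  # hypothetical time frame; not specified by the claim

def verify_within_window(started_at, hashes_match, now=None):
    # If the candidate hash value is not matched to the hash value within
    # the enforced time frame, the authentication failed message is
    # provided to the sending step.
    now = time.time() if now is None else now
    if now - started_at > WINDOW_SECONDS or not hashes_match:
        return "authentication failed"
    return "authentication successful"
```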
Re Claim 8: (Original) Beigi in view of Nagalla discloses the method of claim 1. Beigi further discloses:
wherein verifying further includes enforcing a sequence associated with the at least two candidate actions depicted in the second images and when the sequence is not detected providing the authentication failed message to the sending. (¶[0078]: “MatchesRequested designates the minimum number of successful authentication matches done in sequence to allow access. That is 88 in the Figures. Once the value of the number of successful matches, 87 reaches this value, access is granted.”)
Re Claim 9: (Original) Beigi in view of Nagalla discloses the method of claim 1. Beigi further discloses:
wherein verifying further includes providing an authentication failed message to the sending when candidate facial signature does not match the facial signature. (¶[0071]: “FIG. 33 shows a scenario in which the user has forgotten to enter the required PIN. An error message is displayed to alert the user to make sure a valid PIN is entered.”; ¶[0167]: “The PDA owner receives notification for the transaction plus the challenge information.”)
Re Claim 10: (Original) Beigi in view of Nagalla discloses the method of claim 1. Beigi further discloses:
wherein verifying further includes providing real-time feedback messages to the terminal as each of the candidate facial signature and the at least one hash value are successfully or unsuccessfully matched. (¶[0067]: “FIG. 29 shows a tablet device with an interface for inputting a PIN, accepting a speaker recognition audio input, a face recognition video capture, and the means for displaying a random string to be used for liveness testing through speech recognition. It also shows feedback on whether access has been granted or denied and the number of people who have matched the authentication procedure for this specific access control session and the total number of such authentications that need to be successfully performed.”)
Re Claim 11: (Original) Beigi in view of Nagalla discloses the method of claim 1. Beigi further discloses:
wherein verifying further includes verifying that a particular candidate hash value for a particular second action of the user depicted in the second images matches a particular hash value and sending a second message to the terminal, wherein the second message including a loyalty authentication successful message or a loyalty authentication failed message. (¶[0071]: “FIG. 33 shows a scenario in which the user has forgotten to enter the required PIN. An error message is displayed to alert the user to make sure a valid PIN is entered.”; ¶[0167]: “The PDA owner receives notification for the transaction plus the challenge information.”)
Re Claim 12: (Original) Beigi in view of Nagalla discloses the method of claim 1. Beigi further discloses:
wherein verifying further includes verifying that at least two additional candidate hash values for remaining second actions of the user depicted in the second images matches at least one remaining hash value and providing the authentication successful message or the authentication failed message to the sending. (¶[0071]: “FIG. 33 shows a scenario in which the user has forgotten to enter the required PIN. An error message is displayed to alert the user to make sure a valid PIN is entered.”; ¶[0167]: “The PDA owner receives notification for the transaction plus the challenge information.”)
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Meikle, SR. (US 2024/0281811 A1) discloses a system and method for biometric payment. The system and methods disclosed provide a platform system, including software, and a backend system to conduct financial transactions, particularly biometric payments. The biometric payment platform system distinctly leverages biometric technology, such as facial recognition, to automatically verify the user's identity, access a digital asset, namely a digital wallet, that is specific to the user, and proceed with the payment (or other financial transaction). The platform also enables users, and vendors/businesses, to perform other financial transactions related to their digital asset, including transferring funds, withdrawing funds, and depositing funds. The platform provides convenient and secure financial transactions, without the user having to remember passwords, carry physical cards (e.g., debit card, credit card, etc.) or wallets, or even carry their own smartphone.
Claims 1-20 are rejected.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Clifford Madamba whose telephone number is 571-270-1239. The examiner can normally be reached Mon-Thu 7:30-5:00 EST and on alternate Fridays.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ryan Donlon, can be reached at 571-272-3602. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CLIFFORD B MADAMBA/Primary Examiner, Art Unit 3692