Detailed Action
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1 and 12 are currently amended.
Claims 4-5 and 15-16 are cancelled.
Claims 1-3, 6-14, and 17-20 are pending and examined.
Response to Remarks
35 U.S.C. § 101
Remark 1: Applicant argues “It is respectfully submitted that the current subject matter, as recited in claim 1, clearly does not fall into the specifically enumerated categories of "Certain Methods of Organizing Human Activity"” . . . “claim 1 is directed to platform that enables verification of an identity of a consumer as part of a verification process and that can handle different types of proofing requests with the same architecture. The platform is flexible, dynamic, and customizable to accept and process many forms of identity proofing. (See, e.g., Specification, para. [0010]). The platform requests information associated with a user in response to a transaction. The information is an encrypted information. The platform then generates a decryption key using a rotatable public key and a private key in a key pair. The rotatable public key is valid during a predetermined period of time and rotated for another public key after expiration of that time. The decryption key is then used to decrypt the encrypted information. The platform generates a transaction identifier associated with the transaction to indicate transaction attribute(s) indicative of verification(s) associated with the user information. The transaction attributes are then used to select a verification policy to verify user information. Verification policies specify how verification(s) of the user information is to be performed. The platform then requests verification systems to perform verification of the user information using said policies. Once verification is completed, the platform generates an identity proofing result based on the results received from the verification systems. Clearly, this is far cry from the alleged fundamental economic principle or practice of mitigating risk. Nowhere does claim 1 call for a series of steps on instructing on how to hedge risk. 
The recited subject matter performs decryption of encrypted user information using specific key- based protocol, thereby enabling a highly secure process for verification of user information, and selects specific verification policies and systems for performing verification of user information to generate an identity proofing result. Similarly, claim 1 does not recite commercial or legal interactions. . . as recited in claim 1 and discussed in the specification, is directed to a highly secure process, through its use of specific key-based protocol for decryption of user information, to generate an identity proofing result using selected verification policies and systems. This is different from commercial or legal interactions.” (Applicant Arguments, 2025-12-11).
Response to Remark 1: Applicant's characterization of the invention as ‘flexible, dynamic, and customizable’ does not change what the claim is directed to. The claim recites functional results (e.g., generating a transaction identifier, selecting a verification policy, requesting one or more verification systems, and generating an identity proofing result) without reciting any particular technological improvement to computer functionality. The focus remains organizing and managing identity verification activities for transactions by applying policies and obtaining verification responses. Applicant's reliance on encryption and key rotation does not remove the claim from the abstract-idea categories. The additional limitations of ‘encrypted information’, ‘generating a decryption key’, and a ‘rotatable public key’ merely describe data security measures applied to the abstract identity-proofing workflow. Such security features are ancillary to the claimed concept of selecting verifications and producing an identity-proofing result and do not change the character of the claim from an abstract information-processing scheme to a technological solution.
Remark 2: Applicant argues “claim 1 is directed to solving numerous technical problems with currently available know-your-customer (KYC)-related processes. In particular, such current processes are inflexible and do not accommodate different use cases or verification policies on the same product. Moreover, identity proofing products are typically tailor-made for a specific type of use case or policy. The current subject matter, as recited in the amended claim 1, is advantageous over the existing solutions in that it dynamically accommodates numerous and various verification use cases all on the same platform or platform architecture. The current subject matter's platform is highly customizable in that numerous different services or applications may be plugged-in to (or unplugged from) the platform based on user needs or requirements. Further, the inventive identity proofing platform, through its requirement of using encrypted user information and its decryption using rotatable key-based protocol, is highly secure and mitigates the risk of sensitive data being compromised. (See, e.g., Specification, para. [0014]) . . . Claim 1 clearly involves execution of programmable logic related to the decryption of encrypted user information through rotatable key protocol, selection of verification policies and systems based on transaction identifiers of a transaction executed by the user, and generation of an identity proofing result for the user information.” (Applicant Arguments, 2025-12-11).
Response to Remark 2: Even with encryption and rotating keys, the claim's focus remains identity proofing as an information-processing workflow: collect user information for a transaction, choose which checks to run, invoke third-party checks, and output a pass/fail (or score) result. Adding ‘encrypted information’ with a ‘rotatable public key/decryption key’ is understood as generic data security applied to the same abstract business practice. Indeed, the added limitations read as token ‘apply encryption’ steps that protect data while performing the abstract idea, but they do not change the character of the claim from ‘collect-analyze-decide’ into a concrete technological solution with a particularized mechanism beyond conventional cryptographic key rotation and decryption. Accordingly, this contention is unpersuasive.
35 U.S.C. § 102 and § 103
Remark 1: Applicant argues “Painter fails to disclose "generating a decryption key using a rotatable public key and a private key in a key pair, wherein the rotatable public key is configured to be valid during a predetermined period of time and rotated for another public key after expiration of the predetermined period of time," and "decrypting the encrypted information associated with the user using the decryption key in the key pair," as recited in the amended claim 1. In contrast, Painter is silent with regard to use of a decryption key that has been generated and is valid using a specific procedure recited in claim 1. Instead, it simply mentions that its encryption services are provided to internally encrypt/decrypt sensitive information, such as personally identifiable information (PII), and other information received via data vendor service and interface proxy service. (Painter, para. [0057]). This is different from the recitation of claim 1. In the Office Action, the Examiner cited Hall to illustrate the "generating a decryption key..." and "decrypting..." elements. (See, Office Action, pgs. 39-41). Hall appears to relate to online authentication of online attributes. (Hall, Abstract). Hall's key management system (KMS) of an authentication party allows individual encryption keys to be generated on-demand from a base key. The ability to derive keys is deployed in a highly distributed environment without the need to continuously replicate keys across servers. For example, if a key is required for disaster recovery or audit purposes, the key may be regenerated. (Hall, para. [0104]). This is different from generation of a decryption key using "a rotatable public key and a private key in a key pair", where the public key is rotatable and is rotated for another public key after a certain period of time.” (Applicant Arguments, 2025-12-11).
Response to Remark 1: Examiner respectfully disagrees, as the cited references (e.g., Painter and Gouget) still teach the amended independent claims, as shown at least in paragraphs 57-58 of Painter and paragraphs 22, 36-37, and 49 of Gouget, and as further outlined in paragraphs 28-31 of this action. Painter teaches receiving a user's transaction/application request, using a transaction/application identifier, handling encrypted PII via encryption/decryption services, applying rules/policies to determine required identity/fraud data, fetching verification data from external systems via API, and outputting an approval/decision result (e.g., an identity proofing result). Further, Gouget teaches time-bounded/rotating keying material (e.g., keys refreshed on a schedule with an inherent time reference for automatic cycling) used to derive cryptographic keys for decryption. Accordingly, this contention is unpersuasive.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1
Step 1 of the eligibility analysis asks whether the claim is to a process, machine, manufacture, or composition of matter (See MPEP § 2106.03, subsections I and II). Claims 1-11 are directed to a computer-implemented method (i.e., a process). Claims 12-18 are directed to a system (i.e., a machine), and claims 19-20 are directed to a computer program product (i.e., a manufacture). Therefore, these claims fall within the four statutory categories of invention.
Step 2A, Prong One
Prong One asks whether the claim recites an abstract idea, law of nature, or natural phenomenon (MPEP § 2106.04(II)(A)(1)). Claims 1, 12, and 19, under a broadest reasonable interpretation, recite an abstract idea because the claims describe controlling authorization for key-based encryption/decryption and performing an approval step upon encryption/decryption. The claim limitations reciting the abstract idea fall within the “certain methods of organizing human activity” grouping of abstract ideas (MPEP § 2106.04(a)(2), subsection II) because the limitations describe commercial or legal interactions, including agreements in the form of contracts relating to digital assets. The following underlined claim limitations recite the abstract idea. The non-underlined claim limitations recite additional elements.
Claim 1:
A computer-implemented method, comprising:
requesting, using at least one processor, information associated with a user in response to a transaction executed by the user, wherein the information associated with the user is an encrypted information associated with the user;
generating a decryption key using a rotatable public key and a private key in a key pair, wherein the rotatable public key is configured to be valid during a predetermined period of time and rotated for another public key after expiration of the predetermined period of time; and
decrypting the encrypted information associated with the user using the decryption key in the key pair;
generating, using the at least one processor, a transaction identifier associated with the transaction, wherein the transaction identifier indicating one or more transaction attributes indicative of one or more verifications associated with the information associated with the user;
selecting, using the at least one processor, using the one or more transaction attributes, a verification policy in a plurality of verification policies for verifying the information associated with the user, the verification policy specifying the one or more verifications of the information associated with the user;
requesting, using the at least one processor, one or more verification systems to perform verification of the information associated with the user using the one or more verifications; and
generating, using the at least one processor, an identity proofing result of the information associated with the user based on a response to the requesting received from the one or more verification systems.
Claim 12:
A system, comprising:
at least one processor; and
at least one non-transitory storage media storing instructions, that when executed by the at least one processor, cause the at least one processor to
generate a transaction identifier associated with a transaction requested by a user, wherein the transaction identifier indicating one or more transaction attributes indicative of one or more verifications associated with information associated with the user for executing the transaction, wherein the information associated with the user is an encrypted information associated with the user;
generate a decryption key using a rotatable public key and a private key in a key pair, wherein the rotatable public key is configured to be valid during a predetermined period of time and rotated for another public key after expiration of the predetermined period of time;
decrypt the encrypted information associated with the user using the decryption key in the key pair;
identify, using the one or more transaction attributes, a verification policy for verifying the information associated with the user, the verification policy specifying the one or more verifications of the information associated with the user;
execute the one or more verifications, wherein execution of the one or more verifications includes requesting one or more verification systems to perform verification of the information associated with the user using the one or more verifications; and
provide an identity proofing result of the information associated with the user based on execution of the one or more verifications.
Claim 19:
A computer program product comprising a non-transitory machine-readable medium storing instructions that, when executed by at least one programmable processor, cause the at least one programmable processor to:
determine a transaction identifier associated with a transaction requested by a user, wherein the transaction identifier indicating one or more transaction attributes indicative of one or more verifications associated with information associated with the user for executing the transaction, wherein the information associated with the user is an encrypted information associated with the user;
generate a decryption key using a rotatable public key and a private key in a key pair, wherein the rotatable public key is configured to be valid during a predetermined period of time and rotated for another public key after expiration of the predetermined period of time;
decrypt the encrypted information associated with the user using the decryption key in the key pair;
identify, using the one or more transaction attributes, a verification policy for verifying the information associated with the user, the verification policy specifying the one or more verifications of the information associated with the user;
execute the one or more verifications, wherein execution of the one or more verifications includes requesting one or more verification systems to perform verification of the information associated with the user using the one or more verifications; and
provide an identity proofing result of the information associated with the user based on execution of the one or more verifications.
Step 2A, Prong Two
Prong Two asks whether the claim recites additional elements that integrate the judicial exception into a practical application (MPEP § 2106.04(II)(A)(2)). Here, the additional elements of a processor, key generation, and key decryption, individually and in combination, are recited at a high level of generality as generic and conventional elements merely serving as a tool to perform the abstract idea (MPEP § 2106.05(f)) and generally linking the use of the abstract idea to a particular technological environment (MPEP § 2106.05(h)). The description of the additional elements evidences that they are generic and conventional elements used as tools to perform the abstract idea. The processor may be a generic and conventional processor (See Spec. 0046). The key generation may be a generic and conventional key generation (See Spec. 0025-0026). The key decryption may be a generic and conventional key decryption (See Spec. 0026).
Step 2B
Step 2B determines whether the claim as a whole amounts to significantly more than the abstract idea itself (MPEP § 2106.05). Evaluating additional elements to determine whether they amount to an inventive concept requires considering them both individually and in combination to ensure that they amount to significantly more than the abstract idea itself. Individually, the additional elements do not amount to significantly more than the abstract idea. As discussed previously, the description of the additional elements evidences that they are generic and conventional elements used as tools to perform the abstract idea (See Spec. 0025-0026 and 0046). As such, the additional elements merely serve as a tool to perform the abstract idea and generally link the use of the abstract idea to a particular technological environment. The ordered combination recites no more than the individual elements do. Thus, the additional elements are not significantly more than the abstract idea. Accordingly, the claims are directed to the abstract idea identified above without significantly more. The claims are not eligible, warranting a rejection for lack of subject matter eligibility and concluding the eligibility analysis.
Dependent Claims:
Claims 2-3, 6-11, 13-14, 17-18, and 20 have also been analyzed. However, the subject matter of these claims also fails to recite patent eligible subject matter for the following reasons:
Claim 2 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes using personal details to decide whether someone passes a check. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the information associated with the user is a personally identifiable information
Claim 3 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes checking a person by comparing basic identity fields. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the personally identifiable information includes at least one of: a first name of the user, a last name of the user, a middle name of the user, an e-mail address of the user, a home phone number of the user, a mobile phone number of the user, a work phone number of the user, a date of birth of the user, a tax identification number of the user, one or more different types of addresses of the user, associated cities of the user, associated state codes of the user, associated postal codes of the user, associated country codes of the user, and any combination thereof.
Claim 6 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes deciding which checks to run and then deciding pass/fail. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the one or more verifications include at least one of: one or more requested verifications, one or more required verifications, and any combinations thereof.
Claim 7 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes asking trusted third parties to confirm facts. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the one or more verifications systems include at least one of: a government ID application system, a credit bureau verification system, a mobile network operator system, and any combinations thereof.
Claim 8 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes looking at an ID, reading it, and comparing it. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the one or more verifications performed using the government ID application system is performed by
redirecting to the government ID application system to perform a government ID verification, wherein the government ID application system instructs the user to submit at least one government ID image and processes the at least one government ID image;
receiving identity or biographical information associated with the submitted at least one government ID image;
comparing the received identity or biographical information with the information associated with the user.
Claim 9 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes identity verification by sending user data to a credit bureau and receiving a verification result. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the one or more verifications performed using the credit bureau verification is performed by
sending the information associated with the user to the credit bureau verification system; and
receiving a credit bureau verification result from the credit bureau verification system.
Claim 10 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes asking the mobile carrier whether the phone/user matches and then deciding. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
the one or more verifications performed using the mobile network operator system is performed by:
sending the information associated with the user to the mobile network operator system; and
receiving a mobile network operator verification result from the mobile network operator system.
Claim 11 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes applying a rulebook (e.g., compliance/risk) to approve/deny. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the one or more verifications of the information associated with the user is performed using at least one of the following verifications: know your customer (KYC), anti-money laundering (AML), consumer credit risk, social networking, anti-spamming, an e-commerce transaction, and any combination thereof.
Claim 13 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes deciding using personal data. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the information associated with the user is a personally identifiable information.
Claim 14 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes comparing basic identifiers. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the personally identifiable information includes at least one of: a first name of the user, a last name of the user, a middle name of the user, an e-mail address of the user, a home phone number of the user, a mobile phone number of the user, a work phone number of the user, a date of birth of the user, a tax identification number of the user, one or more different types of addresses of the user, associated cities of the user, associated state codes of the user, associated postal codes of the user, associated country codes of the user, and any combination thereof.
Claim 17 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes a system that distinguishes required vs. optional checks in deciding identity. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the one or more verifications include at least one of: one or more requested verifications, one or more required verifications, and any combinations thereof.
Claim 18 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes outsourcing confirmation to authorities. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the one or more verifications systems include at least one of: a government ID application system, a credit bureau verification system, a mobile network operator system, and any combinations thereof.
Claim 20 recites the following underlined claim elements as abstract ideas while the non-underlined claim elements recite additional elements according to MPEP 2106.04(a). The claim recites an abstract idea because the claim describes applying a screening rubric to approve/deny. The non-underlined additional elements fail to recite a practical application or significantly more than the abstract idea because they merely serve as a tool to perform the abstract idea (MPEP § 2106.05(f)).
wherein the one or more verifications of the information associated with the user is performed using at least one of the following verifications: know your customer (KYC), anti-money laundering (AML), consumer credit risk, social networking, anti-spamming, an e-commerce transaction, and any combination thereof.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 6-14, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Painter et al. (US 2018/0204280, hereinafter “Painter”) in view of Gouget et al. (US 2013/0301828 A1, hereinafter “Gouget”).
As per Claims 1, 12, and 19, Painter teaches:
A computer-implemented method, comprising: requesting, using at least one processor, information associated with a user in response to a transaction executed by the user; (“The system is configured to receive a request to approve a user application, access rules/models that fit the goal of approving the user application, obtain data from distributed sources, apply rules/models to generate processed data and determines if the obtained or processed data fits the rules.” (Abstract); (“Receive inputs with initial user information.” (Fig. 3, 302); (“Receive confirmed user information.” (Fig. 3, 314))
wherein the information associated with the user is an encrypted information associated with the user; (“Encryption services 208 are provided to internally encrypt/decrypt sensitive information, such as personally identifiable information (PII), and other information received via data vendor service 270 and interface proxy service 204. At least some data communicated between data processing system 100 and a client computing device may be encrypted beyond encryption generally used to encrypt communications (such as HTTPs). For example, PII provided by a client application (e.g., mobile application 114) may be encrypted according to a first encryption protocol. Interface proxy service 204 may forward the encrypted PII for use by other services, such as user application service 210, which cannot decrypt the information.” (Para. 0057-0058))
generating, using the at least one processor, a transaction identifier associated with the transaction, wherein the transaction identifier indicating one or more transaction attributes indicative of one or more verifications associated with the information associated with the user; (“Decision controller may assign a decision request a unique decision identification and return the decision identification to the requesting service. Decision controller 252 may pass a request for a decision along with relevant input data to decision engine 254 and pass the decision result to a requesting service.” (Para. 0067); (“The decision rules may apply conditions to input data from a user application, the output of a sub-decision, a prediction from a prediction model or data from a data source. For example, approval decision 1210 may include initial checks and a rule that requires an application to pass each sub-decision to pass the approval decision. Fraud decision 1220, in the embodiment illustrated includes fraud detection rules and identity verification rules, credit decision 1230 includes credit check rules and affordability decision 1240 includes income verification rules and affordability rules. A decision may also specify the decision outputs, for example, decline codes that are output or scores that are passed.” (Para. 0180); (“User application service 210 creates a user application having a unique application id for the user. User application service 210 returns the application id to client application 114 (via interface proxy service 204) for use in future communication regarding the application.” (Para. 0061); (“client application 114 may include an interface for an identification verification service and be configured to send the images input at step 304 to the identification verification service.” (Para. 0098))
selecting, using the at least one processor, using the one or more transaction attributes, a verification policy in a plurality of verification policies for verifying the information associated with the user, the verification policy specifying the one or more verifications of the information associated with the user; (“In some embodiments, the system is configured to receive a request for approval, access rules/models that fit the goal of approving or denying the user application, obtain data from distributed sources, apply rules/models to generate processed data and determine if the obtained or processed data fits the rules to determine if the application is approved. The system may also apply the rules/models to the obtained or processed data to generate a score for the user that can be used in downstream processes.” (Para. 0008); (“More particularly, embodiments relate to a rules/model-based data processing system that approves applications by users of the data processing system based on data from distributed sources. In some embodiments, the system is configured to receive a request for approval, access rules/models that fit the goal of approving or denying the user application, obtain data from distributed sources, apply rules/models to generate processed data and determine if the obtained or processed data fits the rules to determine if the application is approved. The system may also apply the rules/models to the obtained or processed data to generate a score for the user that can be used in downstream processes. The system can thus leverage a variety of distributed data systems to enhance the consumer information and apply rules specific to the data obtained from the data systems and processed data generated from the obtained data to approve an application with a high degree of certainty, very quickly (e.g., within five minutes, in some cases in less than a minute and, even more preferably, in less than ten seconds from a request to approve an application). 
The process approving a user application can be achieved using a simple operator interface on a mobile device and, in some embodiments, in a single client session in real-time. In some cases, if a user fails a step in the approval process, the system may request additional information from the user. Thus, the complexity of the interface may depend on the user or quality of information provided by the user.” (Para. 0029); (“The system may further comprise a data store storing approval rules referencing a set of information provider data from a remote information provider system. The system may further comprise a server computer server computer coupled to the data store and comprising a server processor and a server data application executable to access the approval rules based on a request to approve the user application from the mobile application, connect to the remote information provider system to retrieve, using personally identifiable information from the user application, the set of information provider data referenced by the approval rules, analyze the set of information provider data to determine a debt obligation of the user, based on the debt obligation of the user, apply the approval rules to determine an affordability score for the user and send a decision response and the affordability score to the mobile application, wherein the mobile application is further executable to present an application page indicating approval of the user application and the affordability score in response to the decision response indicting approval of the user application. The server data application may be further executable to return the decision response to the mobile application in the same session in which the request to approve the user application was received.” (Para. 0009))
requesting, using the at least one processor, one or more verification systems to perform verification of the information associated with the user using the one or more verifications; and (“Fetch Identity Verification via API.” (Fig. 6, 610); (“Fetch Credit Report via API.” (Fig. 8, 1006); (“The system may further comprise a server computer server computer coupled to the data store and comprising a server processor and a server data application executable to access the approval rules based on a request to approve the user application from the mobile application, connect to the remote information provider system to retrieve, using personally identifiable information from the user application, the set of information provider data referenced by the approval rules, analyze the set of information provider data to determine a debt obligation of the user, based on the debt obligation of the user, apply the approval rules to determine an affordability score for the user and send a decision response and the affordability score to the mobile application, wherein the mobile application is further executable to present an application page indicating approval of the user application and the affordability score in response to the decision response indicting approval of the user application.” (Para. 0009))
generating, using the at least one processor, an identity proofing result of the information associated with the user based on a response to the requesting received from the one or more verification systems. (“A decision engine 175 applies approval rules 140 to user application data provided by user application module 166 to approve or deny the application. Examples of approval rules 140 include, but are not limited to, fraud detection rules, identity verification rules, credit check rules, income verification rules and affordability rules. If an application is not approved, decision engine 175 may return the reason that the application was not approved. A failure to pass the approval rules may result in any configured action, such as withholding further information or services from the consumer, requesting the consumer re-enter information or provide additional information, and/or alerting an authority that of the failed check.” (Para. 0038); (“As part of determining a fair affordable monthly payment, the rules or model used to determine affordability may take into account additional costs associated with a purchased asset. For example, if a consumer is purchasing a vehicle, the affordability score may be calculated to leave room in the consumer's monthly budget for items such as gas and regular maintenance and thus the affordable monthly payment determined for the consumer can be selected to allow the consumer to underwrite the loan while paying for other expected costs associated with the asset (e.g., insurance, maintenance, gas).” (Para. 0160); (“The server can connect to the remote information provider system to retrieve, using personally identifiable information from the user application, the set of information provider data referenced and approve the user application based on the application of the approval rules to the information provider data.” (Abstract)).
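For illustration only, the claimed flow (transaction attributes select a verification policy, the policy specifies the verifications to request, and the responses from the verification systems are aggregated into an identity proofing result) can be sketched in a few lines. The policy names, attribute labels, and system identifiers below are hypothetical and are not drawn from Painter or from the application as filed.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VerificationPolicy:
    name: str
    covers: frozenset   # transaction attributes the policy handles
    verifications: tuple  # verification systems to request

# Hypothetical policy catalog, ordered narrowest first so the minimal
# sufficient policy is selected.
POLICIES = (
    VerificationPolicy("identity_only", frozenset({"identity"}), ("gov_id",)),
    VerificationPolicy("kyc_full", frozenset({"identity", "credit"}),
                       ("gov_id", "credit_bureau")),
)

def select_policy(transaction_attributes):
    # Choose the first policy that covers every attribute on the transaction.
    attrs = frozenset(transaction_attributes)
    for policy in POLICIES:
        if attrs <= policy.covers:
            return policy
    raise LookupError("no verification policy covers the requested attributes")

def identity_proofing_result(responses):
    # Aggregate per-system verdicts into a single pass/fail proofing result.
    return all(responses.values())
```

Under this sketch, a transaction tagged only with the hypothetical "identity" attribute selects the narrower policy, while one also tagged "credit" selects the broader policy whose verification list includes a credit bureau check.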
Painter does not disclose:
“further comprising generating a decryption key using a rotatable public key and a private key in a key pair, wherein the rotatable public key is configured to be valid during a predetermined period of time and rotated for another public key after expiration of the predetermined period of time; and decrypting the encrypted information associated with the user using the decryption key in the key pair.” (claim 1).
However, as per Claim 1, Gouget in the analogous art of secured authentication between parties, teaches: “further comprising generating a decryption key using a rotatable public key and a private key in a key pair, wherein the rotatable public key is configured to be valid during a predetermined period of time and rotated for another public key after expiration of the predetermined period of time; and decrypting the encrypted information associated with the user using the decryption key in the key pair.”. (“the server S generates the ephemeral key pair (sks,pks) and sends the public key pks or an element computed from pks to the client C.” (Para. 0036); “The server authenticates the ephemeral public key pks to be used later on for the Chip Authentication CA step” (Para. 0037); “The Diffie-Hellman key shared between the client C and the server S is used to compute or verify the authentication token and also to establish the secure channel. This Diffie-Hellman key is computed from pkc and sks at the server side and from skc and pks at the client side.” (Para. 0039); (“allows to derivate common session keys between the client C and the gateway G. These keys are then used for establishing a secure channel.” (Para. 0049); (“a secure channel is established between the server S and the client C such that the gateway G cannot access to the plaintext data transmitted into the secure channel.” (Para. 0022)).
It would have been obvious to one of ordinary skill in the art before the effective filing date to combine the method of Painter with the technique of Gouget to include a public-private key decryption mechanism in the authentication process. The incentive of providing increased transaction security between the parties would have supplied a reason to make the adaptation, and the claimed invention would have resulted from application of the prior knowledge in a predictable manner.
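For illustration only, the rotatable-key mechanism of the claim can be sketched as an ephemeral Diffie-Hellman exchange of the general kind Gouget describes. Every name below is hypothetical, and the toy group parameters are chosen for readability, not security; the sketch only shows how a time-limited public key, a peer private key, and a derived decryption key relate.

```python
import hashlib
import secrets
import time

# Toy Diffie-Hellman group: a Mersenne prime modulus keeps the sketch
# self-contained. A real system would use a vetted group (e.g., RFC 3526)
# or an elliptic-curve library; these parameters are NOT secure.
P = 2 ** 607 - 1
G = 5

class RotatableKeyPair:
    """Key pair whose public half is valid only for a fixed time window."""

    def __init__(self, lifetime_seconds):
        self.lifetime = lifetime_seconds
        self.rotate()

    def rotate(self):
        # Generate a fresh ephemeral key pair and restart the validity window.
        self.private = secrets.randbelow(P - 2) + 1
        self.public = pow(G, self.private, P)
        self.expires_at = time.time() + self.lifetime

    def current_public(self):
        # Rotate to another public key once the predetermined period expires.
        if time.time() >= self.expires_at:
            self.rotate()
        return self.public

def derive_key(peer_public, own_private):
    # Both sides derive the same decryption key from the shared DH secret.
    shared = pow(peer_public, own_private, P)
    return hashlib.sha256(shared.to_bytes(76, "big")).digest()

def xor_cipher(key, data):
    # Toy keystream cipher, included only to show encrypt/decrypt symmetry.
    out = bytearray()
    for i, b in enumerate(data):
        block = hashlib.sha256(key + (i // 32).to_bytes(4, "big")).digest()
        out.append(b ^ block[i % 32])
    return bytes(out)
```

In this sketch a client would encrypt PII with a key derived from the platform's currently valid public key and the client's own private key; the platform derives the identical decryption key from the client's public key and the platform's private key, and a new platform key pair replaces the old one after the window lapses.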
As per Claim 2, Painter teaches:
The method of claim 1, wherein the information associated with the user is a personally identifiable information. (“Encryption services 208 are provided to internally encrypt/decrypt sensitive information, such as personally identifiable information (PII), and other information received via data vendor service 270 and interface proxy service 204. At least some data communicated between data processing system 100 and a client computing device may be encrypted beyond encryption generally used to encrypt communications (such as HTTPs). For example, PII provided by a client application (e.g., mobile application 114) may be encrypted according to a first encryption protocol. Interface proxy service 204 may forward the encrypted PII for use by other services, such as user application service 210, which cannot decrypt the information.” (Para. 0057-0058)).
As per Claim 3, Painter teaches:
The method of claim 2, wherein the personally identifiable information includes at least one of: a first name of the user, a last name of the user, a middle name of the user, an e-mail address of the user, a home phone number of the user, a mobile phone number of the user, a work phone number of the user, a date of birth of the user, a tax identification number of the user, one or more different types of addresses of the user, associated cities of the user, associated state codes of the user, associated postal codes of the user, associated country codes of the user, and any combination thereof. (“Confirm or Edit your Current Address . . . ‘123 Main Street’. . .” (Fig. 4D)).
As per Claim 6, Painter teaches:
The method of claim 1, wherein the one or more verifications include at least one of: one or more requested verifications, one or more required verifications, and any combinations thereof. (“Load Rules. . . [and] Determine Required Fraud Detection Data and Identity Verification Data and Sources.” (Fig. 6, 600 and 602)).
As per Claim 7, Painter teaches:
The method of claim 1, wherein the one or more verifications systems include at least one of: a government ID application system, a credit bureau verification system, a mobile network operator system, and any combinations thereof. (“Receive images of Government Identification.” and “Authenticity Check” (Fig. 3, 306 and 308); “Fetch Credit Report via API.” (Fig. 8, 1006)).
As per Claim 8, Painter teaches:
The method of claim 7, wherein the one or more verifications performed using the government ID application system is performed by redirecting to the government ID application system to perform a government ID verification, wherein the government ID application system instructs the user to submit at least one government ID image and processes the at least one government ID image; receiving identity or biographical information associated with the submitted at least one government ID image; comparing the received identity or biographical information with the information associated with the user. (“Receive Images of Government Identification.” (Fig. 3, 304); “Extract Identification Information from Government Identification.” (Fig. 3, 306); “Authenticity Check.” (Fig. 3, 308)).
As per Claim 9, Painter teaches:
The method of claim 7, wherein the one or more verifications performed using the credit bureau verification is performed by sending the information associated with the user to the credit bureau verification system; and receiving a credit bureau verification result from the credit bureau verification system. (“Fetch Credit Report via API.” (Fig. 8, 1006); “Calculate Affordability Score.” (Fig. 8, 1012)).
As per Claim 11, Painter teaches:
The method of claim 1, wherein the one or more verifications of the information associated with the user is performed using at least one of the following verifications: know your customer (KYC), anti-money laundering (AML), consumer credit risk, social networking, anti-spamming, an e-commerce transaction, and any combination thereof. (“Prediction: Credit Risk Prediction [,] Fraud Detection Rules [,] Identity Verification Rules. . .” (Fig. 10, 1220); see also “Credit Check Rules [and] Decision Outputs.”, indicating consumer credit risk verification (Fig. 10, 1230)).
As per Claim 13, Painter teaches:
The system of claim 12, wherein the information associated with the user is a personally identifiable information. (“Encryption services 208 are provided to internally encrypt/decrypt sensitive information, such as personally identifiable information (PII), and other information received via data vendor service 270 and interface proxy service 204.” (Para. 0057)).
As per Claim 14, Painter teaches:
The system of claim 13, wherein the personally identifiable information includes at least one of: a first name of the user, a last name of the user, a middle name of the user, an e-mail address of the user, a home phone number of the user, a mobile phone number of the user, a work phone number of the user, a date of birth of the user, a tax identification number of the user, one or more different types of addresses of the user, associated cities of the user, associated state codes of the user, associated postal codes of the user, associated country codes of the user, and any combination thereof. (See Fig. 4D, “Confirm or Edit Your Current Address . . . Address Line 1 [-] 123 Main St. . . .”, and “For example, FIGS. 4D and 4E illustrate example application pages with data extracted from the user's driver's license populated in editable fields. The user may edit the information and interact with a control (e.g., “Looks Good” virtual button in FIG. 4E) to submit the user information as originally populated or updated by the user. In response to an input signal based on user interaction in the GUI (e.g., in response to the user tapping the “Looks Good” virtual button), client application 114 can send the additional user information to data processing system 100 to update the user application record. In the example of FIG. 2, user application service 210 may update the user application record in user application service data store 212 with the received information. In other embodiments, the confirmed user information may be stored in representation of the user application 118 and forwarded to data processing system 100 at a later time.” (Para. 0100)).
As per Claim 17, Painter teaches:
The system of claim 12, wherein the one or more verifications include at least one of: one or more requested verifications, one or more required verifications, and any combinations thereof. (See Fig. 6, “Load Rules [600,] Determine Required Fraud Detection Data and Identity Verification Data and Sources [602].”, and “Data application 150 can provide PII (e.g., user name, user address, user phone number, user email address, date of birth, driver's license number or other PII, financial institution information, such as a Plaid token, or other information) from the application to one or more income projection services and income estimation services, receive responsive income verification data, and apply an income prediction model and income verification rules to income verification data to determine a verified income.” (Para. 0129)).
As per Claim 18, Painter teaches:
The system of claim 12, wherein the one or more verifications systems include at least one of: a government ID application system, a credit bureau verification system, a mobile network operator system, and any combinations thereof. (See Fig. 3, “Receive Images of Government Identification [304] . . . Authenticity Check [308].”, and “Furthermore, an image (a scan or digital photograph) of the user's government identification can be used to enhance PII without requiring explicit field inputs for each piece of information. To this end, client application 114 can receive an image of a government identification (step 304).” (Para. 0096); “At step 308 the authenticity of the government identification may be checked. Client application 114 or data application 150 may include code to verify the authenticity of the identification or may leverage third party identification verification services.” (Para. 0098)).
As per Claim 20, Painter teaches:
The computer program product of claim 19, wherein the one or more verifications of the information associated with the user is performed using at least one of the following verifications: know your customer (KYC), anti-money laundering (AML), consumer credit risk, social networking, anti-spamming, an e-commerce transaction, and any combination thereof. (See Fig. 10, “Prediction: Credit Risk Prediction 1.0 [,] Fraud Detection Rules [,] Identity Verification Rules.”, and “For example, fraud decision 1220 references a data source for Threatmetrix data. In addition, the decisions may reference prediction models. For example, credit decision 1230 references a credit risk prediction and affordability decision references an income prediction. The prediction models may further reference data sources.” (Para. 0181); “Data processing system 100 searches inventory items based on affordability score. Data processing system 100 may also search its program pool for eligible inventory items based on other parameters, such as credit risk. Accordingly, data processing system 100 can determine the affordability score and other parameters associated with the consumer (step 1602). In some implementations, the affordability score may be included in the request from client application 114.” (Para. 0191)).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Painter et al. (US20180204280) (hereinafter “Painter”) in view of Gouget and further in view of Fisher et al. (US20180124047A1) (hereinafter “Fisher”).
As per Claim 10, Painter teaches:
The method of claim 1, . . .
Painter and Gouget do not disclose:
“the one or more verifications performed using the mobile network operator system is performed by: sending the information associated with the user to the mobile network operator system; and receiving a mobile network operator verification result from the mobile network operator system.” (claim 10).
However, as per Claim 10, Fisher in the analogous art of secured payments between parties, teaches: “the one or more verifications performed using the mobile network operator system is performed by: sending the information associated with the user to the mobile network operator system; and receiving a mobile network operator verification result from the mobile network operator system.”. (“Expanding on this example, consider that in addition to confirming the Claimant's phone number, the verification service also returns the matching home address. This newly collected additional identity attribute is used to further build the verified identity by comparing it to the home address listed on the Claimant's driver's license. Layering these interconnected identity data elements and then cross checking to multiple identity verification services yields a higher identity assurance score.” (Para. 0042); (“Method of claim 1 wherein the collected personal identity attributes are provided by the Claimant and include some or all of this data: home address, phone number(s), email address(es), Social Security number, image(s) of Government-issued credential(s), selfie, and other biometrics; Method of claim 1 wherein the collected personal identity data elements are obtained by surreptitious means to include some or all of this information: mobile phone number, session IP address, GPS location, MAC address, gestures, and other device forensics.” (Claims 3 and 4); (“Mobile Phone Verification [308]—Claimant's mobile phone offers an increasing number of options for verification such as out-of-band SMS or voice call verification.” (Para. 0050))
It would have been obvious to one of ordinary skill in the art before the effective filing date to combine the methods of Painter and Gouget with the technique of Fisher to include carrier-based identity checking in the authentication process. The incentive of providing increased transaction security between the parties would have supplied a reason to make the adaptation, and the claimed invention would have resulted from application of the prior knowledge in a predictable manner.
Conclusion
The following prior art made of record and not relied upon is considered pertinent to applicant's disclosure: US20210226769A1 (Snow), discussing “For many payment transactions, the issuer of the payment card account or another entity mandates “two factor” security—that is, the user must not only present a physical credential (e.g., a payment card or payment-enabled mobile device), but also must provide additional information to verify that the user is the person who is authorized to present the credential. The presentation of additional information is sometimes referred to in the payment card industry as a “cardholder verification method”, or “CVM”. A widely used CVM calls for the user to enter a “PIN”, i.e., a “personal identification number”. Often when a payment card (and especially a debit card) is presented to a POS terminal, the user is prompted to enter his/her PIN to satisfy a CVM requirement. There have also been many proposals for CVM requirements involving receipt of biometric information from the user.” (Para. 0003); see also “a biometric unit that may be incorporated in some embodiments of the smartphone 102 in cases where the types of cardholder verification methods to be supported by the shared CVM applet 308 are to include biometric measures such as fingerprint reading, facial recognition, voice recognition, etc. It will be appreciated that in the case of at least some of these measures, the biometric unit 328 may be at least partially constituted by conventional features of the smartphone 102, such as the microphone 220 (FIG. 2) or a digital camera (not shown). It will be appreciated that the microphone 220 may serve as an input device to receive voice signals from the user as an input to a voice recognition CVM procedure; the camera, if present, may serve as an input device to generate image input signals for a “selfie” (photograph of the user's face) as an input to a facial recognition CVM procedure. 
Biometric information received by the smartphone 102 via the biometric unit 328 may be provided to the CVM capture software module 316, as indicated at 330.” (Para. 0037).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Justin A. Jimenez whose telephone number is (571) 270-3080. The examiner can normally be reached from 8:30 AM to 5:00 PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, John W. Hayes can be reached on 571-272-6708. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Justin Jimenez/
Patent Examiner, Art Unit 3697
/ARI SHAHABI/Primary Examiner, Art Unit 3697