DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/17/2025 has been entered.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1-27 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-10, 13, 15, 17 and 18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Statutory category
Claim 1 is directed to a “method” and therefore recites a process. Claim 16 recites an apparatus comprising at least one processor, which falls within the machine category. Claim 22 recites a non-transitory computer readable medium, which falls within the manufacture category. Thus, the claims fall within one of the four statutory categories of invention.
Step 2A Prong I Judicial exception
Under the 2019 Revised Patent Subject Matter Eligibility Guidance, each independent claim is evaluated to determine whether it recites a judicial exception, including abstract ideas such as mental processes or methods of organizing human activity, which have been recognized as abstract ideas.
For this analysis, generic references to “a communications device,” “processor,” “memory,” and “non-transitory computer readable medium” are disregarded, and the focus is on the remaining substantive language.
For claim 1, once the generic computer-implementation language is removed, the method recites the following steps:
obtaining … an identifier of a communications device;
asking for a phone number and receiving the phone number from a person
extracting first biometric parameters from [a] captured image of [a] first content;
visually inspecting facial features of a person by looking at them
extracting second biometric parameters and identifying information from [a] captured image of [a] second content;
visually inspecting facial features of a photo of a user in a passport
performing a first comparison between the first biometric parameters and the second biometric parameters;
determining whether the person’s face matches their photo in the passport
performing a second comparison between the identifying information and account parameters associated with the identifier of the communications device; and
looking up a person’s personal information using their phone number to find their name and address, and comparing that information with the information in the passport.
generating, based in part on the first comparison and the second comparison, an authentication signal (the recitation “for transmission to a client computing resource” is intended-use language)
verbally stating whether the person is or is not the same person depicted in the passport.
Accordingly, under Step 2A Prong I of the 2019 Guidance, independent claims 1, 16 and 22 each recite an abstract idea in the form of mental processes or methods of organizing human activity, even when generic references to electronic or computer implementation are disregarded. The analysis therefore proceeds to Step 2A Prong II.
Step 2A Prong II Integration into a practical application:
Under Step 2A Prong II, the claims are evaluated to determine whether any additional elements, viewed individually and in combination, integrate the identified abstract idea into a practical application.
Claim 1 is further analyzed under Step 2A Prong II to evaluate whether the claim as a whole integrates the recited judicial exception into a practical application of the exception. This evaluation is performed by identifying any additional elements recited in the claim beyond the judicial exception, and evaluating those additional elements individually and in combination. In claim 1, the elements beyond the abstract mental steps or method of organizing human activity are “a communications device,” “processor,” “memory,” and “non-transitory computer readable medium,” together with “transmitting, via the communications network, a message” (the recitation “to request an upload of a captured image of first content and to request an upload of a captured image of second content” being intended use). Each of these additional elements amounts to generic computer functions that do not constitute meaningful limitations and do not amount to significantly more than the abstract idea.
The combination of these additional elements amounts to no more than generic computer functions. Thus, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limitations on practicing the abstract idea.
Independent claims 1, 16 and 22 therefore do not integrate the abstract idea into a practical application under Step 2A Prong II.
Step 2B Inventive concept:
Under Step 2B, the claims are analyzed to determine whether any additional element, or combination of elements, amounts to significantly more than the abstract idea itself, that is, whether there is an inventive concept. As discussed above, the additional elements in the independent claims consist of generic computer components such as processors, non-transitory computer readable media, and a communications device. The specification describes these components at a high level as conventional computing devices suitable for executing instructions. Implementing the recited abstract operations of
obtaining … an identifier of a communications device
asking for a phone number and receiving the phone number from a person
extracting first biometric parameters from [a] captured image of [a] first content;
visually inspecting facial features of a person by looking at them
extracting second biometric parameters and identifying information from [a] captured image of [a] second content;
visually inspecting facial features of a photo of a user in a passport
performing a first comparison between the first biometric parameters and the second biometric parameters;
determining whether the person’s face matches their photo in the passport
performing a second comparison between the identifying information and account parameters associated with the identifier of the communications device; and
looking up a person’s personal information using their phone number to find their name and address, and comparing that information with the information in the passport.
generating, based in part on the first comparison and the second comparison, an authentication signal (the recitation “for transmission to a client computing resource” being intended-use language) is well-understood, routine, and conventional in the field of computer-implemented data processing.
Independent claims 1, 16 and 22 therefore do not recite an inventive concept that amounts to significantly more than the abstract idea under Step 2B.
Accordingly, independent claims 1, 16 and 22, and dependent claims 2-15, 17-20 and 23-27 that stand with them, do not recite an inventive concept sufficient to transform the abstract idea into a patent eligible application. The claims are therefore directed to an abstract idea and fail to amount to significantly more than the judicial exception under 35 U.S.C. 101.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 4‑7 are rejected under 35 U.S.C. 112(a) (pre‑AIA 35 U.S.C. 112, first paragraph) as failing to comply with the written description requirement. The claims contain subject matter that is not described in the specification in such a way as to reasonably convey to one skilled in the art that the inventor had possession of the claimed invention at the time of filing. The Summary states that, in particular embodiments, “the first content includes a user document” and that the user document may be a government‑issued identification document, while “the second content includes an image of a portion of the user's person” (spec. ¶5 in the application as filed). The detailed description, however, discloses implementations in which an identity verifier analyzes “first content” as “an uploaded image of a portion of a subscriber's person (for example, a subscriber’s face or facial features)” and analyzes “second content, such as an uploaded image of a user document (for example, a driver's license, passport, or other type of government‑issued document)” to extract arrays of feature vectors and additional printed parameters such as date of birth, city of residence, and physical address (spec. ¶30). The flowchart description likewise explains that “first content may comprise an image of a portion of the subscriber's person” and “second content may comprise a subscriber document, such as a driver's license, a passport, an identification card, a healthcare ID card, or any other government‑issued document (or even a private company‑issued document)” (spec. ¶57). Thus, the only detailed embodiments consistently treat the user document as “second content,” with the system logic (capture, upload, extraction, and comparison) organized around that assignment (spec. ¶30 and ¶57), whereas dependent claim 4 affirmatively recites that “the first content comprises a user document,” and claims 5‑7 further specify the user document as a government‑issued identification document.
The specification does not describe how the claimed methods operate when a user document is “first content” rather than “second content,” and therefore does not reasonably convey possession of the specific combination now recited in claims 4‑7. Accordingly, claims 4‑7 are rejected under 35 U.S.C. 112(a) for lack of written description support.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-27 are rejected under 35 U.S.C. 103 as being unpatentable over Larson et al. (U.S. Pat. No. 10,693,872 B1) (hereinafter “Larson”) in view of Genner et al. (U.S. PGPub. No. 2017/0134366 A1) (hereinafter “Genner”).
Regarding Claim 1, Larson teaches:
transmitting, via the communications network, a message to request an upload of a captured image of first content (Larson: [Col 22, lines 11-17], the IVS servers 145 may provide interfaces that allow an applicant/enrollee operating client system 105A to capture various forms of biometric data, enter or record identity information, upload various documents and/or content items, and submit the biometric data (=first content = user document = driving license and/or passport), identity information, and/or uploaded content to the IVS 140 for identity verification or other compliance purposes) and to request an upload of a captured image of second content (Larson: [Col 34, lines 48- 62], (99) FIGS. 5-6 illustrate example instances of a face scan (=second content) GUI in accordance with some embodiments. In FIG. 5, the face scan GUI instance 505 notifies the enrollee that their face is to be scanned. The face scan GUI instance 505 includes instruction text 530 providing instructions on how the enrollee is to perform the face scan. In this example, the instruction text 530 in GUI instance 505 instructs the enrollee to align his/her face in the face outline 535…)
extracting first biometric parameters from the captured image of the first content (Larson: [Col 11, lines 6-13], (31) A second example identity verification service provided by the IVS 140 may include object recognition services, wherein one or more IVS servers 145 are configured to identify a user based on image or video data. The object recognition services may include an enrollment phase and an evaluation phase. During the enrollment phase, an enrollee provides image or video data from which one or more object features are extracted.)
extracting second biometric parameters (Larson: [Col 26, lines 35-37], The facial data may include, for example, feature descriptors of one or more features extracted from the scanned face (=second biometric parameters). [Col 11, lines 21-28], In some embodiments, the one or more of the IVS servers 145 may implement geometric object recognition algorithm(s), wherein features are identified by analyzing the relative position, size, and/or shape of extracted landmarks/features, such as the eyes, nose, cheekbones, jaw, lips, and/or other facial features of a human face (=second biometric parameters); palmar skin patterns (e.g., lines, creases, mounts (or bumps) on the palm of a human hand)) and identifying information from the uploaded captured image of the second content (Larson: [Col 28, lines 34-44], The IVS server(s) 145 use the biographic data to perform several real-time checks 211, 212, and 213 using the biographic data (e.g., driver's license number, Social Security number (SSN), name, address and other identifying data). The check 211 is an identity correlation process that involves discovering and linking disparate biographical information from multiple platforms or institutions that potentially belong to the enrollee. [Col 28, lines 52-59], The check 213 is an identity assessment process where the biographic data is compared with other sources, for example, comparing the provided name, birth date, address(es), and/or SSN against Social Security Administration records, death records, birth certificates, and other publicly available data to determine whether the provided SSN corresponds with the provided name or some other name(s), and the like)
performing a first comparison between the first biometric parameters and the second biometric parameters (Larson: [Col 3, lines 25-29], provides for the biometric data collected during the live interview may also be compared with other collected data such as the validated authentication identity documents (e.g., driver's license photo, passport photo, etc.) and/or prior collected biometric data.)
Larson does not explicitly teach:
obtaining, via a communications network, an identifier of a communications device;
performing a second comparison between the identifying information and account parameters associated with the identifier of the communications device;
and generating based in part on the first comparison and the second comparison, an authentication signal for transmission to a client computing resource.
However, in analogous art, Genner teaches:
obtaining, via a communications network, an identifier of a communications device (Genner: [0065], The profile (=subscribers’ information) may also incorporate identifying features such as photographs or videos 20, addresses, emails, telephone numbers (=identifier of the communications device), or other data 150 from public or other records).
performing a second comparison between the identifying information and account parameters associated with the identifier of the communications device (Genner: [0037], algorithm 190 may compare the Subscriber Information 170, such as his age or date of birth, with his driver license or passport 10…(=identity information). If the birth date (=account parameter) supplied by the subscriber matches the data from these records, the subscriber may receive more trust points for being truthful, which may in turn increase his Trust Score 196. On the other hand, if the Subscriber Information 170 does not match these records, the subscriber may be regarded as being dishonest, which may negatively affect his Trust Score 196. For example, if a subscriber states that his age is different from what is indicated on an official document, such as a birth certificate 30, this raises the presumption that the subscriber is being dishonest. [0065], The profile (=subscribers’ information) may also incorporate identifying features such as photographs or videos 20, addresses, emails, telephone numbers (=identifier of the communication device), or other data 150 from public or other records).
and generating based in part on the first comparison and the second comparison, an authentication signal for transmission to a client computing resource (Genner: [0071] An alert may activate the Trust Score 196 viewer application to cause the Trust Score 196 alert (=authentication signal) to display on the remote subscriber computer to enable connection via the digital badge to the data source over the Internet…[0081], The viewer may review the subscriber's portfolio of information in the form of a Trust Stamp for the purpose of evaluating the risk or advantages involved in transacting with the subscriber on a commercial or personal basis. [0084], facilitate transactions that involve a degree of trust may want to confirm the identity or character of the person with whom they are doing business or otherwise relating. This may include the sale or purchase of goods…)
It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to modify Larson’s method of extracting facial features and features from a driver’s license and comparing them with previously collected data by applying Genner’s method of comparing a subscriber’s information with driver’s license or passport information, in order to verify whether the subscriber is honest or dishonest, fraudulent or legitimate. The motivation is to improve identity verification for online users combined with a methodology for evaluating trustworthiness (Genner: [Abstract]).
Regarding Claim 2, Larson in view of Genner teaches:
The method of claim 1 (see rejection of claim 1 above),
wherein the identifier of the communications device comprises a mobile telephone number (Genner: the examiner interprets “telephone numbers” as comprising mobile telephone numbers. [0065], The profile (=subscribers’ information) may also incorporate identifying features such as photographs or videos 20, addresses, emails, telephone numbers (=identifier of the communications device), or other data 150 from public or other records).
It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to modify Larson’s method of extracting facial features and features from a driver’s license and comparing them with previously collected data by applying Genner’s method of comparing a subscriber’s information with driver’s license or passport information, in order to verify whether the subscriber is honest or dishonest, fraudulent or legitimate. The motivation is to improve identity verification for online users combined with a methodology for evaluating trustworthiness (Genner: [Abstract]).
Regarding Claim 3, Larson in view of Genner teaches:
The method of claim 1 (see rejection of claim 1 above),
wherein the transmitted message comprises a link to a resource that is under the control and/or direction of an identity verifier (Larson: [Col 8, lines 25-36], To provide the identity verification services to the user, the client application 110 (or component 113) may be, or may include, a secure portal to the IVS 140. The secure portal may be a stand-alone application, embedded within a web or mobile application provided by SPP 120, and/or invoked or called by the web/mobile application provided by SPP 120 (e.g., using an API, Remote Procedure Call (RPC), and/or the like). In these cases, graphical objects 115 rendered and displayed within the client application 110 may be a GUI and/or GCEs of the secure portal, which allows the user to share data (e.g., biographic data, biometric data, etc.) with the IVS 140. [Col 10, lines 4-14], These HTTP messages may be sent in response to user interactions with the client application 110 (e.g., when a user submits biographic or biometric data as discussed infra), or the client application 110 may include one or more scripts, which when executed by the client system 105, cause the client system 105 to generate and send the HTTP messages upon loading or rendering the client application 110. Other message types may be used and/or the user and/or client system 105 information may be obtained by other means in other embodiments).
Regarding Claim 4, Larson in view of Genner teaches:
The method of claim 1 (see rejection of claim 1 above),
wherein the first content comprises a user document (Larson: [Col 9, lines 42-55], (27) As discussed previously, the IVS 140 may provide one or more identity verification services for individual users (e.g., a user of client system 105A) and/or users of third-party platforms (e.g., SPP 120). A first example identity verification service provided by the IVS 140 may include a biographic data collection service. This service may involve one or more IVS servers 145 collecting biographic data of a user directly from the client system 105A. For example, the client application 110 may enable the user of client system 105A to scan various identity documents (e.g., driver's license, passport, birth certificate, medical insurance card, etc.) using embedded or accessible sensors (e.g., cameras, etc.), which may then be transmitted to the one or more IVS servers 145).
Regarding Claim 5, Larson in view of Genner teaches:
The method of claim 4 (see rejection of claim 4 above),
wherein the user document comprises a government-issued identification document (Larson: [Col 10, lines 33-39], (30) The first example identity verification service may also involve the one or more IVS servers 145 collecting biographic data of the user from one or more external sources such as, for example, governmental databases (e.g., DMV, police, FBI, electoral records, property records, utility data, etc.), credit bureaus, social media platforms, and/or the like. [Col 28, lines 6-12], (75) At operation 208, the client application 110 performs an identity document scan and validation process. For example, operation 208 may involve the user of client system 105 (the “applicant” or “enrollee”) using an embedded camera to scan a driver's license and/or some other identity document(s) (e.g., government issued ID, passport, student ID, organization/enterprise ID, etc.)).
Regarding Claim 6, Larson in view of Genner teaches:
The method of claim 5 (see rejection of claim 5 above),
wherein the government-issued identification document comprises a driver's license (Larson: [Col 28, lines 6-19], (75) At operation 208, the client application 110 performs an identity document scan and validation process. For example, operation 208 may involve the user of client system 105 (the “applicant” or “enrollee”) using an embedded camera to scan a driver's license and/or some other identity document(s) (e.g., government issued ID, passport, student ID, organization/enterprise ID, etc.). Other devices may be used to scan the applicant's identity document(s), such as peripheral cameras or image capture devices, document scanners, photocopy machines, and/or other like devices. The client application 110 may access and use the camera using suitable drivers, libraries, APIs, and/or the like. The validation process may involve determining whether the correct document was scanned properly. [Col 30, lines 22-33], FIG. 13 shows an ID scan GUI instance 1305 which notifies the enrollee that a specific ID document is to be scanned. The ID scan GUI instance 1305 includes instruction text 1331 indicating best practices for scanning the ID documents, for example, holding the document flat (or placing the document on a flat surface) and capturing the image in a relatively bright environment. In some embodiments, the instruction text 1331 may also provide instructions regarding the types of ID documents that may be scanned (e.g., driver's license, military ID, naturalization card, passport, green card, or H-1B visa)).
Regarding Claim 7, Larson in view of Genner teaches:
The method of claim 5 (see rejection of claim 5 above),
wherein the government-issued identification document comprises a passport (Larson: [Col 30, lines 22-33], FIG. 13 shows an ID scan GUI instance 1305 which notifies the enrollee that a specific ID document is to be scanned. The ID scan GUI instance 1305 includes instruction text 1331 indicating best practices for scanning the ID documents, for example, holding the document flat (or placing the document on a flat surface) and capturing the image in a relatively bright environment. In some embodiments, the instruction text 1331 may also provide instructions regarding the types of ID documents that may be scanned (e.g., driver's license, military ID, naturalization card, passport, green card, or H-1B visa)).
Regarding Claim 8, Larson in view of Genner teaches:
The method of claim 1 (see rejection of claim 1 above),
wherein the second content comprises an image of a portion of the user's person (Larson: [Col 34, lines 48-62], (99) FIGS. 5-6 illustrate example instances of a face scan GUI in accordance with some embodiments. In FIG. 5, the face scan GUI instance 505 notifies the enrollee that their face is to be scanned. The face scan GUI instance 505 includes instruction text 530 providing instructions on how the enrollee is to perform the face scan. In this example, the instruction text 530 in GUI instance 505 instructs the enrollee to align his/her face in the face outline 535. Additionally, before face scanning takes place, the user is shown visual representations 531 of best practices for capturing facial images including, for example, not to wear headwear or glasses (or sunglasses), having a neutral expression, capturing the image in a relatively bright environment, holding the image capture device at (or near) eye level, and/or the like).
Regarding Claim 9, Larson in view of Genner teaches:
The method of claim 8 (see rejection of claim 8 above),
wherein the portion of the user's person comprises at least a portion of the user's face (Larson: [Col 11, lines 21-26], (32) In some embodiments, the one or more of the IVS servers 145 may implement geometric object recognition algorithm(s), wherein features are identified by analyzing the relative position, size, and/or shape of extracted landmarks/features, such as the eyes, nose, cheekbones, jaw, lips, and/or other facial features of a human face. [Col 35, lines 10-32], a GCE may be provided that allows the enrollee to capture the facial image. In this example, the client application 110 (or an IVS server 145) detects that the enrollee is wearing glasses (or sunglasses), which may inhibit facial features from being extracted properly from the captured image. Detecting the glasses (or sunglasses) may cause the face scan GUI instance 515 to be displayed, which includes an interface 540 superimposed or overlaid on top of the GUI instance 515 that notifies the enrollee of the detected glasses (or sunglasses) and asks the enrollee to remove the glasses (or sunglasses) for the face scan. The instruction text in GUI instance 515 also instructs the enrollee to remove the glasses (or sunglasses). The enrollee may perform a tap gesture 520 on a GCE 545 to indicate that the glasses (or sunglasses) have been removed and that the face scan may continue. Additional types of issues that may be auto-detected may include, for example, low light levels (e.g., as compared to a preconfigured threshold light level), wearing headwear/header gear, image capture device not being close enough to face (e.g., as compared to a preconfigured threshold distance), image capture device not being at or near eye level (e.g., as compared to a preconfigured threshold eye level), and/or the like).
Regarding Claim 10, Larson in view of Genner teaches:
The method of claim 1 (see rejection of claim 1 above),
determining that the identifier of the communications device comprises an identifier stored in a database accessible to an identity verifier (Genner: [0028], With reference to FIG. 1, the Data Values (= the subscriber’s information; Genner discloses in [0026] that the Data Values are “referred to herein as the ‘Subscriber Information’”) and the Subscriber Information are first stored 180 on a host computer with a non-transitory computer readable medium).
It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to modify Larson’s method of extracting facial features and features from a driver’s license and comparing them with previously collected data by applying Genner’s method of comparing a subscriber’s information with driver’s license or passport information, in order to verify whether the subscriber is honest or dishonest, fraudulent or legitimate. The motivation is to improve identity verification for online users combined with a methodology for evaluating trustworthiness (Genner: [Abstract]).
Regarding Claim 11, Larson in view of Genner teaches:
The method of claim 10 (see rejection of claim 10 above),
wherein the first biometric parameters correspond to parameters of a subscriber identified by the identifier stored in the database (Larson: [Col 28, lines 52-61], The check 213 is an identity assessment process where the biographic data is compared with other sources, for example, comparing the provided name, birth date, address(es), and/or SSN against Social Security Administration records, death records, birth certificates, and other publicly available data to determine whether the provided SSN corresponds with the provided name or some other name(s), and the like. Some other checks that may be performed include criminal background checks, credit checks, financial fraud checks, and others. [Col 30, lines 2-5], identity matching is performed as discussed previously with respect to operations 208-213, and the results of the match are provided to the interviewer along with all potential matching identities.).
Regarding Claim 12, Larson in view of Genner teaches:
The method of claim 11 (see rejection of claim 11 above),
further comprising determining that the identified subscriber corresponds to a holder of an account with respect to the communications device (Larson: [Col 47, lines 10-14], The GUI instance 3110 includes text area 3130 including text indicating that the user may already have an account, and GCE 3125 that allows the user to proceed to a sign-in GUI when selected).
Regarding Claim 13, Larson in view of Genner teaches:
The method of claim 11 (see rejection of claim 11 above),
wherein generating the authentication signal is based, at least in part, on a trust score computed for the identified subscriber (Genner: [0071] An alert may activate the Trust Score 196 viewer application to cause the Trust Score 196 alert to display on the remote subscriber computer to enable connection via the digital badge to the data source over the Internet when the wireless device is locally connected to the remote subscriber computer and the remote subscriber computer comes online).
It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to modify Larson’s method of extracting facial features and extracting features from the driver's license, and comparing them with previously collected data, by applying Genner’s method of comparing the subscriber’s information with driver's license or passport information, in order to verify whether the subscriber is honest or dishonest, fraudulent or legitimate. The motivation is to improve identity verification for online users combined with a methodology for evaluating trustworthiness (Genner: [Abstract]).
Regarding Claim 14, Larson in view of Genner teaches:
The method of claim 13 (see rejection of claim 13 above),
wherein the trust score is being computed responsive to (Larson: [Col 15, lines 1-11], (42) A sixth example identity verification service provided by the IVS 140 may include identity proofing services wherein the one or more IVS servers 145 calculate identity scores or ratings, confidence scores, trust authenticators, max ID scores, and/or the like for each enrollee/applicant (hereinafter referred to as an “identity score” or the like). The identity scores may be probabilities or scalar values indicating an uncertainty regarding the true identity of an enrollee/applicant. In other words, the identity scores indicate the likelihood that an identity does (or does not) belong to a particular individual. [Col 15, lines 34-37], Once calculated, the identity scores can be compared with a threshold uncertainty value, which may then be used as a basis to reject or accept enrollees' access to different content/services) a detection of a discrepancy between the first biometric parameters (Larson: [Col 15, lines 1-11], (42) A sixth example identity verification service provided by the IVS 140 may include identity proofing services wherein the one or more IVS servers 145 calculate identity scores or ratings, confidence scores, trust authenticators, max ID scores, and/or the like for each enrollee/applicant (hereinafter referred to as an “identity score” or the like). The identity scores may be probabilities or scalar values indicating an uncertainty regarding the true identity of an enrollee/applicant. In other words, the identity scores indicate the likelihood that an identity does (or does not) belong to a particular individual. 
[Col 15, lines 34-37], Once calculated, the identity scores can be compared with a threshold uncertainty value, which may then be used as a basis to reject or accept enrollees' access to different content/services) and the second biometric parameters (Larson: [Col 15, lines 42-49], In embodiments, a user's identity score (=trust score) may be used as a basis to offer specific types or classes of content, services, or promotions offered from different third-party platforms (e.g., SPP 120). In various embodiments, users may submit additional or alternative biographic and/or biometric data to the IVS 140 in order to increase their identity score. Additionally, the identity scores may be compared against other data items to identify or predict fraudulent activity).
Regarding Claim 15, Larson in view of Genner teaches:
The method of claim 13 (see rejection of claim 13 above),
wherein the trust score is computed responsive to at least one of (Larson: [Col 15, lines 1-11], (42) A sixth example identity verification service provided by the IVS 140 may include identity proofing services wherein the one or more IVS servers 145 calculate identity scores or ratings, confidence scores, trust authenticators, max ID scores, and/or the like for each enrollee/applicant (hereinafter referred to as an “identity score” or the like). The identity scores may be probabilities or scalar values indicating an uncertainty regarding the true identity of an enrollee/applicant. In other words, the identity scores indicate the likelihood that an identity does (or does not) belong to a particular individual. [Col 15, lines 34-37], Once calculated, the identity scores can be compared with a threshold uncertainty value, which may then be used as a basis to reject or accept enrollees' access to different content/services):
determining validity of the uploaded captured image of the first content (Larson: [Col 12, lines 5-13], the evaluation phase may include utilizing aging or reverse aging protocols on the query image/video data prior to feature extraction. According to various embodiments, the evaluation phase involves comparing the one or more features extracted during the enrollment phase with features extracted from image/video data captured during a live interview to determine whether the enrollee is the same person as the person performing the live interview (within some margin of error)),
detecting a discrepancy between the first biometric parameters (Larson: [Col 15, lines 6-11], The identity scores may be probabilities or scalar values indicating an uncertainty regarding the true identity of an enrollee/applicant. In other words, the identity scores indicate the likelihood that an identity does (or does not) belong to a particular individual. [Col 15, lines 34-37], Once calculated, the identity scores can be compared with a threshold uncertainty value, which may then be used as a basis to reject or accept enrollees' access to different content/services),
and the parameters of the subscriber identified by the identifier stored in the database (Larson: [Col 13, lines 64-67]-[Col 14, lines 1-15], The liveness detection services may be used to determine if a particular biometric being captured (such as the image/video or voice biometric data discussed previously) is an actual measurement from a living person who is present at the time of capture. For example, the liveness detection service may be used to determine when a user is attempting to use fake or prosthetic hands or fingers, high resolution images/video, face masks, contact lenses, voice recordings, fake physiological data, etc. during the enrollment or evaluation phases discussed previously. In some embodiments, the liveness detection services for object recognition based on image or video data may include, for example, using texture analysis (e.g., analyzing differences between skin surfaces and/or skin elasticity of real and fake faces or hands), motion analysis (e.g., detecting eye blinks; head, lip, or hand movements, etc.), three-dimensional reconstruction, defocusing techniques, and/or other like techniques),
and detecting a risk associated with the subscriber identifier (Larson: [Col 56, lines 23-25], The fraud risk GUI instance 4800 also includes GUI section 4810, which displays data/information that the IVS 140 has flagged as being potentially fraudulent. [Col 56, lines 53-59], The fraud risk GUI instance 4900 also includes GUI section 4910, which displays data/information that the IVS 140 has flagged as being potentially fraudulent. In this example, the GUI section 4910 shows four identity items that have been flagged as being potentially fraudulent. The GUI section 4910 also includes indicator 4914, which indicates the subject enrollee has a “High-risk” of fraud. [Col 58, lines 37-40], The GCEs 5307 include risk indicators labelled with one of “Low risk,” “Medium risk,” and “High risk” roughly indicating a fraud risk/potential).
Regarding Claim 16, Larson teaches:
at least one processor coupled to at least one memory device to (Larson: [Col 61, lines 17-22], The processor circuitry 6402 may include on-chip memory circuitry or cache memory circuitry, which may include any suitable volatile and/or non-volatile memory, such as DRAM, SRAM, EPROM, EEPROM, Flash memory, solid-state memory, and/or any other type of memory device technology, such as those discussed herein. [Col 61, lines 45-47], As examples, the processor circuitry 6402 may include Intel® Core™ based processor(s), MCU-class processor(s), Xeon® processor(s)):
This claim contains limitations identical to those of claim 1 above, albeit directed to a different statutory category (apparatus). For this reason, the same grounds of rejection are applied to claim 16.
Regarding Claim 17, this claim contains limitations identical to those of claim 2 above, albeit directed to a different statutory category (apparatus). For this reason, the same grounds of rejection are applied to claim 17.
Regarding Claim 18, this claim contains limitations identical to those of claim 3 above, albeit directed to a different statutory category (apparatus). For this reason, the same grounds of rejection are applied to claim 18.
Regarding Claim 19, this claim contains limitations identical to those of claim 4 above, albeit directed to a different statutory category (apparatus). For this reason, the same grounds of rejection are applied to claim 19.
Regarding Claim 20, this claim contains limitations identical to those of claims 8 and 9 above, albeit directed to a different statutory category (apparatus). For this reason, the same grounds of rejection are applied to claim 20.
Regarding Claim 21, this claim contains limitations identical to those of claims 11 and 13 above, albeit directed to a different statutory category (apparatus). For this reason, the same grounds of rejection are applied to claim 21.
Regarding Claim 22, Larson teaches:
a non-transitory computer-readable medium comprising instructions encoded thereon which (Larson: [Col 68, lines 55-62], FIG. 65 illustrates an example non-transitory computer-readable storage media (NTCRSM) that may be suitable for use to store instructions (or data that creates the instructions) that cause an apparatus (such as any of the devices/components/systems described with regard to FIGS. 1-9), in response to execution of the instructions by the apparatus, to practice selected aspects of the present disclosure), in response to being executed by a computer processor coupled to at least one memory device (Larson: [Col 62, lines 11-20], In these implementations, the circuitry of processor circuitry 6402 may comprise logic blocks or logic fabric including some other interconnected resources that may be programmed to perform various functions, such as the procedures, methods, functions, etc. of the various embodiments discussed herein. Additionally, the processor circuitry 6402 may include memory cells (e.g., EPROM, EEPROM, flash memory, static memory (e.g., SRAM, anti-fuses, etc.) used to store logic blocks, logic fabric, data, etc., in look-up tables (LUTs) and the like), instruct the computer processor coupled to the at least one memory device to (Larson: [Col 61, lines 17-22], The processor circuitry 6402 may include on-chip memory circuitry or cache memory circuitry, which may include any suitable volatile and/or non-volatile memory, such as DRAM, SRAM, EPROM, EEPROM, Flash memory, solid-state memory, and/or any other type of memory device technology)
This claim contains limitations identical to those of claim 1 above, albeit directed to a different statutory category (non-transitory computer-readable medium). For this reason, the same grounds of rejection are applied to claim 22.
Regarding Claim 23, this claim contains limitations identical to those of claim 2 above, albeit directed to a different statutory category (non-transitory computer-readable medium). For this reason, the same grounds of rejection are applied to claim 23.
Regarding Claim 24, this claim contains limitations identical to those of claim 3 above, albeit directed to a different statutory category (non-transitory computer-readable medium). For this reason, the same grounds of rejection are applied to claim 24.
Regarding Claim 25, this claim contains limitations identical to those of claim 4 above, albeit directed to a different statutory category (non-transitory computer-readable medium). For this reason, the same grounds of rejection are applied to claim 25.
Regarding Claim 26, this claim contains limitations identical to those of claims 8 and 9 above, albeit directed to a different statutory category (non-transitory computer-readable medium). For this reason, the same grounds of rejection are applied to claim 26.
Regarding Claim 27, this claim contains limitations identical to those of claims 11 and 13 above, albeit directed to a different statutory category (non-transitory computer-readable medium). For this reason, the same grounds of rejection are applied to claim 27.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Refer to PTO-892, Notice of References Cited for a listing of analogous art.
U. S. PGPub. No. 2015/0154599 A1 (Lyda et al.): An identity risk score may be determined for subscribers of a service to indicate a level of confidence or certainty associated with a subscriber's identity. The identity risk score may be modified upward or downward in order to reflect changing levels of certainty. The changes may be based on transactions performed on behalf of and/or information submitted by the subscriber. Functionality provided to the subscriber may also be dependent upon whether the subscriber's identity risk score meets a threshold. In one or more arrangements, an identity risk score may be determined based on whether information entered by the subscriber can be confirmed and a level of confidence with which the information is confirmed.
U. S. PGPub. No. 2006/0080263 A1 (Willis et al.): An information monitoring and alert system is provided which registers subscribers and verifiers with a central alert system. The alert system provides an interface for the verifiers to submit queries relating to identification information. Information in this query is compared to the stored data submitted by the subscriber during registration and if a match occurs the subscriber is notified that the identification has been used for a certain purpose. The alert system only stores an encrypted value of the identification with only contact information which is preferably anonymous. Any other information is deleted after registration. The subscriber upon being alerted of the use of the identification is instructed to authorize or reject the transaction pertaining to the query.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RUPALI DHAKAD whose telephone number is (571)270-3743. The examiner can normally be reached M-F 8:30-5:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alexander Lagor, can be reached at 571-270-5143. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/R.D./Examiner, Art Unit 2437
/ALEXANDER LAGOR/Supervisory Patent Examiner, Art Unit 2437