DETAILED ACTION
This Office action is in response to amendments and remarks filed by Applicant on 11/26/2025.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Applicant presents amendments to claims 1–10 and 12–13, and cancels claims 11 and 14. All amendments have been fully considered.
Applicant’s amendments are sufficient to overcome the previous rejection under 35 U.S.C. 112(b) for being indefinite. The rejection is hereby withdrawn.
Applicant’s amendments address the claim objections identified in the previous Office Action. The cancellation of claim 14 renders the objection to that claim moot.
Applicant’s amendments are not significant enough to distinguish the present invention from the conflicting application identified in the double patenting rejection. The Examiner maintains this rejection.
Applicant’s amendments to independent claim 10 are sufficient to overcome the previous anticipation rejection under 35 U.S.C. 102. While that rejection is withdrawn, the added subject matter places claim 10 in parallel with rejected claim 1, and a new rejection based upon the combination of references relied upon for the rejection of claim 1 is applied to claim 10 under 35 U.S.C. 103.
Response to Arguments
Applicant presents arguments with respect to independent claims 1 and 10. All arguments have been fully considered.
Applicant argues that the previously cited combination of references fails to teach “the specific requirement of having a combined biometric and user device metadata record as the explicit unit of reference data used for comparison” and that the “retrieved second set of data packets pertain to both metadata types that are used in the core comparison step”. The Examiner responds: nowhere in the language of the claims is there a requirement of having a combined biometric and user device metadata record as the explicit unit of reference data used for comparison. The claims require establishing a communicative coupling between a computing device and a centralized server (See Larson Figure 1 and 4:36–46), receiving data packets comprising biometric data collected by sensors (See Larson 26:24–42), extracting this biometric data (See Larson 11:6–45), retrieving another set of packets from the centralized server (See Larson 26:43–67), comparing the biometric data with the retrieved centralized server packets (See Larson 26:43–67), identifying an identity based upon a positive match (See Larson 10:33–11:5 and 26:43–67), and, if there is no match, registering a new identity (See Larson 36:53–62). All of these limitations are found in the primary reference. What is not found in the primary reference is that the data retrieved from the centralized server pertains to biometric features and device metadata of a pre-existing user profile on the centralized server. The Examiner asserts that these additional limitations are widely known in the art and that one of ordinary skill in the art would understand that a wide variety of identity data (including biometric and user-associated device data) is stored and used to verify users. The secondary reference, Zhou, is provided as evidence of the widely known concept of associating biometric and device data with a user profile for the purpose of comparing valid information in the determination of user authentication (See Zhou ¶ 61).
Applicant’s assertion that the claim language contains requirements, such as “having a combined biometric and user device metadata record as the explicit unit of reference data used for comparison” and that the “retrieved second set of data packets pertain to both metadata types that are used in the core comparison step”, is not borne out by the claims. The Examiner has provided above a succinct interpretation of each claim limitation and how each reference maps to each limitation. The Examiner notes that Applicant has ample freedom to choose the language used to articulate the invention. If Applicant intends for terms to specify a particular relationship or sequence with respect to a specific other element of the invention, Applicant should articulate that relationship in the claims. As it stands, the Examiner finds no deficiency in the asserted references.
Applicant also takes issue with the motivation to combine the two references. Both references deal precisely with user authentication using defined characteristics to verify identities in an electronic system, and are thus clearly in the same field of endeavor. As noted in the previous Office Action, the motivation to combine the Zhou reference with the Larson reference is to ensure the requested identity verification is consistent with a known payment device. Applicant’s arguments are unpersuasive.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory obviousness-type double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the conflicting application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement.
Effective January 1, 1994, a registered attorney or agent of record may sign a terminal disclaimer. A terminal disclaimer signed by the assignee must fully comply with 37 CFR 3.73(b).
Claims 1–10 and 12–13 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1–14 of copending Application No. 18/058,790. Although the claims at issue are not identical, they are not patentably distinct from each other because they claim the same subject matter.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1–10 and 12–13 are rejected under 35 U.S.C. 103 as being unpatentable over Larson (US 10,693,872 B1, issued Jun. 23, 2020) in view of Zhou (US 2020/0134633 A1, published Apr. 30, 2020).
Regarding claim 1, Larson discloses: a fool-proof registration system, said system comprising a centralized server coupled with a processor over a network, said processor being operatively configured with a memory storing instructions that on execution enable the processor to: establish a communicative coupling between a computing device of an entity and the centralized server (Larson Figure 1 and 4:36–46.); receive a first set of data packets associated with the entity, said first set of data packets being measured by one or more sensors operatively coupled with the processor (a primary biometric is captured by the client application and the facial data is securely sent to the IVS for processing. Larson 26:24–42.); extract from the received first set of data packets one or more biometric features of the entity (extracting feature data from received captured data. Larson 11:6–45.); retrieve from the centralized server a second set of data packets responsive to receipt of the first set of data packets (IVS obtains primary biometric data from storage databases. Larson 26:43–67.); compare the extracted biometric features of the entity with the retrieved second set of data packets (IVS performs a primary biometric match. Larson 26:43–67.); identify an identity of a user upon positive match of the extracted one or more biometric features of the entity and the second set of data packets (results of the primary biometric matching result in associations with the applicant’s identity. Larson 10:33–11:5 and 26:43–67.); and register, the entity as a new user upon the centralized server upon a negative match of the extracted one or more biometric features of the entity and the second set of data packets (IVS determines that the user needs to be enrolled if the biometric data does not match existing biometric data in storage. Larson Figure 9 and 36:53–62.).
Larson does not disclose: wherein said second set of data packets pertain to biometric feature metadata and a user device metadata of a user device associated with a pre-existing user profile, said user-profile being stored on the centralized database.
However, Zhou does disclose: wherein said second set of data packets pertain to biometric feature metadata and a user device metadata of a user device associated with a pre-existing user profile, said user-profile being stored on the centralized database (first biometric feature information is compared with the second biometric feature information. Zhou ¶ 61. The server stores the second biometric feature information, which is user biometric feature information mapped to a device identifier. Zhou ¶ 61.).
Therefore, it would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the biometric authentication system of Larson to associate user biometric identity data with a user device identity and store the association on a server, based upon the teachings of Zhou. The motivation is to ensure the requested identity verification is consistent with a known payment device. Zhou ¶ 59.
Regarding claim 2, Larson in view of Zhou discloses the limitations of claim 1, wherein on a first communicative coupling of the user device with the centralized server, the user profile corresponding to the user is generated by the centralized server based on user biometric feature metadata and the user device metadata based on one or more parameters input by the user (creating a new identity session based upon receipt of primary biometric captured by the client application. Larson 26:24–67.).
Regarding claim 3, Larson in view of Zhou discloses the limitations of claim 1, wherein the user biometric features are recorded and stored onto the centralized server for comparing with the received first set of data packets and identification of the entity with the corresponding pre-stored user (identity information is updated and improved when new biographic and biometric identity information is provided to the system. Larson 9:7–41.).
Regarding claim 4, Larson in view of Zhou discloses the limitations of claim 1, wherein the one or more sensors that generate the first set of data packets are any or a combination of an image capturing sensor, a biometric capturing sensor, a fingerprint sensor, an iris recognition sensor, a speech recognition sensor, a gesture recognition sensor, a scanner (client systems include one or more sensors including image capture devices and microphones. Larson 4:54–59.).
Regarding claim 5, Larson in view of Zhou discloses the limitations of claim 1, wherein the first set of data packets is a set of images including multiple view-profiles such as left, right, front and back of the entity (face scan including capturing images or video of the enrollee’s face and related cameras to capture liveness gestures. Larson 26:24–42.).
Regarding claim 6, Larson in view of Zhou discloses the limitations of claim 1, wherein extraction of the biometric features of the entity further includes any or a combination of face detection, mask/face accessory detection, head pose estimation, roll angle, gaze detection (facial recognition-based identity verification services. Larson 6:27–55.).
Regarding claim 7, Larson in view of Zhou discloses the limitations of claim 1, wherein the first set of data packets is a video-stream captured by a camera (face scan including capturing images or video of the enrollee’s face and related cameras to capture liveness gestures. Larson 26:24–42.).
Regarding claim 8, Larson in view of Zhou discloses the limitations of claim 1, wherein a plurality of cameras for capturing images are onboarded and communicatively coupled with a network file sharing server, said network file sharing server storing camera orientation information to enable capturing multiple view-profiles of the entity selected from any or a combination of left, right, front and back of the entity (face scan including capturing images or video of the enrollee’s face and related cameras to capture liveness gestures. Larson 26:24–42. Geometric object recognition for analyzing the relative position, size and shape of features of the human face. Larson 11:21–45.).
Regarding claim 9, Larson in view of Zhou discloses the limitations of claim 1, wherein the processor is further configured to: identify the entity as the user based on matching computing device metadata with the pre-stored user device metadata (first biometric feature information is compared with the second biometric feature information. Zhou ¶ 61. The server stores the second biometric feature information, which is user biometric feature information mapped to a device identifier. Zhou ¶ 61. Ensuring the requested identity verification is consistent with a known payment device. Zhou ¶ 59.).
Regarding claim 10, Larson discloses: a method for fool-proof registration and identification comprising: receiving by a data acquisition engine, a first set of data packets associated with an entity, said first set of data packets being measured by one or more sensors operatively coupled with the processor (a primary biometric is captured by the client application and the facial data is securely sent to the IVS for processing. Larson 26:24–42.); extracting by a feature extraction engine, from the received first set of data packets one or more biometric features of the entity (extracting feature data from received captured data. Larson 11:6–45.); retrieving from the centralized server a second set of data packets responsive to receipt of the first set of data packets (IVS obtains primary biometric data from storage databases. Larson 26:43–67.); comparing the extracted biometric features of the entity with the retrieved second set of data packets (IVS performs a primary biometric match. Larson 26:43–67.); identifying upon positive match of the extracted one or more biometric features of the entity and the second set of data packets an identity of a user (results of the primary biometric matching result in associations with the applicant’s identity. Larson 10:33–11:5 and 26:43–67.).
Larson does not disclose: wherein said second set of data packets pertain to biometric feature metadata and a user device metadata of a user device associated with a pre-existing user profile, said user-profile being stored on the centralized database.
However, Zhou does disclose: wherein said second set of data packets pertain to biometric feature metadata and a user device metadata of a user device associated with a pre-existing user profile, said user-profile being stored on the centralized database (first biometric feature information is compared with the second biometric feature information. Zhou ¶ 61. The server stores the second biometric feature information, which is user biometric feature information mapped to a device identifier. Zhou ¶ 61.).
Therefore, it would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the biometric authentication system of Larson to associate user biometric identity data with a user device identity and store the association on a server, based upon the teachings of Zhou. The motivation is to ensure the requested identity verification is consistent with a known payment device. Zhou ¶ 59.
Regarding claim 12, Larson in view of Zhou discloses the limitations of claim 10, further comprising: conducting a duplication check based on the extracted one or more biometric features of the entity and the second set of data packets (the IVS performs a primary biometric match that may be a one-to-many comparison with other identity data in the database. Larson 26:43–67.).
Regarding claim 13, Larson in view of Zhou discloses the limitations of claim 10, wherein receiving the first set of data packets includes: inputting by an image capturing device, a plurality of images of the entity (face scan including capturing images or video of the enrollee’s face and related cameras to capture liveness gestures. Larson 26:24–42.); detecting by a machine learning engine, a face of the entity (implementation of the identity verification can use elements tailored for machine learning functionality. Larson 62:21–51.); checking by the ML engine, a dimension of the captured image with reference to a pre-defined threshold image size (geometric object recognition for analyzing the relative position, size and shape of features of the human face. Larson 11:21–45.); detecting a mask upon the face within the captured image of the entity (face scan including capturing images or video of the enrollee’s face and related cameras to capture liveness gestures. Larson 26:24–42.); and estimating a head pose of the entity based on the captured images (geometric object recognition for analyzing the relative position of features of the human face. Larson 11:21–45.).
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to VANCE LITTLE whose telephone number is (571) 270-0408. The examiner can normally be reached Monday - Friday 9:30am - 5:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jung (Jay) Kim can be reached at (571) 272-3804. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/VANCE M LITTLE/Primary Examiner, Art Unit 2494