Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This is a reply to the application filed on 7/1/2025, in which claims 1-3 and 5-14 are pending.
Claim 4 is cancelled.
Priority
Acknowledgment is made of applicant's claim for foreign priority under 35 U.S.C. 119(a)-(d). Receipt is acknowledged of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 6/10/2025 and 7/1/2025 have been reviewed. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the examiner has considered the information disclosure statements.
Specification
The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.
Drawings
The drawings filed on 6/10/2025 are accepted by the examiner.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 14 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.
Claim 14 recites “A computer-readable recording medium….” The specification states: “[0153] Furthermore, the user authentication method using input pattern information according to an embodiment may also be implemented as a computer program (or computer program product) including computer-executable instructions. The computer program may include machine-executable instructions processed by a processor and may be implemented in a high-level programming language, an object-oriented programming language, an assembly language, or machine language. The computer program may be recorded on a tangible computer-readable recording medium (e.g., memory, hard disk, magnetic/optical media, or SSD (Solid-State Drive))...”.
The only disclosure corresponding to a computer-readable recording medium is the tangible computer-readable recording medium cited in paragraph [0153]; however, it is unclear whether the claimed medium is limited to that tangible medium or may also encompass a signal. The claimed computer-readable medium is therefore interpreted as encompassing a signal. Transitory forms of signal transmission through a transmission medium, such as radio broadcasts, electrical signals through a wire, and light pulses through a fiber-optic cable, are embodiments that are not directed to statutory subject matter because those transmissions convey only encoded information and are transitory (In re Nuijten, 84 U.S.P.Q.2d 1495). Therefore, the claim recites non-statutory subject matter.
The examiner suggests amending “A computer-readable recording medium…” to “A non-transitory computer-readable recording medium…” to overcome the rejection.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3 and 5-14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Edwards et al. (US 20200193754 A1; hereinafter Edwards).
Regarding claims 1, 13 and 14, Edwards discloses a user authentication method using input pattern information performed on a user device, comprising:
constructing input data by collecting the input pattern information of a subject using the user device (secured access device receiving acceleration data associated with the movement of access card when user performs a gesture for purposes of authentication of user identity, user will possess access card and perform a gesture, which may be a full body gesture. For example, user may hold access card in their hand, and draw a shape, a signature, a code word, or a motion that only user knows and only user can perform reproducibly [Edwards; ¶38-47; Fig. 5-6 and associated texts]);
transforming a data space of the input data by inputting the input data into a data space transformation model (the raw acceleration data is filtered by secured access device, filter the acceleration data derived from user gesture, the raw acceleration data may include additional acceleration data that is not representative of user full body gesture. The acceleration data may be filtered, so that only accelerations resulting from user gesture are collected and processed, for example by using stock linear quadratic estimation, commonly known as Kalman filtering. The filtered acceleration data is normalized in the time domain and frequency domain. The normalized acceleration data creates a data set of individual time based acceleration vectors which can be codified by assembling the acceleration data into normalized Hausdorff space patterns, which are representative of user 140's gesture and ultimately representative of user identity. The filtered acceleration data derived from the full body gesture performed by user while possessing access card constitutes a determined identifying pattern. This determined identifying pattern can function as one factor in a multi factor authentication [Edwards; ¶38-47; Fig. 5-6 and associated texts]); and
outputting a user authentication result of the subject by analyzing the input data in which the data space is transformed using a classification model (comparing the identifying pattern derived from the gesture with an identifying pattern associated with user that is stored in memory in the secured access device and/or in memory and/or database(s) associated with server. Based on a comparison between the stored identifying pattern and the determined identifying pattern, user may be authenticated, compare the determined identifying pattern with the stored identifying pattern. In certain aspects, the comparison must be within a predetermined confidence level, e.g., of 75%, 80%, 85%, 90%, 95%, or 99% to authenticate the identity of user [Edwards; ¶38-47; Fig. 5-6 and associated texts]),
wherein the data space transformation model is a model pre-trained such that user data follows a specific distribution in a latent space (acceleration data, associated with user movement of access card, may be measured by access card and converted into a vector in three-dimensional space. In some embodiments, timing associated with user movement of access card and/or individual gestures may be monitored to determine a uniformity within the movement of each individual identifying pattern [Edwards; ¶38-47; Fig. 5-6 and associated texts]).
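For illustration only (this sketch is not part of Edwards or the claims as filed), the filtering, time-normalization, and confidence-threshold comparison steps cited above could be outlined as follows. All function names, filter constants, and the cosine-similarity measure are assumptions of this sketch, not disclosures of the reference:

```python
import numpy as np

def kalman_filter_1d(samples, q=1e-3, r=1e-1):
    """Smooth raw acceleration samples with a 1-D Kalman filter, an
    illustrative stand-in for the 'linear quadratic estimation'
    filtering Edwards describes."""
    x, p = samples[0], 1.0
    out = []
    for z in samples:
        p += q                      # predict
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with measurement z
        p *= (1 - k)
        out.append(x)
    return np.array(out)

def time_normalize(samples, n=64):
    """Resample a gesture to a fixed number of points so patterns of
    different durations can be compared."""
    t_old = np.linspace(0.0, 1.0, len(samples))
    t_new = np.linspace(0.0, 1.0, n)
    return np.interp(t_new, t_old, samples)

def authenticate(raw, stored_pattern, confidence=0.90):
    """Compare a filtered, time-normalized gesture against a stored
    identifying pattern; accept when similarity meets the threshold."""
    pattern = time_normalize(kalman_filter_1d(np.asarray(raw, float)))
    # cosine similarity as a simple stand-in similarity measure
    sim = np.dot(pattern, stored_pattern) / (
        np.linalg.norm(pattern) * np.linalg.norm(stored_pattern))
    return sim >= confidence
```

A production system would use multi-axis acceleration features and a tuned estimator; the sketch only mirrors the structure of the cited disclosure.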
Regarding claim 2, Edwards discloses the method of claim 1, wherein the input pattern information comprises:
spatial information in which authentication information is input on an input interface (a gesture for purposes of authentication of user identity, user will possess access card and perform a gesture, which may be a full body gesture. For example, user may hold access card in their hand, and draw a shape, a signature, a code word, or a motion [Edwards; ¶38-47; Fig. 5-6 and associated texts]); and
temporal information related to a time in which the authentication information is input on the input interface (the movement of the gesture by user associated with access card will have accelerations per unit of time unique to user [Edwards; ¶38-47; Fig. 5-6 and associated texts]).
Regarding claim 3, Edwards discloses the method of claim 2, wherein the constructing of the input data comprises:
calculating time normalization information using the spatial information and the temporal information (filtered acceleration data is normalized in the time domain and frequency domain. The normalized acceleration data creates a data set of individual time based acceleration vectors which can be codified by assembling the acceleration data into normalized Hausdorff space patterns [Edwards; ¶38-47, 58; Fig. 5-6 and associated texts]); and
constructing the input data using at least one of the spatial information, the temporal information, and the time normalization information (The filtered acceleration data derived from the full body gesture performed by user while possessing access card constitutes a determined identifying pattern [Edwards; ¶38-47, 58; Fig. 5-6 and associated texts]).
Regarding claim 5, Edwards discloses the method of claim 1, wherein the data space transformation model is configured as a normalizing flow trained such that the user data forms a normal distribution in the latent space, and in the transforming of the data space, the input data is projected to a specific location in the latent space by the trained normalizing flow of the data space transformation model (filtered acceleration data is normalized in the time domain and frequency domain. The normalized acceleration data creates a data set of individual time-based acceleration vectors which can be codified by assembling the acceleration data into normalized Hausdorff space patterns, secured access device may execute machine learning software when determining the identifying pattern. In some embodiments, technology used for this purpose may include Neural Networks, Hidden Markov Models, and Support Vector Machines [Edwards; ¶38-47, 58; Fig. 5-6 and associated texts]).
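As a hedged illustration of the "normalizing flow" limitation of claim 5 (Edwards discloses normalization and machine learning generally, not this specific model), a minimal one-dimensional affine flow that projects user data toward a standard-normal latent space might look like the following; the class name and closed-form fit are assumptions of this sketch:

```python
import numpy as np

class AffineFlow:
    """Minimal 1-D normalizing flow: an invertible affine map
    z = (x - mu) / sigma fitted so user data follows a standard normal
    in the latent space. A practical flow (e.g., coupling-layer
    architectures) is far richer; this sketch only illustrates the
    latent-projection step."""
    def fit(self, user_data):
        # maximum-likelihood fit of the affine parameters
        self.mu = float(np.mean(user_data))
        self.sigma = float(np.std(user_data))
        return self

    def to_latent(self, x):
        # forward pass: project input data to its latent-space location
        return (np.asarray(x, float) - self.mu) / self.sigma

    def log_prob(self, x):
        # change of variables: log N(z; 0, 1) + log|dz/dx|
        z = self.to_latent(x)
        return -0.5 * (z**2 + np.log(2 * np.pi)) - np.log(self.sigma)
```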
Regarding claim 6, Edwards discloses the method of claim 1, wherein the data space transformation model is configured as an autoencoder trained to project features extracted from the user data into the latent space and to reconstruct data identical or similar to the user data based on the projected features, and in the transforming of the data space, features are extracted from the trained autoencoder to which the input data is applied, and the features are projected to a specific location in the latent space (constructing patterns within a particular space, or signature within the pad area, the maximum and minimum accelerations, as well as the time length of each gesture can be recorded. Additionally, accelerations due to the rotation of access card may be collected. For example, accelerations measured from the movement of access card may vary depending on the location of accelerometer(s). The location of accelerometer(s) may be accounted for during the filtering process. For example, user may be instructed to hold their card in a variety of ways (e.g., upside down, backwards, etc.), while performing the initial gesture, to inform secured access device of how the gesture would be performed with different variations of the placement of the accelerometers. In some embodiments, the location of the one or more accelerometers on access card may be considered when configuring the identifying pattern. The secured access device may execute machine learning software when determining the identifying pattern. In some embodiments, technology used for this purpose may include Neural Networks, Hidden Markov Models, and Support Vector Machines [Edwards; ¶38-47, 57-58; Fig. 5-6 and associated texts]).
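The autoencoder limitation of claim 6 can likewise be sketched for illustration; the linear (PCA-equivalent) model below is an assumption chosen for brevity, not the applicant's or Edwards' implementation:

```python
import numpy as np

class LinearAutoencoder:
    """Toy linear autoencoder (equivalent to PCA) illustrating the
    claimed idea: project features into a latent space and reconstruct
    data similar to the input. A practical autoencoder would use
    trained nonlinear encoder/decoder networks."""
    def __init__(self, latent_dim=2):
        self.latent_dim = latent_dim

    def fit(self, user_data):
        self.mean = user_data.mean(axis=0)
        # top principal directions serve as the encoder weights
        _, _, vt = np.linalg.svd(user_data - self.mean,
                                 full_matrices=False)
        self.w = vt[:self.latent_dim]
        return self

    def encode(self, x):
        # project input features to a location in the latent space
        return (x - self.mean) @ self.w.T

    def decode(self, z):
        # reconstruct data identical or similar to the input
        return z @ self.w + self.mean
```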
Regarding claim 7, Edwards discloses the method of claim 1, wherein the classification model is a machine learning model-based classification model trained to determine whether the subject corresponds to a pre-registered user based on the input data with the transformed data space (The secured access device may execute machine learning software when determining the identifying pattern. In some embodiments, technology used for this purpose may include Neural Networks, Hidden Markov Models, and Support Vector Machines [Edwards; ¶38-47, 57-58; Fig. 5-6 and associated texts]).
Regarding claim 8, Edwards discloses the method of claim 1, further comprising:
storing the input data input to the data space transformation model and output data output from the data space transformation model corresponding to the input data as first training data; and storing the transformed input data input to the classification model and output data output from the classification model corresponding to the transformed input data as second training data (the filtered acceleration data and determined identifying pattern may be stored in memory, such that each successive determined identifying pattern may be used as continued input in training the machine learning model for variations. In other embodiments, the filtered acceleration data and identifying pattern may only be added as inputs to the machine learning model when exceeding a certain threshold of confidence, secured access device may execute machine learning software when determining the identifying pattern. In some embodiments, technology used for this purpose may include Neural Networks, Hidden Markov Models, and Support Vector Machines [Edwards; ¶38-47, 57-58; Fig. 5-6 and associated texts]).
Regarding claim 9, Edwards discloses the method of claim 8, further comprising:
determining whether an authentication model including the data space transformation model and the classification model is in a trainable state; and training the data space transformation model with the first training data and training the classification model with the second training data if the authentication model is in the trainable state (determined identifying pattern may be stored in memory, such that each successive determined identifying pattern may be used as continued input in training the machine learning model for variations. In other embodiments, the filtered acceleration data and identifying pattern may only be added as inputs to the machine learning model when exceeding a certain threshold of confidence. In this manner, the secured access device can become more accurate over time in its determination of whether user is truly the entity performing the gesture [Edwards; ¶38-47, 57-58; Fig. 5-6 and associated texts]).
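The confidence-gated accumulation of training data, training once the model is in a trainable state, and deletion of used training data described in the cited passages could be sketched as follows; the buffer class, threshold value, and minimum-sample count are assumptions of this illustration:

```python
import numpy as np

class TrainingBuffer:
    """Sketch of the confidence-gated accumulation Edwards describes:
    a determined pattern is added as training input only when its match
    confidence exceeds a threshold, and buffered data is handed to the
    model (then deleted) once training can run."""
    def __init__(self, threshold=0.90):
        self.threshold = threshold
        self.samples = []

    def maybe_add(self, pattern, confidence):
        # gate: only high-confidence authentications become training data
        if confidence >= self.threshold:
            self.samples.append(pattern)
            return True
        return False

    def train_if_ready(self, train_fn, min_samples=5):
        # train when enough data is buffered, then delete the used data
        if len(self.samples) < min_samples:
            return False
        train_fn(np.array(self.samples))
        self.samples.clear()
        return True
```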
Regarding claim 10, Edwards discloses the method of claim 9, further comprising:
deleting the first training data on which training of the data space transformation model has been completed and the second training data on which training of the classification model has been completed (access card may include a data storage component disposed in the card. As used herein, a “data storage component” may be or include one or more devices configured to receive, store, process, provide, transfer, send, delete, and/or generate data or other information [Edwards; ¶30, 38-47, 57-58; Fig. 5-6 and associated texts]).
Regarding claim 11, Edwards discloses the method of claim 1, wherein the user authentication method using the input pattern information is performed on the user device that stores and uses the data space transformation model and the classification model in memory or storage (comparing the identifying pattern derived from the gesture with an identifying pattern associated with user that is stored in memory in the secured access device and/or in memory and/or database(s) associated with server [Edwards; ¶38-47, 57-58; Fig. 5-6 and associated texts]).
Regarding claim 12, Edwards discloses the method of claim 1, further comprising: providing a financial service environment provided by a financial server to the subject when user authentication of the subject is completed (The entity may be a financial service provider, which may be a bank, credit union, credit card issuer, or other type of financial service entity that generates, provides, manages, and/or maintains financial service accounts for one or more customers. The entity may be any provider of goods and/or services. For example, the entity may be a hospital, university, business, gated community, apartment complex, self-storage, and/or school, among other entities. The entity associated with secured access device may issue access card(s) to one or more user(s) [Edwards; ¶21; Fig. 5-6 and associated texts]).
Internet Communications
Applicant is encouraged to submit a written authorization for Internet communications (PTO/SB/439, http://www.uspto.gov/sites/default/files/documents/sb0439.pdf) in the instant patent application to authorize the examiner to communicate with the applicant via email. The authorization will allow the examiner to better practice compact prosecution. The written authorization can be submitted via one of the following methods only: (1) Central Fax, which can be found in the Conclusion section of this Office action; (2) regular postal mail; (3) EFS-Web; or (4) the service window on the Alexandria campus. EFS-Web is the recommended way to submit the form since this allows the form to be entered into the file wrapper within the same day (system dependent). Written authorization submitted via other methods, such as direct fax to the examiner or email, will not be accepted. See MPEP § 502.03.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAO Q HO whose telephone number is (571)270-5998. The examiner can normally be reached from 7:00am to 5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jeffrey Nickerson, can be reached at (469) 295-9235. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DAO Q HO/Primary Examiner, Art Unit 2432