Prosecution Insights
Last updated: April 19, 2026
Application No. 18/915,103

METHOD AND APPARATUS FOR LIVENESS DETECTION, ELECTRONIC DEVICE, AND STORAGE MEDIUM

Non-Final OA (§102, §103)
Filed: Oct 14, 2024
Examiner: MEHEDI, MORSHED
Art Unit: 2408
Tech Center: 2400 — Computer Networks
Assignee: Mashang Consumer Finance Co. Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 86% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 9m
With Interview: 85%

Examiner Intelligence

Career Allow Rate: 86% — above average (724 granted / 844 resolved; +27.8% vs TC avg)
Interview Lift: -0.4% (minimal), based on resolved cases with interview
Avg Prosecution: 2y 9m typical timeline; 16 applications currently pending
Career History: 860 total applications across all art units
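As a sanity check on the headline numbers, the 86% and 85% figures follow directly from the stated counts. A minimal sketch (assuming the -0.4% interview lift shown in the projections section; everything else is arithmetic on the counts above):

```python
# Recompute the headline rates from the counts shown on this page.
granted, resolved = 724, 844

allow_rate = granted / resolved      # career allow rate (0.8578...)
interview_lift = -0.004              # -0.4% lift, from the projections section

print(f"Career allow rate: {allow_rate:.0%}")                  # 86%
print(f"With interview:    {allow_rate + interview_lift:.0%}") # 85%
```

The dashboard evidently rounds to whole percentage points, which is why the small negative lift moves the displayed figure from 86% to 85%.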

Statute-Specific Performance

§101: 17.6% (-22.4% vs TC avg)
§103: 45.2% (+5.2% vs TC avg)
§102: 11.7% (-28.3% vs TC avg)
§112: 12.7% (-27.3% vs TC avg)
Tech Center average is an estimate • Based on career data from 844 resolved cases
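Since each "vs TC avg" delta is simply the statute-specific rate minus the Tech Center baseline, the baseline can be recovered from the figures above. A minimal sketch (the page itself calls the baseline an estimate):

```python
# Recover the Tech Center average implied by each statute's
# allowance rate and its "vs TC avg" delta (both in percent).
statute_stats = {
    "101": (17.6, -22.4),
    "103": (45.2, +5.2),
    "102": (11.7, -28.3),
    "112": (12.7, -27.3),
}

implied_tc_avg = {
    statute: round(rate - delta, 1)
    for statute, (rate, delta) in statute_stats.items()
}
print(implied_tc_avg)  # every statute implies the same 40.0% baseline
```

Notably, all four statutes imply an identical 40.0% baseline, suggesting the dashboard compares against a single Tech Center-wide figure rather than per-statute averages.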

Office Action

Grounds: §102, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

DETAILED ACTION

Claims 1-20 are presented for examination.

Information Disclosure Statement

No information disclosure statement (IDS) has been submitted.

Drawings

The drawings filed on 10/14/2024 are accepted by the examiner.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

1. Claims 1, 4, 8, 11, 15, and 18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Rodriguez et al. (US Publication No. 2017/0046583, hereinafter "Rodriguez").

Regarding claim 1, Rodriguez discloses a method for liveness detection, comprising: acquiring data of a user; establishing a binding relationship between the user and a model, wherein orientation of the model is determined based on the data after the binding relationship is established (Rodriguez, (para. [0097]), access to the remote system 130 using the user device 104 may be conditional on the user 102 successfully completing a validation process; (para. [0097-0098]), the remote system 130 may for example comprise a secure data store 132, which holds (say) the user's personal data.
In order to keep the user's data secure, the back-end software 124 makes retrieval of the user's personal data from the database 132 using the user device 104 conditional on successful validation of the user 102); controlling movement of a control according to a variation of the orientation of the model, and acquiring first information of the control according to a preset time interval during the movement (Rodriguez, (para. [0013]), the controller is configured to control an output device to provide randomized outputs to an entity over an interval of time. The video input is configured to receive a moving image of the entity captured by a camera over the interval of time. The feature recognition module is configured to process the moving image to detect at least one human feature of the entity); and performing liveness detection according to the first information, to obtain a detection result of the user (Rodriguez, (para. [0013]), the liveness detection module is configured to compare with the randomized outputs a behaviour exhibited by the detected human feature over the interval of time to determine whether the behaviour is an expected reaction to the randomized outputs, thereby determining whether the entity is a living being).

Regarding claim 4, Rodriguez further discloses the method according to claim 1, wherein before the controlling the movement of the control according to the variation of the orientation of the model, and the acquiring the first information of the control according to the preset time interval during the movement, the method further comprises: displaying second information in a display area of a user terminal, wherein the second information is used to guide the user to control the movement of the control into a first area (Rodriguez, (para. [0013]), the controller is configured to control an output device to provide randomized outputs to an entity over an interval of time. The video input is configured to receive a moving image of the entity captured by a camera over the interval of time. The feature recognition module is configured to process the moving image to detect at least one human feature of the entity); the performing the liveness detection according to the first information, to obtain the detection result of the user comprises: under a circumstance that lastly acquired first information of the control is located in the first area, performing the liveness detection according to the first information, to obtain the detection result of the user (Rodriguez, (para. [0013]), the liveness detection module is configured to compare with the randomized outputs a behaviour exhibited by the detected human feature over the interval of time to determine whether the behaviour is an expected reaction to the randomized outputs, thereby determining whether the entity is a living being).

Regarding claims 8 and 15, the substance of the claimed invention is similar to that of claim 1. Accordingly, these claims are rejected under the same rationale.

Regarding claims 11 and 18, the substance of the claimed invention is similar to that of claim 4. Accordingly, these claims are rejected under the same rationale.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

2. Claims 2-3, 5-7, 9-10, 12-14, 16-17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Rodriguez et al. (US Publication No. 2017/0046583, hereinafter "Rodriguez") in view of Komogortsev et al. (US Pub No. 2013/0336547, hereinafter "Komogortsev").

Regarding claim 2, Rodriguez discloses the method according to claim 1, wherein the controlling the movement of the control according to the variation of the orientation of the model, and the acquiring the first information of the control according to the preset time interval during the movement (Rodriguez, (para. [0013]), the controller is configured to control an output device to provide randomized outputs to an entity over an interval of time. The video input is configured to receive a moving image of the entity captured by a camera over the interval of time. The feature recognition module is configured to process the moving image to detect at least one human feature of the entity). Rodriguez does not explicitly disclose, but the analogous art Komogortsev does disclose, performing Euclidean distance transform processing according to the variation of the orientation of the model, to obtain an offset of the control; and controlling the movement of the control according to the offset of the control (Komogortsev, (para. [0082]), scanpath length is indicative of the efficiency of visual search, and may be considered as a candidate biometric feature under the assumption that visual search is dependent on the subject's familiarity with similar patterns/content. Scanpath length may be measured as the sum of absolute distances between the vectorial centroids of fixation points, where the vectorial centroid was defined as the Euclidean norm of the horizontal and vertical centroid positions). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Rodriguez by including the Euclidean distance transform processing taught by Komogortsev, for the advantage of preventing a spoofing attack in which an accurate mechanical replica of the human eye is presented to the sensor (Komogortsev, (para. [0013])).
Regarding claim 3, the combination of Rodriguez-Komogortsev discloses the method according to claim 1, wherein the performing the liveness detection according to the first information, to obtain the detection result of the user comprises: performing trajectory judgment, speed judgment and acceleration judgment according to the first information, to obtain corresponding judgment results; generating, according to the judgment results, a score of the user; and determining the detection result of the user according to the score and a preset threshold (Komogortsev, (para. [0111]), eye movement classification module 302 classifies the eye position signal 304 into fixations and saccades. A sequence of classified saccades' trajectories is sent to the oculomotor plant mathematical model (OPMM) 306; (para. [0112, 0125, 0054, 0204]), OPMM 306 may generate simulated saccades' trajectories based on the default OPC values that are grouped into a vector with the purpose of matching the simulated trajectories with the recorded ones. Each individual saccade may be matched independently of any other saccade. Both classified and simulated trajectories for each saccade may be sent to error function module 308. Error function module 308 may compute error between the trajectories. The error result may trigger the OPC estimation module 310 to optimize the values inside of the OPC vector, minimizing the error between each pair of recorded and simulated saccades).

Regarding claim 5, the combination of Rodriguez-Komogortsev discloses the method according to claim 4, wherein the display area of the user terminal is pre-divided into a first partition and a second partition, the first partition comprises a plurality of second areas, the second partition comprises a plurality of second areas, and a preset spacing distance exists between the first partition and the second partition; an initial position of the control is located in one of the first partition and the second partition, and the first area is located in the other one of the first partition and the second partition (Komogortsev, (para. [0101, 0174-0175]), scanpath_fix characteristics are compared via pairwise distances between the centroids representing positions of fixations at 248. In comparing two scanpaths, the Euclidean pairwise distance may be calculated between the centroid positions of fixations. Following this, a tally may be made of the total number of fixation points in each set that could be matched to within 1 degree of at least one point in the opposing set. The similarity of scanpaths may be assessed by the proportion of tallied fixation points to the total number of fixation points to produce a similarity score similar to those generated for the various eye movement metrics).

Regarding claim 6, the combination of Rodriguez-Komogortsev discloses the method according to claim 1, wherein the data comprises a first position in a display area corresponding to a first point of the user and a second position in the display area corresponding to a second point of the user; the establishing the binding relationship between the user and the model comprises: establishing the binding relationship between the user and the model under a circumstance that a face of the user is determined to be of bilateral symmetry according to the data, and the first position and the second position are located in a preset area of the display area (Komogortsev, (para. [0053, 0060, 0082, 0114]), eye positional signal information is acquired. Raw eye movement data produced during a recording is supplied to an eye movement classification module at 212. In some embodiments, an eye-tracker sends the recorded eye gaze trace to an eye movement classification algorithm at 212 after visual information employed for the authentication is presented to a user. An eye movement classification algorithm may extract fixations and saccades from the signal. The extracted saccades' trajectories may be supplied to the mathematical model of the oculomotor plant 214 for the purpose of simulating the exact same trajectories).

Regarding claim 7, the combination of Rodriguez-Komogortsev discloses the method according to claim 5, wherein the data comprises a third position in the display area of the user terminal corresponding to a third point; the establishing the binding relationship between the user and the model comprises: establishing the binding relationship between the user and the model under a circumstance that the third position and the initial position of the control are located in a same second area (Komogortsev, (para. [0169]), CEM (including COB) and OPC eye movement metrics are estimated. CEM-related metrics may include fixation count, average fixation duration, average vectorial average vertical saccade amplitude, average vectorial saccade velocity, average vectorial saccade peak velocity, and velocity waveform (Q); COB-related metrics include undershoot/overshoot, corrected undershoot/overshoot, multi-corrected undershoot/overshoot, dynamic, compound, and express saccades, scanpath length, scanpath area, regions of interest, inflection count, and slope coefficients of the amplitude-duration and main sequence relationships; (para. [0101]), where the similarity of scanpaths may be assessed by the proportion of tallied fixation points to the total number of fixation points to produce a similarity score similar to those generated for the various eye movement metrics).

Regarding claims 9 and 16, the substance of the claimed invention is similar to that of claim 2; regarding claims 10 and 17, similar to that of claim 3; regarding claims 12 and 19, similar to that of claim 5; regarding claims 13 and 20, similar to that of claim 6; and regarding claim 14, similar to that of claim 7. Accordingly, these claims are rejected under the same respective rationales.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US Publication No. 2021/0312027: "the liveness detection threshold may include a liveness recognition intensity threshold.
The comparing the second recognition information with a liveness detection threshold may include comparing an average of the second recognition information with the liveness recognition intensity threshold; or comparing a maximum value of the second recognition information, a minimum value of the second recognition information, or a difference between the maximum value and the minimum value of the second recognition information with the liveness recognition intensity threshold."

US Publication No. 2023/0230085: "the system ensures the customer presence using facial biometric authentication. The system combines facial authentication with biometric profiling, liveness detection, and anti-spoofing to combat credential theft, and to also provide a continuous security layer to defend from session compromise or takeovers in a continuous facial recognition authentication. Optionally, a decentralized biometric platform ensures that the biometric profile itself cannot be stolen or recombined with other personal data or other biometric data, and avoids any centralized or 'honeypot' biometric database being breached."

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MORSHED MEHEDI, whose telephone number is (571) 270-7640. The examiner can normally be reached M-F, 8:00 am to 4:00 pm EST. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Linglan Edwards, can be reached at (571) 270-5440. The fax number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/MORSHED MEHEDI/
Primary Examiner, Art Unit 2408
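For context on the Komogortsev mapping cited against claims 2 and 5 above, the scanpath-length metric quoted from its paragraph [0082] (the sum of Euclidean distances between successive fixation centroids) can be sketched in a few lines. This is an illustrative reading of the quoted passage, not the claimed method of the application:

```python
import math

def scanpath_length(fixation_centroids):
    """Sum of Euclidean distances between successive fixation centroids,
    per the scanpath-length description quoted from Komogortsev [0082]."""
    return sum(
        math.dist(a, b)
        for a, b in zip(fixation_centroids, fixation_centroids[1:])
    )

# Two 3-4-5 steps: (0,0) -> (3,4) -> (6,8), each of distance 5.0
print(scanpath_length([(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)]))  # 10.0
```

The examiner's obviousness theory hinges on this kind of distance computation being a known technique for characterizing user-driven motion, which is worth bearing in mind when distinguishing the claimed "Euclidean distance transform processing" of a model's orientation.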

Prosecution Timeline

Oct 14, 2024
Application Filed
Feb 05, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by the same examiner involving similar technology

Patent 12596842
DATA ANONYMIZATION FOR SERVICE SUBSCRIBER'S PRIVACY
2y 5m to grant Granted Apr 07, 2026
Patent 12587357
METHODS AND SYSTEMS FOR P-ADIC ENCODING AND DECODING OF RATIONAL DATA FOR FHE SYSTEMS
2y 5m to grant Granted Mar 24, 2026
Patent 12580896
METHOD AND SYSTEM FOR PRIVATE IDENTITY VERIFICATION
2y 5m to grant Granted Mar 17, 2026
Patent 12574238
ELECTRONIC DEVICE AND CONTROLLING METHOD FOR INCREASING AN OPERATION SPEED OF HOMOMORPHIC ENCRYPTED DATA
2y 5m to grant Granted Mar 10, 2026
Patent 12574206
BLIND ROTATION FOR USE IN FULLY HOMOMORPHIC ENCRYPTION
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 86%
With Interview: 85% (-0.4%)
Median Time to Grant: 2y 9m
PTA Risk: Low
Based on 844 resolved cases by this examiner. Grant probability derived from career allow rate.
