Prosecution Insights
Last updated: April 19, 2026
Application No. 18/700,326

USER TERMINAL, PROCESSING EXECUTION APPARATUS, AUTHENTICATION SYSTEM, AUTHENTICATION ASSISTANCE METHOD, PROCESSING EXECUTION METHOD, AND COMPUTER READABLE MEDIUM

Non-Final OA: §102, §103
Filed
Apr 11, 2024
Examiner
DHOOGE, DEVIN J
Art Unit
2677
Tech Center
2600 — Communications
Assignee
NEC Corporation
OA Round
1 (Non-Final)
70%
Grant Probability
Favorable
1-2
OA Rounds
3y 5m
To Grant
99%
With Interview

Examiner Intelligence

Grants 70% — above average
70%
Career Allow Rate
50 granted / 71 resolved
+8.4% vs TC avg
Strong +43% interview lift
Without
With
+42.9%
Interview Lift
resolved cases with interview
Typical timeline
3y 5m
Avg Prosecution
48 currently pending
Career history
119
Total Applications
across all art units

Statute-Specific Performance

§101
8.2%
-31.8% vs TC avg
§103
49.4%
+9.4% vs TC avg
§102
35.8%
-4.2% vs TC avg
§112
5.7%
-34.3% vs TC avg
Black line = Tech Center average estimate • Based on career data from 71 resolved cases

Office Action

§102 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Notice to Applicants

This communication is in response to the application filed on 04/11/2024. Claims 1-2 and 4-12 are currently amended. Claims 1-16 are pending.

Information Disclosure Statement

The information disclosure statements (IDSs) filed on 04/11/2024 and 10/10/2025 have been considered.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 7-8, 10-11, and 13-16 are rejected under 35 U.S.C. § 102(a)(1) as being anticipated by Implementation QR Code Biometric Authentication for Online Payment to AGOSTINHO et al. (hereinafter “AGOSTINHO”).
As per claim 1, AGOSTINHO discloses a user terminal comprising: an imaging device (a user smartphone acting as the terminal, having an attached camera for capturing facial-image selfies for biometric verification; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); at least one storage device configured to store instructions (the smartphone comprises storage to store images and instructions related to the described methods, and a processor to execute the instructions; page 676, columns 1-2); and at least one processor configured to execute the instructions to (the smartphone comprises storage to store images and instructions related to the described methods, and a processor to execute the instructions; page 676, columns 1-2): capture an information code including predetermined activation information using the imaging device (the smartphone's camera is used to capture a QR code in step 2 of fig 6 to retrieve facial biometric information from a database for authentication; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); activate an authentication assistance function using the activation information included in the captured information code (the QR code activates the SSL connection to the Azure Face API to detect a face in the captured image and perform face comparison against the facial image retrieved from the database associated with the QR code, as seen in steps 3 and 4 of fig 6 (and in this manner acts substantially as the activated authentication assistance function); figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); acquire an image including a face of a user captured by the imaging device according to the activation of the authentication assistance function (as seen in fig 6 at step 6, a selfie is captured in order to compare the facial information and determine a match; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); and output biometric information of the user based on the image for biometric authentication (at step 8 of fig 6 the information is output for verification and, if a match, the user is allowed to perform purchases and use the account information at the store/bank/location; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2).

As per claim 7, AGOSTINHO discloses the user terminal according to claim 1, wherein the biometric authentication is face authentication, and the biometric information is facial feature information (the biometric information registered to the user is facial biometric information; fig 5; page 678, column 2).

As per claim 8, AGOSTINHO discloses a processing execution apparatus comprising (a user smartphone acting as the terminal, having an attached camera for capturing facial-image selfies for biometric verification; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2): at least one storage device configured to store instructions (the smartphone comprises storage to store images and instructions related to the described methods, and a processor to execute the instructions; pages 676-679); and at least one processor configured to execute the instructions to (the smartphone comprises storage to store images and instructions related to the described methods, and a processor to execute the instructions; page 676, columns 1-2): acquire, from a user terminal possessed by a user (the smartphone's camera is used to capture a QR code in step 2 of fig 6 to retrieve facial biometric information from a database for authentication; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2), biometric information based on an image including a face of the user captured by the user terminal according to an authentication assistance function of the user terminal activated based on predetermined activation information included in an information code captured by the user terminal (the QR code activates the SSL connection to the Azure Face API to detect a face in the captured image and perform face comparison against the facial image retrieved from the database associated with the QR code, as seen in steps 3 and 4 of fig 6; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); control biometric authentication for the user when the biometric information is acquired (as seen in fig 6 at step 6, a selfie is captured in order to compare the facial information and determine a match; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); and execute predetermined processing when the biometric authentication has succeeded (at step 8 of fig 6 the information is output for verification and, if a match, the user is allowed to perform purchases and use the account information at the store/bank/location; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2).

As per claim 10, AGOSTINHO discloses the processing execution apparatus according to claim 8, wherein the biometric authentication is face authentication, and the biometric information is facial feature information (the biometric information registered to the user is facial biometric information; fig 5; page 678, column 2).
As per claim 11, AGOSTINHO discloses an authentication system comprising: a user terminal possessed by a user (a user smartphone acting as the terminal, having an attached camera for capturing facial-image selfies for biometric verification; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); and a processing execution apparatus configured to execute predetermined processing (the smartphone comprises storage to store images and instructions related to the described methods, and a processor to execute the instructions; page 676, columns 1-2), wherein the user terminal includes: an imaging device (the smartphone comprises a camera for capturing images, including selfies; figs 4 and 6; page 676, columns 1-2); at least one first storage device configured to store first instructions (the smartphone comprises storage to store images and instructions related to the described methods, and a processor to execute the instructions; page 676, columns 1-2); and at least one first processor configured to execute the first instructions to (the smartphone comprises storage to store images and instructions related to the described methods, and a processor to execute the instructions; page 676, columns 1-2): capture an information code including predetermined activation information using the imaging device (the smartphone's camera is used to capture a QR code in step 2 of fig 6 to retrieve facial biometric information from a database for authentication; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); activate an authentication assistance function using the activation information included in the captured information code (the QR code activates the SSL connection to the Azure Face API to detect a face in the captured image and perform face comparison against the facial image retrieved from the database associated with the QR code, as seen in steps 3 and 4 of fig 6 (and in this manner acts substantially as the activated authentication assistance function); figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); acquire an image including a face of the user captured by the imaging device according to the activation of the authentication assistance function (as seen in fig 6 at step 6, a selfie is captured in order to compare the facial information and determine a match; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); and output biometric information of the user based on the image for biometric authentication (at step 8 of fig 6 the information is output for verification and, if a match, the user is allowed to perform purchases and use the account information at the store/bank/location; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2), and the processing execution apparatus includes: at least one second storage device configured to store second instructions (a second storage device of the system is cloud storage provided by the Azure API cloud server; fig 1; page 677, columns 1-2; page 678, columns 1-2); and at least one second processor configured to execute the second instructions to (the bank includes bank computing terminals to process information, comprising respective processors; fig 1; page 677, columns 1-2; page 678, columns 1-2): acquire the biometric information from the user terminal (receive from the user's smartphone facial biometric information from the onboard camera, which is verified and matched against the database/QR-code-related biometric data; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); control biometric authentication for the user when the biometric information is acquired (authenticates the user request from the bank computing terminal after receiving the information from the user; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); and execute predetermined processing when the biometric authentication has succeeded (allowing the user to make purchases with the bank information once authentication is concluded; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2).

As per claim 13, AGOSTINHO discloses an authentication assistance method comprising (a method of biometric authentication; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2): by a computer including an imaging device (a user smartphone acting as the terminal, having an attached camera for capturing facial-image selfies for biometric verification; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2), capturing an information code including predetermined activation information by the imaging device (the smartphone's camera is used to capture a QR code in step 2 of fig 6 to retrieve facial biometric information from a database for authentication; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); activating an authentication assistance function using the activation information included in the captured information code (the QR code activates the SSL connection to the Azure Face API to detect a face in the captured image and perform face comparison against the facial image retrieved from the database associated with the QR code, as seen in steps 3 and 4 of fig 6 (and in this manner acts substantially as the activated authentication assistance function); figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); acquiring an image including a face of a user captured by the imaging device according to the activation of the authentication assistance function (as seen in fig 6 at step 6, a selfie is captured in order to compare the facial information and determine a match; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); and outputting biometric information of the user based on the image for biometric authentication (at step 8 of fig 6 the information is output for verification and, if a match, the user is allowed to perform purchases and use the account information at the store/bank/location; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2).
As per claim 14, AGOSTINHO discloses a non-transitory computer readable medium storing a program for causing a computer including an imaging device to execute (a smartphone would comprise a non-transitory computer readable storage medium storing a program causing the smartphone, including its camera, to execute a method according to the instructions; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2): imaging processing of capturing an information code including predetermined activation information by the imaging device (the smartphone's camera is used to capture a QR code in step 2 of fig 6 to retrieve facial biometric information from a database for authentication; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); activation processing of activating an authentication assistance function using the activation information included in the captured information code (the QR code activates the SSL connection to the Azure Face API to detect a face in the captured image and perform face comparison against the facial image retrieved from the database associated with the QR code, as seen in steps 3 and 4 of fig 6 (and in this manner acts substantially as the activated authentication assistance function); figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); acquisition processing of acquiring an image including a face of a user captured by the imaging device according to the activation of the authentication assistance function (as seen in fig 6 at step 6, a selfie is captured in order to compare the facial information and determine a match; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); and output processing of outputting biometric information of the user based on the image for biometric authentication (at step 8 of fig 6 the information is output for verification and, if a match, the user is allowed to perform purchases and use the account information at the store/bank/location; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2).
As per claim 15, AGOSTINHO discloses a processing execution method comprising (a method of biometric authentication and process execution; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2): by a computer (the smartphone comprises storage to store images and instructions related to the described methods, a processor to execute the instructions, and an onboard camera; page 676, columns 1-2): acquiring, from a user terminal possessed by a user (the smartphone's camera is used to capture a QR code in step 2 of fig 6 to retrieve facial biometric information from a database for authentication; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2), biometric information based on an image including a face of the user captured by the user terminal according to an authentication assistance function of the user terminal activated based on predetermined activation information included in an information code captured by the user terminal (the QR code activates the SSL connection to the Azure Face API to detect a face in the captured image and perform face comparison against the facial image retrieved from the database associated with the QR code, as seen in steps 3 and 4 of fig 6; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); controlling biometric authentication for the user when the biometric information is acquired (as seen in fig 6 at step 6, a selfie is captured in order to compare the facial information and determine a match; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); and executing predetermined processing when the biometric authentication has succeeded (at step 8 of fig 6 the information is output for verification and, if a match, the user is allowed to perform purchases and use the account information at the store/bank/location; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2).
As per claim 16, AGOSTINHO discloses a non-transitory computer readable medium storing a program for causing a computer to execute (a smartphone would comprise a non-transitory computer readable storage medium storing a program causing the smartphone, including its camera, to execute a method according to the instructions; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2): acquisition processing of acquiring, from a user terminal possessed by a user (the smartphone's camera is used to capture a QR code in step 2 of fig 6 to retrieve facial biometric information from a database for authentication; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2), biometric information based on an image including a face of the user captured by the user terminal according to an authentication assistance function of the user terminal activated based on predetermined activation information included in an information code captured by the user terminal (the QR code activates the SSL connection to the Azure Face API to detect a face in the captured image and perform face comparison against the facial image retrieved from the database associated with the QR code, as seen in steps 3 and 4 of fig 6; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); authentication control processing of controlling biometric authentication for the user when the biometric information is acquired (as seen in fig 6 at step 6, a selfie is captured in order to compare the facial information and determine a match; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2); and execution processing of executing predetermined processing when the biometric authentication has succeeded (at step 8 of fig 6 the information is output for verification and, if a match, the user is allowed to perform purchases and use the account information at the store/bank/location; figs 4 and 6; page 677, columns 1-2; page 679, columns 1-2).
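To make the claimed flow easier to follow, the four claim-1 steps the examiner maps onto AGOSTINHO (capture the information code, activate the assistance function, acquire a face image only after activation, output biometric information) can be sketched in code. This is purely illustrative: the class names, the `ACTIVATE_AUTH_ASSIST` payload value, and the feature-extraction stand-in are all invented here and appear in neither the claims nor AGOSTINHO.

```python
from dataclasses import dataclass


@dataclass
class InformationCode:
    """Decoded QR payload; payload value is a hypothetical stand-in."""
    activation_info: str


class FakeCamera:
    """Test double for the imaging device (not a real camera API)."""
    def scan_qr(self) -> str:
        return "ACTIVATE_AUTH_ASSIST"

    def capture_selfie(self) -> str:
        return "selfie-bytes"


class UserTerminal:
    """Hypothetical terminal mirroring the claim-1 step order."""
    def __init__(self, camera):
        self.camera = camera
        self.assist_active = False

    def capture_information_code(self) -> InformationCode:
        # Step 1: capture the information code with the imaging device.
        return InformationCode(activation_info=self.camera.scan_qr())

    def activate_assistance(self, code: InformationCode) -> None:
        # Step 2: activate the assistance function from the activation info.
        if code.activation_info == "ACTIVATE_AUTH_ASSIST":
            self.assist_active = True

    def acquire_face_image(self) -> str:
        # Step 3: a face image is acquired only after activation.
        if not self.assist_active:
            raise RuntimeError("authentication assistance function not activated")
        return self.camera.capture_selfie()

    def output_biometric_info(self, image: str) -> dict:
        # Step 4: derive and output biometric (facial feature) information.
        # hash() is a toy stand-in for real facial feature extraction.
        return {"features": hash(image)}
```

The ordering constraint in `acquire_face_image` is the point of the sketch: under the claim language, image acquisition is conditioned on the assistance function having been activated by the information code.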
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.

Claims 2-3, 9, and 12 are rejected under 35 U.S.C. § 103 as being obvious over Implementation QR Code Biometric Authentication for Online Payment to AGOSTINHO et al. (hereinafter “AGOSTINHO”) in view of US 2020/0380110 A1 to ICHIHARA et al. (hereinafter “ICHIHARA”).
As per claim 2, AGOSTINHO discloses the user terminal according to claim 1. AGOSTINHO further discloses wherein the information code includes designation of a communication method for the biometric information (the QR code initiates a communication method: the method uses a personal ID card with related face data encrypted using a 256-bit AES cryptographic algorithm and transmitted over a network using the installed software application on the smartphone; page 677, column 1). AGOSTINHO fails to disclose wherein the at least one processor is further configured to execute the instructions to: output the biometric information according to the communication method designated by the information code. ICHIHARA discloses wherein the at least one processor is further configured to execute the instructions to: output the biometric information according to the communication method designated by the information code (the controller 211 of terminal 20A transmits the information included in the token (information code) from terminal 20A to terminal 20B, located within the predetermined/given range, by short-range wireless communication capabilities such as Bluetooth, near-field communication, wireless LAN, infrared communication, and ultrasonic communication; paragraphs [0106]-[0109]). It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify AGOSTINHO to output the biometric information according to the designated communication method, as taught by the ICHIHARA reference. The suggestion/motivation for doing so would have been to provide a plurality of modes of short-range communication such that the most desirable for the situation may be selected, as suggested by ICHIHARA at paragraph [0106]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine ICHIHARA with AGOSTINHO to obtain the invention as specified in claim 2.

As per claim 3, AGOSTINHO in view of ICHIHARA discloses the user terminal according to claim 2. Modified AGOSTINHO fails to disclose wherein the communication method is short-range wireless communication. ICHIHARA discloses wherein the communication method is short-range wireless communication (the controller 211 of terminal 20A transmits the information included in the token (information code) from terminal 20A to terminal 20B, located within the predetermined/given range, by short-range wireless communication capabilities such as Bluetooth, near-field communication, wireless LAN, infrared communication, and ultrasonic communication; paragraphs [0106]-[0109]). It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to further modify AGOSTINHO such that the communication method is short-range wireless communication, as taught by the ICHIHARA reference. The suggestion/motivation for doing so would have been to provide a plurality of modes of short-range communication such that the most desirable for the situation may be selected, as suggested by ICHIHARA at paragraph [0106]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine ICHIHARA with AGOSTINHO to obtain the invention as specified in claim 3.

As per claim 9, AGOSTINHO discloses the processing execution apparatus according to claim 8.
AGOSTINHO fails to disclose wherein the at least one processor is further configured to execute the instructions to: acquire the biometric information from the user terminal by short-range wireless communication. ICHIHARA discloses wherein the at least one processor is further configured to execute the instructions to: acquire the biometric information from the user terminal by short-range wireless communication (the controller 211 of terminal 20A transmits the information included in the token (information code) from terminal 20A to terminal 20B, located within the predetermined/given range, by short-range wireless communication capabilities such as Bluetooth, near-field communication, wireless LAN, infrared communication, and ultrasonic communication; paragraphs [0106]-[0109]). It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify AGOSTINHO to acquire the biometric information from the user terminal by short-range wireless communication, as taught by the ICHIHARA reference. The suggestion/motivation for doing so would have been to provide a plurality of modes of short-range communication such that the most desirable for the situation may be selected, as suggested by ICHIHARA at paragraph [0106]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine ICHIHARA with AGOSTINHO to obtain the invention as specified in claim 9.

As per claim 12, AGOSTINHO discloses the authentication system according to claim 11.
AGOSTINHO further discloses wherein the information code includes designation of a communication method for the biometric information (the QR code initiates a communication method: the method uses a personal ID card with related face data encrypted using a 256-bit AES cryptographic algorithm and transmitted over a network using the installed software application on the smartphone; page 677, column 1). AGOSTINHO fails to disclose wherein the at least one first processor is further configured to execute the first instructions to: output the biometric information according to the communication method designated by the information code. ICHIHARA discloses wherein the at least one first processor is further configured to execute the first instructions to: output the biometric information according to the communication method designated by the information code (the controller 211 of terminal 20A transmits the information included in the token (information code) from terminal 20A to terminal 20B, located within the predetermined/given range, by short-range wireless communication capabilities such as Bluetooth, near-field communication, wireless LAN, infrared communication, and ultrasonic communication; paragraphs [0106]-[0109]). It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify AGOSTINHO to output the biometric information according to the designated communication method, as taught by the ICHIHARA reference. The suggestion/motivation for doing so would have been to provide a plurality of modes of short-range communication such that the most desirable for the situation may be selected, as suggested by ICHIHARA at paragraph [0106]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine ICHIHARA with AGOSTINHO to obtain the invention as specified in claim 12.

Claims 4-6 are rejected under 35 U.S.C. § 103 as being obvious over Implementation QR Code Biometric Authentication for Online Payment to AGOSTINHO et al. (hereinafter “AGOSTINHO”) in view of A Novel QR Code and mobile phone-based Authentication protocol via Bluetooth to LIU et al. (hereinafter “LIU”).

As per claim 4, AGOSTINHO discloses the user terminal according to claim 1. AGOSTINHO fails to disclose wherein the information code includes terminal information that is a destination to which the biometric information is output, and wherein the at least one processor is further configured to execute the instructions to: output the biometric information with the terminal information included in the information code as an output destination. LIU discloses wherein the information code includes terminal information that is a destination to which the biometric information is output (each terminal is adapted to include and save a Bluetooth address related to the terminal device, including an address location and a self-verification time stamp with each address during authentication; page 42, bottom half), and wherein the at least one processor is further configured to execute the instructions to: output the biometric information with the terminal information included in the information code as an output destination (the computing terminal comprising a processor is adapted to output the facial biometric information, which is authenticated at a particular Bluetooth address related to a particular user terminal, providing time and location stamps for approval of the authentication; page 42, bottom half; page 47; page 48, conclusion).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention to further modify AGOSTINHO to have output the biometric information with the terminal information included in the information code as an output destination of LIU reference. The Suggestion/motivation for doing so would have been to provide a four-party authentication system having self-verified time stamp and location/Bluetooth address technology and helps prevent man in the middle attacks using this kind of authentication scheme as suggested by LIU at the conclusion paragraph on page 48. Further, one skilled in the art could have combined the elements as described above by known method with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine LIU with modified AGOSTINHO to obtain the invention as specified in claim 4. As per claim 5, AGOSTINHO discloses the user terminal according to claim 1. Modified AGOSTINHO fails to disclose wherein the information code includes destination information of an authentication control apparatus configured to control the biometric authentication, and wherein the at least one processor is further configured to execute the instructions to: output the biometric information by transmitting the biometric information to the destination information included in the information code. 
LIU discloses wherein the information code includes destination information of an authentication control apparatus configured to control the biometric authentication, and wherein the at least one processor is further configured to execute the instructions to: output the biometric information by transmitting the biometric information to the destination information included in the information code (each terminal is adapted to include and save a Bluetooth address related to the terminal device, including an address location and a self-verification time stamp with each address during authentication, which is then sent to the terminal at the destination of the user determined using Bluetooth-address technology; page 42, bottom half).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to further modify AGOSTINHO to output the biometric information by transmitting the biometric information to the destination information included in the information code, as taught by LIU. The suggestion/motivation for doing so would have been to provide a four-party authentication system with self-verified time stamps and location/Bluetooth-address technology, which helps prevent man-in-the-middle attacks, as suggested by LIU in the conclusion paragraph on page 48. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine LIU with modified AGOSTINHO to obtain the invention as specified in claim 5.

As per claim 6, AGOSTINHO in view of LIU discloses the user terminal according to claim 5.
Modified AGOSTINHO fails to disclose wherein the information code further includes identification information of a processing terminal configured to perform processing according to the biometric authentication, and wherein the at least one processor is further configured to execute the instructions to: output the biometric information and the identification information of the processing terminal included in the information code by transmitting the biometric information and the identification information of the processing terminal to the destination information.

LIU discloses wherein the information code further includes identification information of a processing terminal configured to perform processing according to the biometric authentication, and wherein the at least one processor is further configured to execute the instructions to: output the biometric information and the identification information of the processing terminal included in the information code by transmitting the biometric information and the identification information of the processing terminal to the destination information (the computing terminal comprising a processor is adapted to output the facial biometric information that is authenticated at a particular Bluetooth address related to a particular user terminal, providing time and location stamps for approval of the authentication; page 42, bottom half; page 47; page 48, conclusion).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to further modify AGOSTINHO to output the biometric information and the identification information of the processing terminal included in the information code by transmitting the biometric information and the identification information of the processing terminal to the destination information, as taught by LIU.
The suggestion/motivation for doing so would have been to provide a four-party authentication system with self-verified time stamps and location/Bluetooth-address technology, which helps prevent man-in-the-middle attacks, as suggested by LIU in the conclusion paragraph on page 48. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine LIU with modified AGOSTINHO to obtain the invention as specified in claim 6.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. These references include the following:

US 2014/0250500 A1
US 2019/0273733 A1
US 2021/0117980 A1

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DEVIN JACOB DHOOGE, whose telephone number is (571) 270-0999. The examiner can normally be reached 7:30-5:00. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Andrew Bee, can be reached at (571) 270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/Devin Dhooge/
USPTO Patent Examiner
Art Unit 2677

/ANDREW W BEE/
Supervisory Patent Examiner, Art Unit 2677
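For readers unfamiliar with the arrangement at issue in claims 4-6 (an information code that itself carries the destination for the biometric output, and, per claim 6, the processing terminal's identification), the claimed data flow can be sketched roughly as follows. This is an illustrative sketch only: the field names (`destination`, `processing_terminal_id`, `biometric`) and the JSON payload format are assumptions chosen for illustration, not drawn from the application or the cited references.

```python
import json

# Hypothetical sketch of the claim 4-6 arrangement: the scanned
# information code names the destination (e.g., a Bluetooth or network
# address) to which the user terminal outputs the captured biometric
# information. All field names here are illustrative assumptions.

def parse_information_code(payload: str) -> dict:
    """Decode the information code into its constituent fields."""
    code = json.loads(payload)
    return {
        # Where the biometric data is to be sent (claims 4-5).
        "destination": code["destination"],
        # Identification of the processing terminal (claim 6).
        "processing_terminal_id": code.get("processing_terminal_id"),
    }

def build_output(biometric: bytes, code_fields: dict) -> dict:
    """Assemble the message the user terminal would transmit to the
    destination named in the information code."""
    return {
        "to": code_fields["destination"],
        "processing_terminal_id": code_fields["processing_terminal_id"],
        "biometric": biometric.hex(),
    }

payload = '{"destination": "auth.example:443", "processing_terminal_id": "POS-7"}'
fields = parse_information_code(payload)
message = build_output(b"\x01\x02", fields)
```

Nothing in this sketch implies how the references actually implement the flow; it only restates the claim language as data movement to make the LIU mapping above easier to follow.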

Prosecution Timeline

Apr 11, 2024
Application Filed
Feb 18, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602773
Deep-Learning-based T1-Enhanced Selection of Linear Coefficients (DL-TESLA) for PET/MR Attenuation Correction
2y 5m to grant · Granted Apr 14, 2026
Patent 12579780
HYPERSPECTRAL TARGET DETECTION METHOD OF BINARY-CLASSIFICATION ENCODER NETWORK BASED ON MOMENTUM UPDATE
2y 5m to grant · Granted Mar 17, 2026
Patent 12524982
NON-TRANSITORY COMPUTER READABLE RECORDING MEDIUM, VISUALIZATION METHOD AND INFORMATION PROCESSING APPARATUS
2y 5m to grant · Granted Jan 13, 2026
Patent 12517146
IMAGE-BASED DECK VERIFICATION
2y 5m to grant · Granted Jan 06, 2026
Patent 12505673
MULTIMODAL GAME VIDEO SUMMARIZATION WITH METADATA
2y 5m to grant · Granted Dec 23, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
70%
Grant Probability
99%
With Interview (+42.9%)
3y 5m
Median Time to Grant
Low
PTA Risk
Based on 71 resolved cases by this examiner. Grant probability derived from career allow rate.
