Prosecution Insights
Last updated: April 19, 2026
Application No. 18/625,789

SYSTEMS AND METHODS FOR EMOTIONALLY ADAPTIVE FINANCIAL CHATBOT

Non-Final OA — §101, §103, §112
Filed: Apr 03, 2024
Examiner: YONO, RAVEN E
Art Unit: 3694
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Wells Fargo Bank N.A.
OA Round: 3 (Non-Final)

Grant Probability: 39% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 2y 6m
With Interview: 72%

Examiner Intelligence

Career Allow Rate: 39% (69 granted / 175 resolved; -12.6% vs TC avg)
Interview Lift: strong, +32.5% across resolved cases with interview
Typical Timeline: 2y 6m average prosecution (32 applications currently pending)
Career History: 207 total applications across all art units
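
These card figures reduce to simple ratios over the examiner's resolved cases. A minimal sketch of that arithmetic, assuming per-case records with a grant outcome and an interview flag; the schema and field names are hypothetical, not from any real dataset:

```python
from dataclasses import dataclass

@dataclass
class ResolvedCase:
    granted: bool        # resolved as a grant rather than an abandonment
    had_interview: bool  # at least one examiner interview of record

def allow_rate(cases: list[ResolvedCase]) -> float:
    """Allow rate as a fraction: grants / resolved cases."""
    return sum(c.granted for c in cases) / len(cases)

def interview_lift(cases: list[ResolvedCase]) -> float:
    """Gap between the allow rate of interviewed cases and the overall
    career allow rate. (The page's 39% + 32.5 points = 72% arithmetic
    suggests the overall rate, not the no-interview subset, serves as
    the baseline; that reading is an assumption.)"""
    interviewed = [c for c in cases if c.had_interview]
    return allow_rate(interviewed) - allow_rate(cases)
```

With the headline numbers above, allow_rate would return 69 / 175 ≈ 0.394, displayed as 39%.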

Statute-Specific Performance

§101: 40.5% (+0.5% vs TC avg)
§103: 31.3% (-8.7% vs TC avg)
§102: 3.0% (-37.0% vs TC avg)
§112: 19.9% (-20.1% vs TC avg)

Tech Center average is an estimate • Based on career data from 175 resolved cases
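
Each delta above is a plain subtraction against the Tech Center estimate, which the four rows jointly imply is 40.0% for every statute (rate minus delta gives 40.0 in each case). A short sketch of that check, with the figures hard-coded from this page:

```python
# Examiner's per-statute rates (%) from the rows above.
examiner = {"101": 40.5, "103": 31.3, "102": 3.0, "112": 19.9}
TC_AVG = 40.0  # implied by every row above: rate - delta = 40.0

for statute, rate in examiner.items():
    delta = rate - TC_AVG
    print(f"§{statute}: {rate:.1f}% ({delta:+.1f}% vs TC avg)")
```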

Office Action

Rejections: §101, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on December 11, 2025 has been entered.

Status of Claims

• This action is in reply to the RCE filed on December 11, 2025.
• Claims 1, 10, and 16 have been amended and are hereby entered.
• Claims 1-20 are currently pending and have been examined.
• This action is made Non-Final.

Response to Arguments

Applicant's arguments filed November 21, 2025 have been fully considered but are not persuasive. The Examiner is withdrawing the Specification objections due to Applicant's amendments. The Examiner is withdrawing some of the 35 USC § 112 rejections due to Applicant's amendments. New 35 USC § 112 rejections have been entered due to Applicant's amendments.

Applicant's arguments with respect to 35 USC § 112 have been fully considered and are not persuasive.

Regarding Applicant's arguments on page 7, regarding claim 5, that the term "computer vision" does not render the claim indefinite, the Examiner respectfully disagrees. Although the Applicant argues that the term "computer vision" has a well-established meaning in the art, this does not resolve the ambiguity presented by the limitation. The limitation of "using computer vision" to determine an emotional state, without detail as to what specific algorithm or steps are being performed, is so broad as to be boundless and therefore indefinite under 112(b).

Regarding Applicant's arguments on page 7, regarding claim 8, that the term "includes artificial intelligence" does not render the claim indefinite, the Examiner respectfully disagrees. Although the Applicant argues that the Specification provides detail about machine learning and AI technologies, this does not resolve the ambiguity presented by the limitation. The limitation of "including artificial intelligence," without detail as to what specifically is included other than the field of artificial intelligence broadly, is so broad as to be boundless and therefore indefinite under 112(b).

Regarding Applicant's arguments on pages 7-8, regarding claim 14, that the term "AI-based knowledge tree" does not render the claim indefinite, the Examiner respectfully disagrees. Although the Applicant argues that the term refers to well-understood concepts, this does not resolve the ambiguity presented by the limitation. The limitation of "using…(AI)-based knowledge tree," without detail as to what constitutes being AI-based, is so broad as to be boundless and therefore indefinite under 112(b).

Applicant's arguments with respect to 35 USC § 101 have been fully considered and are not persuasive. Regarding Applicant's arguments on pages 8-9, that the claims improve an existing technology or technical field, that the claims recite significantly more than the abstract idea, and that the claims integrate the judicial exception into a practical application, the Examiner respectfully disagrees.
Under the Patent Subject Matter Eligibility analysis, Step 2A, Prong Two, integration into a practical application requires an additional element or combination of additional elements in the claim to apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the exception. Limitations that are not indicative of integration into a practical application are those that are mere instructions to implement an abstract idea on a computer, or that merely use a computer as a tool to perform an abstract idea (see MPEP 2106.05(f)).

Here, the claims recite a computer system; a computing system comprising one or more processors and a data storage system in communication with the one or more processors, wherein the data storage system comprises instructions thereon that, when executed by the one or more processors, causes the one or more processors to perform claim functions; a non-transitory computer-readable storage medium, the non-transitory computer-readable storage medium including instructions that, when executed by computers, cause the computers to perform claim functions; using machine learning; and the machine learning includes a neural network trained on emotional training feature data, such that they amount to no more than mere instructions to apply the exception using generic computer components (see MPEP 2106.05(f)).

Furthermore, and in response to Applicant's arguments on pages 8-10 that the claims improve technology: in determining whether a claim integrates a judicial exception into a practical application, a determination is made of whether the claimed invention pertains to an improvement in the functioning of the computer itself or any other technology or technical field (i.e., a technological solution to a technological problem). Here, the claims recite generic computer components, i.e., a generic processor and a memory storing a computer program executable by the processor to perform the claimed method steps and system functions. The processor, memory, and system are recited at a high level of generality and are recited as performing generic computer functions customarily used in computer applications. Furthermore, the Specification describes a problem and improvement to a business or commercial process: at least at [0002], describing the problem of improving understanding of emotional states of users, which may impact their financial decisions; at least at [0017]-[0019], describing improving security and mitigating fraud associated with transactions; and at least at [0022], stating, "the present system may function like a coach or therapist, reading the speech, face, and/or body language and adjusting responses to bring focus back for the customer or user."

Regarding Applicant's arguments on page 10, that the claims could not be performed mentally or manually, the argument is not persuasive. In response to this argument, the Examiner notes that "Mental Processes" are not the only category of abstract idea recognized in the MPEP. The MPEP also recognizes that "fundamental economic principles or practices," including mitigating risk, and "commercial and legal interactions," including sales activities or behaviors, are among the enumerated groups of abstract ideas (MPEP § 2106.04(a)).
Furthermore, Examiners are directed to continue to use the Mayo/Alice framework (incorporated as Steps 2A and 2B) to resolve questions of eligibility, and Examiners should determine whether a claim recites an abstract idea by (1) identifying the claimed concept (the specific claim limitation(s) in the claim under examination that the examiner believes may be an abstract idea), and (2) comparing the claimed concept to the concepts previously identified as abstract ideas by the courts to determine if it is similar (see MPEP 2106.04(a)).

Furthermore, regarding Applicant's arguments on page 10 that the claims improve technology, the Examiner respectfully disagrees. The pending claims do not describe a technical solution to a technical problem. The pending claims are directed to solving the problem of improving understanding of emotional states of users, which may impact their financial decisions (see at least [0002] of the Specification), and improving security and mitigating fraud associated with transactions (see at least [0017]-[0019]). The claims of the instant application describe an improvement to a business process, i.e., improving understanding of emotional states of users which may impact their financial decisions, and improving security and mitigating fraud associated with transactions, not an improvement in the functioning of the computer itself or an improvement to any other technology or technological field. And, notably, the Examiner notes that paragraph [0022] states, "the present system may function like a coach or therapist, reading the speech, face, and/or body language and adjusting responses to bring focus back for the customer or user." Applicant is reminded that mere automation of a process, without improving a technical aspect of that process, does not integrate the abstract ideas into a practical application. See Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1370 (Fed. Cir. 2015) ("merely adding computer functionality to increase the speed or efficiency of the process does not confer patent eligibility on an otherwise abstract idea."). The claims are not patent eligible.

Applicant's arguments with respect to 35 USC § 103 on pages 11-12 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Furthermore, regarding Applicant's arguments on page 12, that the cited art of record does not teach adjusting responses to provide a supportive and empathetic tone when negative states are detected, the Examiner respectfully disagrees. As discussed in the 103 rejection below, Smith teaches adjusting responses to notify emergency contacts about the state of the user, at least at col. 12, lines 29-56. The Examiner interprets notifying loved ones as an adjustment to provide a supportive and empathetic tone. The art of record therefore teaches this limitation.

Furthermore, regarding Applicant's arguments on pages 13-15, the arguments have been fully considered and are not persuasive. Applicant has not provided an explanation of why the cited prior art does not teach the claimed limitations. Rather, Applicant has set forth claim limitations and made conclusory statements that the art does not teach the claim limitations. Answers to the arguments on the amended limitations are addressed in the action below.
The Examiner therefore refers Applicant to the 103 rejection below, which addresses how the prior art teaches the claimed invention. For the reasons above, Applicant's arguments are not persuasive.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Regarding claim 1, the claim recites "adjusting… provide a supportive and empathetic tone." A person having ordinary skill in the art would not know what encompasses a supportive and empathetic tone. For example, if a chatbot replied "Yes" to a user request, and then replied "OK" to a user response, a person having ordinary skill in the art would not be able to tell whether this is an adjustment to a supportive and empathetic tone, a neutral tone, or a negative tone. The Specification offers no guidance as to how to ascertain what is considered a supportive and empathetic tone. The limitation is so broad as to be boundless and renders the claim indefinite under 112(b). For examination purposes, the Examiner is interpreting the limitation as "adjusting responses to the user when a negative emotional state of the user is detected."

Furthermore regarding claim 1, the claim recites "when a negative emotional state of the user is detected." A person having ordinary skill in the art would not know what encompasses a negative emotional state of the user. The Specification offers no guidance as to how to ascertain what is considered a negative emotional state of the user. The limitation is so broad as to be boundless and renders the claim indefinite under 112(b). For examination purposes, the Examiner is interpreting the limitation as "a state of the user."

Claims 10 and 16 have limitations similar to those found in claim 1 above, and are therefore rejected by the same rationale.

Regarding claim 5, the claim recites "determining… using computer vision." The limitation of "using computer vision" is so broad as to be boundless and therefore indefinite under 112(b).

Regarding claim 8, the claim recites the limitation "machine learning includes artificial intelligence." The limitation of "including artificial intelligence" is so broad as to be boundless and therefore indefinite under 112(b).

Regarding claim 14, the claim recites the limitation "using…(AI)-based knowledge tree." The limitation of using an AI-based knowledge tree is so broad as to be boundless and therefore indefinite under 112(b).

The remaining dependent claims are rejected due to their dependency from a rejected claim.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention recites an abstract idea without significantly more.

Independent claims 1, 10, and 16 are directed to a method (claim 1), a system (claim 10), and an apparatus (claim 16). Therefore, on its face, each of independent claims 1, 10, and 16 is directed to a statutory category of invention under Step 1 of the Patent Subject Matter Eligibility analysis (see MPEP 2106.03).

Under Step 2A, Prong One of the Patent Subject Matter Eligibility analysis (see MPEP 2106.04), claims 1, 10, and 16 recite, in part, a method, a system, and an apparatus of organizing human activity. Using the limitations in claim 1 to illustrate, the claim recites: receiving authentication information from a user; authenticating the user for a transaction based on the received authentication information; detecting an abnormal aspect of the transaction based on parameters of the transaction; upon detecting the abnormal aspect, determining an emotional state of the user by analyzing at least one of voice tone, speech patterns, or facial expressions of the user, wherein determining the emotional state includes facial emotion recognition methodology to identify facial features; adapting an interaction style with the user based on the determined emotional state of the user by automatically adjusting responses to the user to provide a supportive and empathetic tone when a negative emotional state of the user is detected; receiving an input from the user after adapting the interaction style; and implementing additional security requirements for the transaction based on the detected abnormal aspect, the input from the user, and the determined emotional state.

These limitations, as drafted, recite a process that, under its broadest reasonable interpretation, covers fundamental economic principles or practices and commercial and legal interactions (certain methods of organizing human activity), but for the recitation of generic computer components. The claims as a whole recite a method of organizing human activity. The claimed invention allows for implementing additional security requirements to process a transaction initiated by the user when determining a user is in an abnormal emotional state, which is a fundamental economic principle or practice of mitigating risk, and a commercial and legal interaction of sales activities or behaviors. The mere nominal recitation of a computer system does not take the claims out of the methods of organizing human activity grouping. Thus, the claims recite an abstract idea.

Under Step 2A, Prong Two of the Patent Subject Matter Eligibility analysis (see MPEP 2106.04), the judicial exception is not integrated into a practical application. In particular, the additional elements of a computer system; a computing system comprising one or more processors and a data storage system in communication with the one or more processors, wherein the data storage system comprises instructions thereon that, when executed by the one or more processors, causes the one or more processors to perform claim functions; a non-transitory computer-readable storage medium, the non-transitory computer-readable storage medium including instructions that, when executed by computers, cause the computers to perform claim functions; using machine learning; and the machine learning includes a neural network trained on emotional training feature data are recited at a high level of generality (i.e., as a generic processor performing the generic computer function of detecting an abnormal aspect associated with a user conducting a transaction and requiring additional security requirements to proceed with the transaction), such that they amount to no more than mere instructions to apply the exception using generic computer components (see MPEP 2106.05(f)). Accordingly, the combination of the additional elements does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. The claims are directed to an abstract idea.

Under Step 2B of the Patent Subject Matter Eligibility analysis (see MPEP 2106.05), the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements in the claims amount to no more than mere instructions to apply the exception using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept. The claims are not patent eligible.

The dependent claims have been given the full two-part analysis, including analyzing the additional limitations both individually and in combination. The dependent claims, when analyzed both individually and in combination, are also held to be patent ineligible under 35 U.S.C. 101 for the same reasoning as above, and because the additional recited limitations fail to establish that the claims are not directed to an abstract idea. Dependent claims 2-4, 6-7, and 17-20 simply help to define the abstract idea. Dependent claims 5, 8-9, and 11-15 simply further describe the technological environment. The additional limitations of the dependent claims, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea. Viewing the claim limitations as an ordered combination does not add anything further than looking at the claim limitations individually. When viewed either individually or as an ordered combination, the additional limitations do not amount to a claim as a whole that is significantly more than the abstract idea. Accordingly, claims 1-20 are ineligible.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over US 11157906 B1 (“Smith”) in view of US 20250291904 A1 (“Castinado”).

Regarding claim 1, Smith discloses a computer-implemented method comprising (see at least FIG. 3): receiving, by a computer system, authentication information from a user of the computer system (Receiving transaction information communication. In a purchase transaction the communication may indicate information such as the amount, the product or service being purchased, the user, the location of the transaction, and the merchant name. See at least col. 5, lines 21-50.); authenticating, by the computer system, the user for a transaction based on the received authentication information (After receiving a transaction request, testing a user to validate emotional, physical, or intellectual state, for example, a test to determine whether the user is actually impaired/intoxicated. The user may be asked to solve a puzzle, answer questions, or otherwise interact with the interface of the user device to determine whether the user is impaired with regard to their physical, mental, and/or emotional state. See at least col. 12, lines 57-67; and see also col. 5, line 53 to col. 6, line 3.); detecting, by the computer system, an abnormal aspect of the transaction based on parameters of the transaction (An indication of a transaction may be received. As described above, the transaction may involve a user, and may indicate the user's attempt to make a purchase, request a funds transfer, enter into a contract, and/or other transaction. Data associated with the user may be received. The data may include sensor data generated by sensor device(s). The data may also include other information, such as text, gestural inputs (e.g., clicks, double-clicks, swipes, pinches, etc.), voice input, haptic input, brain waves, eye movement, facial expressions, and/or other inputs made by the user through an interface of the user device. The data received may also include a current location of the user and/or user device. The user state may be determined for the user as described above. See at least col. 5, lines 8-24.); upon detecting the abnormal aspect, determining, by the computer system, an emotional state of the user (The sensor data, and/or other data generated by the user device, may be received over one or more networks by one or more analysis modules executing on the management device(s) or elsewhere. The sensor data and/or other data may be analyzed by the analysis module(s) to determine a user state for the user. The user state may be a physical state indicating that the user is fatigued, intoxicated, asleep, and so forth. The user state may be a mental state indicating that the user is manic, depressed, experiencing dementia, not lucid, and so forth. The user state may be an emotional state indicating that the user is sad, angry, enraged, happy, and so forth. The user state may include a description of the user's physical, mental, and/or emotional state. See at least col. 7, lines 27-44.) by analyzing at least one of voice tone, speech patterns, or facial expressions of the user (Data associated with the user may be received. The data may include sensor data generated by sensor device(s). The data may also include other information, such as text, gestural inputs (e.g., clicks, double-clicks, swipes, pinches, etc.), voice input, haptic input, brain waves, eye movement, facial expressions, and/or other inputs made by the user through an interface of the user device. The data received may also include a current location of the user and/or user device. See at least col. 13, lines 14-22. Audio data of the voice of the user may be analyzed using audio analysis techniques to detect emotional indicators in the user's voice and/or language usage. See at least col. 8, lines 12-25.), wherein determining the emotional state includes using facial emotion recognition methodology to identify facial features (The sensors may include cameras or other imaging devices configured to generate images and/or video of the user's face or other body parts. See at least col. 5, line 51 to col. 6, line 24. Images of at least a portion of the user's face (e.g., mouth, eyes, etc.) and/or posture (e.g., shoulders) may be analyzed using mood recognition analysis techniques to determine the emotional state of the user, such as whether the user is stressed, relaxed, happy, sad, angry, calm, scared, and so forth. See at least col. 7, lines 45-55.); adapting, by the computer system, an interaction style with the user based on the determined emotional state of the user (Based on the current user state, the constraint(s) in the constraint information, and/or other information in the user profile, one or more actions may be determined as described above. The action(s) may be performed to manage the transaction. See at least col. 13, lines 57-61.) by automatically adjusting responses to the user to provide a supportive and empathetic tone when a negative emotional state of the user is detected (Action(s) may include sending a notification to the user and/or other individual(s), describing the transaction and the user state that may indicate why the transaction may be a bad choice by the user given the user's financial plan, budget, or goals, or given the user's characteristics (e.g., gambling addiction). In some implementations, another individual may be notified that the user has initiated a transaction that violates or is contrary to one or more constraints. The notification may include appropriate information to enable the other individual to assist the user. For example, the notification may identify the user, indicate the user's location, describe the current user state of the user, describe the transaction that is being attempted, and/or other information. The notification may be communicated to the user and/or other individual(s) using various techniques, such as email, text message, voice message, and so forth. In some examples, the user may have previously indicated other individual(s) to receive notification(s), e.g., as emergency contact information stored in the user profile. For example, the user may have indicated a spouse, guardian, adult child, sibling, other relative, caregiver, doctor, attorney, accountant, trust manager, trust management service, business partner, colleague, or other(s). In some examples, the other individual(s) may have been selected by an entity other than the user. For example, the other individual(s) may include a court-appointed guardian or caretaker. See at least col. 12, lines 29-56. The Examiner interprets notifying loved ones as an adjustment to provide a supportive and empathetic tone.); receiving, by the computer system, an input from the user after adapting the interaction style (Action(s) may include administering a test to the user, e.g., through the user device, to determine whether the user is actually impaired. For example, to check whether a user is intoxicated the user may be asked to speak a phrase which is captured by audio sensor device(s) and analyzed to determine whether the user is slurring their words. See at least col. 12, lines 57-67.); and implementing, by the computer system, additional security requirements for the transaction based on the detected abnormal aspect, the input from the user, and the determined emotional state (Blocking the transaction if the user state is intoxicated. See at least FIG. 2, and see also col. 11, lines 26-39. Other actions include delaying the transaction for a period of time, such as 24 hours. The user may then reconsider the transaction with a possibly clearer head. See at least col. 12, lines 23-28.).

While Smith discloses determining, Smith does not expressly disclose determining using machine learning. Furthermore, Smith does not expressly disclose the machine learning includes a neural network trained on emotional training feature data. However, Castinado discloses determining using machine learning; the machine learning includes a neural network trained on emotional training feature data (voice stress analysis, facilitated by machine learning models trained on a dataset of vocal patterns under various emotional states. See at least [0088].). From the teaching of Castinado, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify the determining of Smith to determine using machine learning, as taught by Castinado, and to modify Smith to include machine learning including a neural network trained on emotional training feature data, as taught by Castinado, in order to improve security and improve prevention of attempts to access or misappropriate sensitive data (see Castinado at least [0002]-[0003]), and to improve existing solutions to the problem of detecting deceptive behaviors and improve speed and efficiency (see Castinado at least at [0030]).

Regarding claim 2, the combination of Smith and Castinado discloses the limitations of claim 1, as discussed above, and Smith further discloses determining the emotional state of the user includes using voice analysis (Analyzing voice and speech including tone. See at least col. 8, lines 26-42 and col. 13, lines 13-22.).

Regarding claim 3, the combination of Smith and Castinado discloses the limitations of claim 2, as discussed above, and Smith further discloses determining the emotional state of the user includes analyzing voice tone of the user (Analyzing voice and speech including tone. See at least col. 8, lines 26-42 and col. 13, lines 13-22.).

Regarding claim 4, the combination of Smith and Castinado discloses the limitations of claim 2, as discussed above, and Smith further discloses determining the emotional state of the user includes analyzing speech patterns of the user (Analysis of the speech of the user includes, but is not limited to, identification and analysis of the particular words, phrases, and/or sentences spoken by the user. Analysis of the user's speech may also include analysis of the tone, clarity, volume, speed, and/or other characteristics of the user's speech that may be captured in the audio data gathered by the audio sensor devices. See at least col. 8, lines 26-42.).

Regarding claim 5, the combination of Smith and Castinado discloses the limitations of claim 1, as discussed above, and Smith further discloses determining the emotional state of the user (The sensor data, and/or other data generated by the user device, may be received over one or more networks by one or more analysis modules executing on the management device(s) or elsewhere. The sensor data and/or other data may be analyzed by the analysis module(s) to determine a user state for the user. The user state may be a physical state indicating that the user is fatigued, intoxicated, asleep, and so forth. The user state may be a mental state indicating that the user is manic, depressed, experiencing dementia, not lucid, and so forth. The user state may be an emotional state indicating that the user is sad, angry, enraged, happy, and so forth. The user state may include a description of the user's physical, mental, and/or emotional state. See at least col. 7, lines 27-44.). While Smith discloses determining, Smith does not expressly disclose determining includes using computer vision. However, Castinado discloses determining includes using computer vision (the system may employ computer vision algorithms to detect micro-expressions that are incongruent with baseline emotional states commonly associated with standard discourse on trade secrets or proprietary methodologies. See at least [0088]. See also [0056].). From the teaching of Castinado, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify the determining of Smith to determine using computer vision, as taught by Castinado, in order to improve security and improve prevention of attempts to access or misappropriate sensitive data (see Castinado at least [0002]-[0003]), and to improve existing solutions to the problem of detecting deceptive behaviors and improve speed and efficiency (see Castinado at least at [0030]).

Regarding claim 6, the combination of Smith and Castinado discloses the limitations of claim 5, as discussed above, and Smith further discloses determining the emotional state of the user includes analyzing facial expressions of the user (The sensor data, and/or other data generated by the user device, may be received over one or more networks by one or more analysis modules executing on the management device(s) or elsewhere. The sensor data and/or other data may be analyzed by the analysis module(s) to determine a user state for the user. The user state may be a physical state indicating that the user is fatigued, intoxicated, asleep, and so forth. The user state may be a mental state indicating that the user is manic, depressed, experiencing dementia, not lucid, and so forth. The user state may be an emotional state indicating that the user is sad, angry, enraged, happy, and so forth. The user state may include a description of the user's physical, mental, and/or emotional state. See at least col. 7, lines 27-44. Movement data including facial movements, see at least col. 8, lines 43-61. See also col. 7, lines 45-55, describing analyzing images of a user's face to determine mood.).

Regarding claim 7, the combination of Smith and Castinado discloses the limitations of claim 5, as discussed above, and Smith further discloses determining the emotional state of the user includes analyzing body language of the user (The sensor data, and/or other data generated by the user device, may be received over one or more networks by one or more analysis modules executing on the management device(s) or elsewhere. The sensor data and/or other data may be analyzed by the analysis module(s) to determine a user state for the user. The user state may be a physical state indicating that the user is fatigued, intoxicated, asleep, and so forth. The user state may be a mental state indicating that the user is manic, depressed, experiencing dementia, not lucid, and so forth. The user state may be an emotional state indicating that the user is sad, angry, enraged, happy, and so forth. The user state may include a description of the user's physical, mental, and/or emotional state. See at least col. 7, lines 27-44. Movement data, indicating movements of the user and/or the user device, may be analyzed to determine a physical, emotional, and/or mental state of the user. For example, movement data indicating that the user is jittery may indicate that the user is stressed, tense, angry, or in some other emotional state. As another example, the walking movements, swaying, gestures (e.g., with hands, arms, wrists, and/or fingers), and/or other movements of the user and/or the user device may be characteristic of individuals who are intoxicated. The position(s), articulations (e.g., a flat hand, pointed finger, bent fingers, fist, or other configurations of body parts), and/or movements of hands, arms, wrists, and/or fingers may also indicate intoxication or some other state of the user. See at least col. 8, lines 43-61.).

Regarding claim 8, the combination of Smith and Castinado discloses the limitations of claim 1, as discussed above. Smith does not expressly disclose the machine learning includes artificial intelligence. However, Castinado discloses the machine learning includes artificial intelligence (This invention leverages a large language model (LLM) powered by artificial intelligence (AI) to analyze electronic communications across various modalities such as speech patterns, typing speed, and facial expressions to detect potential attempts at data or software misappropriation. See at least [0004]. See also [0028].). From the teaching of Castinado, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify Smith to include a machine learning model including artificial intelligence, as taught by Castinado, in order to improve security and improve prevention of attempts to access or misappropriate sensitive data (see Castinado at least [0002]-[0003]), and to improve existing solutions to the problem of detecting deceptive behaviors and improve speed and efficiency (see Castinado at least at [0030]).

Regarding claim 9, the combination of Smith and Castinado discloses the limitations of claim 8, as discussed above. Smith does not expressly disclose the artificial intelligence includes a large language model (LLM). However, Castinado discloses the artificial intelligence includes a large language model (LLM) (This invention leverages a large language model (LLM) powered by artificial intelligence (AI) to analyze electronic communications across various modalities such as speech patterns, typing speed, and facial expressions to detect potential attempts at data or software misappropriation. See at least [0004]. See also [0019] and [0028].). From the teaching of Castinado, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify Smith to include an artificial intelligence that includes a large language model, as taught by Castinado, in order to improve security and improve prevention of attempts to access or misappropriate sensitive data (see Castinado at least [0002]-[0003]), and to improve existing solutions to the problem of detecting deceptive behaviors and improve speed and efficiency (see Castinado at least at [0030]).

Claim 10 has limitations similar to those found in claim 1 above, and is therefore rejected by the same art and rationale. Additionally, Smith discloses a system comprising: a computing system comprising one or more processors and a data storage system in communication with the one or more processors, wherein the data storage system comprises instructions thereon that, when executed by the one or more processors, causes the one or more processors to perform claim functions (see Smith at least at col. 15, lines 34-67. See also col. 16, lines 1-36.).

Regarding claim 11, the combination of Smith and Castinado discloses the limitations of claim 10, as discussed above. Smith does not expressly disclose using machine learning includes using a machine learning model including a neural network. However, Castinado discloses using machine learning includes using a machine learning model including a neural network (The machine learning algorithms contemplated, described, and/or used herein include supervised learning (e.g., using logistic regression, using back propagation neural networks, using random forests, decision trees, etc.), unsupervised learning (e.g., using an Apriori algorithm, using K-means clustering), semi-supervised learning, reinforcement learning (e.g., using a Q-learning algorithm, using temporal difference learning), and/or any other suitable machine learning model type. See at least [0058]. See also [0084] and [0088].). From the teaching of Castinado, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify Smith to use machine learning including using a machine learning model including a neural network, as taught by Castinado, in order to improve security and improve prevention of attempts to access or misappropriate sensitive data (see Castinado at least [0002]-[0003]), and to improve existing solutions to the problem of detecting deceptive behaviors and improve speed and efficiency (see Castinado at least at [0030]).

Regarding claim 12, the combination of Smith and Castinado discloses the limitations of claim 10, as discussed above. Smith does not expressly disclose using machine learning includes using a machine learning model including a long short-term memory (LSTM) network. However, Castinado discloses using machine learning includes using a machine learning model including a long short-term memory (LSTM) network (sequential models like recurrent neural networks (RNN) with long short-term memory (LSTM) cells are adept at detecting anomalies in speech patterns, including unusual pauses or rate changes that might suggest stress or deception. See at least [0084]. See also [0088].). From the teaching of Castinado, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify Smith to use machine learning including using a machine learning model including a long short-term memory (LSTM) network, as taught by Castinado, in order to improve security and improve prevention of attempts to access or misappropriate sensitive data (see Castinado at least [0002]-[0003]), and to improve existing solutions to the problem of detecting deceptive behaviors and improve speed and efficiency (see Castinado at least at [0030]).

Regarding claim 13, the combination of Smith and Castinado discloses the limitations of claim 10, as discussed above. Smith does not expressly disclose using machine learning includes using a machine learning model including natural language processing (NLP). However, Castinado discloses using machine learning includes using a machine learning model including natural language processing (NLP) (These models may include convolutional neural networks (CNNs) for temporal feature extraction from audio data, and recurrent neural networks (RNNs) with long short-term memory (LSTM) cells to track the progression of speech patterns over time. Additionally, natural language processing (NLP) techniques are applied to assess conversation content, identifying deviations such as unusual pauses or phrasing that could indicate a higher probability of incident concerning IP confidentiality. See at least [0088]. See also [0063], [0079]-[0081], and [0083].). From the teaching of Castinado, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify Smith to use machine learning including using a machine learning model including natural language processing, as taught by Castinado, in order to improve security and improve prevention of attempts to access or misappropriate sensitive data (see Castinado at least [0002]-[0003]), and to improve existing solutions to the problem of detecting deceptive behaviors and improve speed and efficiency (see Castinado at least at [0030]).

Regarding claim 14, the combination of Smith and Castinado discloses the limitations of claim 10, as discussed above. Smith does not expressly disclose using machine learning includes using a machine learning model including an artificial intelligence (AI)-based knowledge tree. However, Castinado discloses using machine learning includes using a machine learning model including an artificial intelligence (AI)-based knowledge tree (Each of these types of machine learning algorithms can implement any of one or more of… a decision tree learning method (e.g., classification and regression tree, iterative dichotomiser 3, C4.5, chi-squared automatic interaction detection, decision stump, random forest, multivariate adaptive regression splines, gradient boosting machines, etc.). See at least [0058]. See also [0059].). From the teaching of Castinado, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify Smith to use machine learning including using a machine learning model including an AI-based knowledge tree, as taught by Castinado, in order to improve security and improve prevention of attempts to access or misappropriate sensitive data (see Castinado at least [0002]-[0003]), and to improve existing solutions to the problem of detecting deceptive behaviors and improve speed and efficiency (see Castinado at least at [0030]).

Claim 15 has limitations similar to those found in claim 9 above, and is therefore rejected by the same art and rationale.

Claim 16 has limitations similar to those found in claim 1 above, and is therefore rejected by the same art and rationale. Additionally, Smith discloses a non-transitory computer-readable storage medium, the non-transitory computer-readable storage medium including instructions that, when executed by computers, cause the computers to perform claim functions (see Smith at least at col. 15, lines 34-67. See also col. 16, lines 1-36.).

Regarding claim 17, the combination of Smith and Castinado discloses the limitations of claim 16, as discussed above, and Smith further discloses implementing additional security requirements for the transaction includes limiting an amount of the transaction (As another example, a constraint may indicate that a transaction is to be delayed for a period of time (e.g., 24 hours) if it includes a purchase that exceeds a predetermined threshold amount (e.g., greater than $500) and if the user 104 is currently in a manic mental state. See at least col. 11, lines 35-39. See also col. 9, line 63 to col. 10, line 48.).

Regarding claim 18, the combination of Smith and Castinado discloses the limitations of claim 16, as discussed above, and Smith further discloses implementing additional security requirements for the transaction includes delaying the transaction (Other actions include delaying the transaction for a period of time, such as 24 hours. The user may then reconsider the transaction with a possibly clearer head. See at least col. 12, lines 23-28.).

Regarding claim 19, the combination of Smith and Castinado discloses the limitations of claim 16, as discussed above, and Smith further discloses implementing additional security requirements for the transaction includes implementing an additional security review of the transaction (Blocking the transaction if the user state is intoxicated. See at least FIG. 2, and see also col. 11, lines 26-39. Other actions include delaying the transaction for a period of time, such as 24 hours. The user may then reconsider the transaction with a possibly clearer head. See at least col. 12, lines 23-28. The Examiner interprets the user reconsidering the transaction at a later time with a clearer head as an additional security review.).

Regarding claim 20, the combination of Smith and Castinado discloses the limitations of claim 16, as discussed above, and Smith further discloses implementing additional security requirements for the transaction includes providing a notification to the user (Action(s) may include sending a notification to the user and/or other individual(s), describing the transaction and the user state that may indicate why the transaction may be a bad choice by the user given the user's financial plan, budget, or goals, or given the user's characteristics (e.g., gambling addiction). See at least col. 12, lines 29-56.).
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

US 12131325 B1 (“Lyle”) discloses account access security at a network access device. The method includes receiving data indicating that a particular user has initiated a particular request at an access point; obtaining sensor data that is generated by one or more sensors of the access point proximate to a time that the particular user initiated the particular request; classifying the particular request as a particular type of request; in response to determining that the particular request is classified as the particular type of request, initiating an exception processing mode in which requests that are initiated by the users at the access point result in the generation and output of inaccurate completion data to mimic completion of the request; processing the particular request using the exception processing mode; and providing the inaccurate completion data generated by the access point for the particular request.

US 11861645 B2 (“Bermudez”) discloses a system to receive biometric data and transaction data relating to a transaction, apply a model to the biometric data and the transaction data to determine an emotional state of the user during the transaction, and determine an action associated with the transaction based on the emotional state of the user during the performance of the transaction. Embodiments further include causing performance of the action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RAVEN E YONO, whose telephone number is (313) 446-6606. The examiner can normally be reached Monday-Friday, 8-5 PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Bennett M Sigmond, can be reached at (303) 297-4411. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RAVEN E YONO/
Primary Examiner, Art Unit 3694

Prosecution Timeline

Apr 03, 2024
Application Filed
Jun 18, 2025
Non-Final Rejection — §101, §103, §112
Sep 22, 2025
Response Filed
Sep 29, 2025
Final Rejection — §101, §103, §112
Nov 21, 2025
Response after Non-Final Action
Dec 11, 2025
Request for Continued Examination
Dec 20, 2025
Response after Non-Final Action
Feb 13, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12548022
SYSTEMS AND METHODS FOR EXECUTING REAL-TIME ELECTRONIC TRANSACTIONS USING API CALLS
2y 5m to grant • Granted Feb 10, 2026
Patent 12518276
SYSTEMS AND METHODS FOR SECURE TRANSACTION REVERSAL
2y 5m to grant • Granted Jan 06, 2026
Patent 12511637
METHOD, APPARATUS, AND DEVICE FOR ACCESSING AGGREGATION CODE PAYMENT PAGE, AND MEDIUM
2y 5m to grant • Granted Dec 30, 2025
Patent 12489647
SECURELY PROCESSING A CONTINGENT ACTION TOKEN
2y 5m to grant • Granted Dec 02, 2025
Patent 12481992
AUTHENTICATING A TRANSACTION
2y 5m to grant • Granted Nov 25, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 39%
With Interview: 72% (+32.5%)
Median Time to Grant: 2y 6m
PTA Risk: High
Based on 175 resolved cases by this examiner. Grant probability derived from career allow rate.
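
The with-interview figure is consistent with adding the interview lift to the baseline career allow rate. A one-line check using only numbers shown on this page (the baseline choice is an assumption, as noted above):

```python
baseline = 69 / 175   # career allow rate ≈ 0.394, displayed as 39%
lift = 0.325          # interview lift, in probability points
print(f"{baseline + lift:.0%}")  # prints "72%"
```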
