Prosecution Insights
Last updated: April 19, 2026
Application No. 18/386,848

DEPOSIT FRAUD DETECTION

Status: Non-Final OA (§101, §103)
Filed: Nov 03, 2023
Examiner: BUI, TOAN D.
Art Unit: 3693
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: FMR LLC
OA Round: 3 (Non-Final)

Grant Probability: 60% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 2y 4m
Grant Probability With Interview: 99%

Examiner Intelligence

Grants 60% of resolved cases.

Career Allow Rate: 60% (85 granted / 141 resolved; +8.3% vs TC avg)
Interview Lift: +44.6% (strong; allowance rate with vs. without interview, among resolved cases with interview)
Typical Timeline: 2y 4m average prosecution; 44 currently pending
Career History: 185 total applications across all art units

Statute-Specific Performance

§101: 40.7% (+0.7% vs TC avg)
§103: 41.2% (+1.2% vs TC avg)
§102: 1.5% (-38.5% vs TC avg)
§112: 5.5% (-34.5% vs TC avg)

Tech Center averages are estimates. Based on career data from 141 resolved cases.
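The headline examiner metrics above follow directly from the raw counts. As a quick sketch (note: the Tech Center average is back-derived here from the stated "+8.3% vs TC avg" delta; it is not reported directly anywhere above):

```python
# Sketch only: reproduces the dashboard's headline numbers from the raw counts.
# The Tech Center average is back-derived from the stated delta, not reported.

granted, resolved = 85, 141      # "85 granted / 141 resolved"
delta_vs_tc = 0.083              # "+8.3% vs TC avg"

allow_rate = granted / resolved
implied_tc_avg = allow_rate - delta_vs_tc

print(f"Career allow rate:  {allow_rate:.1%}")      # 60.3%
print(f"Implied TC average: {implied_tc_avg:.1%}")  # 52.0%
```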

Office Action

Rejections: §101, §103
DETAILED ACTION

This Office action is in reply to the request for continued examination filed on 02/02/2026. Claims 1, 6, 7, and 16 have been amended. Claims 8 and 17 were previously canceled. Claims 1-7, 9-16, and 18-20 have been examined. The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/02/2026 has been entered.

Response to Arguments

With regard to the §112 rejection: the Applicant has deleted the limitation that was subject to the rejection. Hence, the §112 rejection is withdrawn.

With regard to the §101 rejection: the arguments have been considered, but they are not persuasive. The Applicant asserted on page 6 that “[the] claims are necessarily rooted in computer technology, including characteristics related to user account logins, to overcome a problem specifically arising in the realm of computer networks”. However, the crux of the program is to analyze the user account for suspicious activities. The amended claim is directed to an abstract idea of tracking account openings and potential fraud, i.e., commercial interactions (Certain Methods of Organizing Human Activity). The Examiner also does not see the parallel between the claims of the instant application and those of DDR Holdings. In DDR Holdings, an improvement in web technology was used to address the problem of retaining web customers; DDR Holdings was solving a problem introduced by technology, such that it was a technological solution to a technological problem.
The Applicant's invention, in contrast, is a technological solution to a problem rooted in an abstract idea. The claims of the instant case employ a system, a server computing device, a processor, a memory, and a watchlist database suitably programmed to perform the claimed functions. In light of the Alice decision and the July 2015 Update of the Interim Guidance on Identifying Abstract Ideas, features such as “receiving user account information . . .”, “evaluating the user account information to assign a risk score . . .”, “determining a risk score greater than a threshold . . .”, and “analyzing the account for suspicious activities . . .” are not considered an improvement to another technology or technical field, or an improvement to the functioning of the computer itself. These features recited in the claim are only further refinements of the abstract idea; that does not change the fact that the claim is drawn to abstract ideas. There are no improvements to another technology or technical field, no improvements to the functioning of the computer itself, no transformation or reduction of a particular article to a different state or thing, and no other meaningful limitations beyond generally linking the use of an abstract idea to a particular technological environment as a result of performing the claimed method. As discussed earlier, the claimed steps of the method are all functions that are conventional for a computer system, which in the Applicant's invention comprises a storage device and a processor. The claimed sequence of steps comprises only "steps, specified at a high level of generality," which is insufficient to supply an "inventive concept." Id. at 2357 (quoting Mayo, 132 S. Ct. at 1294, 1297, 1300). Also, the addition of merely novel or non-routine components to the claimed idea does not necessarily turn an abstraction into something concrete (see Ultramercial, Inc. v. Hulu, LLC, _ F.3d _, 2014 WL 5904902 (Fed. Cir. Nov. 14, 2014)).
In Alice, the system was also specifically programmed to perform the claimed functions. Under the Step 2A, Prong Two analysis, the limitations are not indicative of integration into a practical application: they merely add the words “apply it” (or an equivalent) to the judicial exception, amount to mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Under the Step 2B analysis, the limitations are not indicative of an inventive concept (i.e., “significantly more”) for the same reasons; see MPEP 2106.05(f). Therefore, the claims are not patent eligible.

With regard to the §103 rejection: the arguments have been considered, but they are not persuasive. The Applicant amended the claim to incorporate analyzing the suspicious activities based on login incidents. However, the amended limitation is disclosed by the Paul reference. Hence, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the feature of analyzing suspicious login activities as taught by Paul with the invention disclosed by Calinog in view of Recce in further view of Clouthier to better activate functions to detect suspicious events (Abstract). Therefore, the combination is obvious.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-7, 9-16, and 18-20 are directed to a system, method, or product, which are among the statutory categories of invention (Step 1: YES). Claims 1-7, 9-16, and 18-20 are rejected under 35 U.S.C.
101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claims 1-7, 9-16, and 18-20 are directed to an abstract idea, a Certain Method of Organizing Human Activity. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional computer elements, which are recited at a high level of generality, provide generic computer functions that do not add meaningful limits to practicing the abstract idea.

Claims 1 and 16 are grouped together. Claim 16, for instance, recites a system for identifying a fraudulent electronic funds transfer (EFT), the system comprising a server computing device comprising a processor and a memory storing instructions that, when executed by the processor, cause the processor to perform the steps of: receiving, at a computing device, user account information for an account depositing funds using EFT; evaluating the user account information to assign a risk score to the user account; determining that the risk score is greater than a threshold risk score; adding the user account to a watchlist database for fraud detection; calculating a number of infractions for the added user account and, where the number of infractions is greater than a selected infraction threshold, marking the deposit for further review, wherein calculating the number of infractions includes analyzing the user account for suspicious activity and one or more of analyzing the user account information for a number of authentication infractions, analyzing user deposit records for suspicious activity, analyzing disbursement records for suspicious activity, analyzing user deposit records for deposits in bad order, and importing one or more third-party fraud risk scores, and wherein analyzing the user account for suspicious activity comprises identifying recent failed login attempts, changes to login methods, recent
high-risk logins by user, and geographical location of user login.

The limitations are directed to the concept of quantifying risk for delivered financial items (mitigating risk), which belongs to fundamental economic practices. Hence, it falls within the “Certain Methods of Organizing Human Activity” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.

This judicial exception is not integrated into a practical application. In particular, the claim only recites additional elements (a system, a server computing device, a processor, a memory, a watchlist database) at a high level of generality, such that they amount to no more than mere instructions to apply the exception using a generic computer component. In Alice Corp., the claim recited the concept of intermediated settlement as performed by a generic computer. The Court found that the recitation of the computer in the claim amounted to mere instructions to apply the abstract idea on a generic computer. 573 U.S. at 225-26, 110 USPQ2d at 1984. The Supreme Court also discussed this concept in an earlier case, Gottschalk v. Benson, 409 U.S. 63, 70, 175 USPQ 673, 676 (1972), where the claim recited a process for converting binary-coded-decimal (BCD) numerals into pure binary numbers. The Court found that the claimed process had no meaningful practical application except in connection with a computer. Benson, 409 U.S. at 71-72, 175 USPQ at 676. The claim simply stated a judicial exception (e.g., a law of nature or abstract idea) while effectively adding words to “apply it” in a computer. Id. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
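The claim limitations recited above amount to a simple screening pipeline: score the account, compare against a threshold, add to a watchlist, count infractions, and mark the deposit for review. A minimal sketch follows; every name, weight, and threshold in it is invented for illustration and is not drawn from the application itself:

```python
from dataclasses import dataclass

# Illustrative sketch of the claimed steps; all names, weights, and
# thresholds below are invented for the example.

RISK_THRESHOLD = 70          # stands in for the "threshold risk score"
INFRACTION_THRESHOLD = 3     # stands in for the "selected infraction threshold"

@dataclass
class UserAccount:
    account_id: str
    failed_logins: int = 0           # recent failed login attempts
    login_method_changes: int = 0    # changes to login methods
    high_risk_logins: int = 0        # recent high-risk logins by user
    geo_mismatch: bool = False       # geographical location of user login

watchlist: dict[str, UserAccount] = {}   # stand-in for the watchlist database

def risk_score(acct: UserAccount) -> int:
    # "evaluating the user account information to assign a risk score"
    return (25 * acct.high_risk_logins
            + 10 * acct.failed_logins
            + (20 if acct.geo_mismatch else 0))

def count_infractions(acct: UserAccount) -> int:
    # "analyzing the user account for suspicious activity"
    return ((1 if acct.failed_logins > 2 else 0)
            + acct.login_method_changes
            + acct.high_risk_logins
            + (1 if acct.geo_mismatch else 0))

def screen_deposit(acct: UserAccount) -> bool:
    """Return True when the deposit should be marked for further review."""
    if risk_score(acct) > RISK_THRESHOLD:            # score exceeds threshold
        watchlist[acct.account_id] = acct            # add to watchlist database
        if count_infractions(acct) > INFRACTION_THRESHOLD:
            return True                              # mark for further review
    return False

acct = UserAccount("A-1", failed_logins=4, login_method_changes=1,
                   high_risk_logins=2, geo_mismatch=True)
print(screen_deposit(acct))   # risk 110 > 70, infractions 5 > 3 -> True
```

The two-stage gate (watchlist first, infraction count second) mirrors the ordering of the claim steps: only accounts that clear the risk threshold are added to the watchlist, and only watchlisted accounts have deposits marked for review.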
Next, the claim as a whole is analyzed to determine whether any element, or combination of elements, is sufficient to ensure that the claim amounts to significantly more than an abstract idea. Claims 1 and 16 do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements merely perform the abstract idea on a generic device, i.e., the abstract idea plus “apply it”. There is no improvement to computer technology or computer functionality (MPEP 2106.05(a)), no particular machine (MPEP 2106.05(b)), and no particular transformation (MPEP 2106.05(c)). Given the above reasons, a generic processing device that assists with a funds-transfer transaction is not an inventive concept. Thus, the claim is not patent eligible.

The dependent claims have been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the dependent claims are also held to be patent ineligible under 35 U.S.C. 101 for the same reasoning as above, and the additional recited limitations fail to establish that the claims are not directed to an abstract idea. The additional limitations of the dependent claims, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea.

Dependent claim 2 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is held to be patent ineligible under 35 U.S.C. 101 because it recites the concept of assigning a risk weight, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea.
The additional limitations of dependent claim 2, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim only recites additional elements (such as a server computing device and a processor) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claim is not patent eligible.

Dependent claim 3 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is held to be patent ineligible under 35 U.S.C. 101 because it recites assigning a risk score, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim recites additional elements (such as an account application) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claim is not patent eligible.

Dependent claim 4 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is also held to be patent ineligible under 35 U.S.C.
101 because it recites mismatching information, identifying suspicious activity, and analyzing deposit records, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim recites additional elements (such as a server computing device and a processor) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claim is not patent eligible.

Dependent claim 5 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is held to be patent ineligible under 35 U.S.C. 101 because it recites mismatching information, identifying suspicious activity, and analyzing deposit records, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim recites additional elements (such as a server computing device and a processor) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claim is not patent eligible.
Dependent claim 6 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is held to be patent ineligible under 35 U.S.C. 101 because it recites mismatching information, identifying suspicious activity, and analyzing deposit records, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim recites additional elements (such as a server computing device and a processor) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claim is not patent eligible.

Dependent claim 7 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is held to be patent ineligible under 35 U.S.C. 101 because it recites analyzing the account for suspicious activity by identifying a geographic location, calculating a number of infractions, and entering a threshold, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea.
The additional limitations of dependent claim 7, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim recites additional elements (such as a processor, a server computing device, and a database) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claim is not patent eligible.

Dependent claim 9 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is held to be patent ineligible under 35 U.S.C. 101 because it recites analyzing the account for suspicious activity by identifying a geographic location, calculating a number of infractions, and entering a threshold, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim recites additional elements (such as a processor, a server computing device, and a database) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claim is not patent eligible.

Dependent claims 10 and 18 have been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination.
When analyzed both individually and in combination, the claims are held to be patent ineligible under 35 U.S.C. 101 because they recite repeating the calculating step as an automatic batch update on an hourly basis, and the additional recited limitations fail to establish that the claims are not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claims recite additional elements (such as a server computing device, a processor, and a database) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claims are not patent eligible.

Dependent claim 11 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is held to be patent ineligible under 35 U.S.C. 101 because it recites repeating the calculating step as an automatic batch update on an hourly basis, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim recites additional elements (such as a server computing device, a processor, and a database) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f).
Therefore, the claim is not patent eligible.

Dependent claims 12 and 19 have been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claims are held to be patent ineligible under 35 U.S.C. 101 because they recite removing the user account from a database when the metrics fall below a threshold, or when the account is inactive, and the additional recited limitations fail to establish that the claims are not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claims recite additional elements (such as a server computing device and a database) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claims are not patent eligible.

Dependent claims 13 and 20 have been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claims are held to be patent ineligible under 35 U.S.C. 101 because they recite removing the user account from a database when the metrics fall below a threshold, or when the account is inactive, and the additional recited limitations fail to establish that the claims are not directed to an abstract idea.
The additional limitations of dependent claims 13 and 20, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claims recite additional elements (such as a server computing device and a database) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claims are not patent eligible.

Dependent claim 14 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is held to be patent ineligible under 35 U.S.C. 101 because it recites marking the account for further review by sending it to a manual review and prompting for additional information, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim recites additional elements (such as an analyst, a server computing device, and a processor) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claim is not patent eligible.

Dependent claim 15 has been given the full two-part analysis (the Step 2A two-prong test and Step 2B), including analyzing the additional limitations both individually and in combination. When analyzed both individually and in combination, the claim is also held to be patent ineligible under 35 U.S.C.
101 because it recites marking the account for further review by sending it to a manual review and prompting for additional information, and the additional recited limitations fail to establish that the claim is not directed to an abstract idea. The additional limitations, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea because the claim recites additional elements (such as an analyst, a server computing device, and a processor) that add the words “apply it” (or an equivalent) to the judicial exception, are mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea; see MPEP 2106.05(f). Therefore, the claim is not patent eligible.

In sum, Claims 1-7, 9-16, and 18-20 are not drawn to eligible subject matter, as they are directed to an abstract idea without significantly more.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3.
Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the Examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the Examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-7, 9-10, 12, 14-16, and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Calinog et al. (US 2020/0034813 A1) in view of Recce et al. (US 2009/0248465 A1) in further view of Clouthier et al. (US 2009/0171795 A1) in further view of Paul et al. (US 2020/0014713 A1).

Claims 1 and 16 are grouped together. Claim 1, for instance, is disclosed as follows. Calinog teaches: A computerized method for identifying a fraudulent electronic funds transfer (EFT), the method comprising: receiving, at a computing device, user account information for an account depositing funds using EFT (Calinog, see at least par. [0056] “the payment terms circuit 124 is structured to initiate an electronic funds transfer (such as an automatic funds transfer transaction 154 from the payer funds account 144 to the payee funds account 142) based on consensus payment terms . . .” & par.
[0074]). Interpretation: the cited portion discloses receiving user account information (the payer information) for calculating the payment terms using electronic funds transfer; evaluating the user account information to assign a risk score to the user account (Calinog, see at least par. [0075] “. . . In some embodiments, the internal risk score is calculated based on a value of the attribute 162 of the payment terms pattern repository for a relevant subset of prior payment transactions between the payer 108 and various payees 106. For example, the internal risk score may be calculated by evaluating the transaction date against the due date to determine if the payer 108 habitually pays on time. Other examples include evaluating a number of non-sufficient funds (NSF) instances. In some embodiments, the internal risk score is pre-set for the payer 108 by the payee 106 or another entity. . . .”). Interpretation: the cited portion discloses evaluating the user account (payer) risk score; determining that the risk score for the user account is greater than a threshold risk score (Calinog, see at least par. [0075] “. . . In some embodiments, the internal risk score is pre-set for the payer 108 by the payee 106 or another entity. In some embodiments, the risk score is compared to the risk tolerance score of the payee 106 to ensure that the level of risk to the payee 106 is acceptable. If the level of risk exceeds the threshold, the payment terms circuit 124 may, for example, be structured to avoid proposing installment payments made over time as part of the consensus payment terms 148.”). Calinog does not disclose the following; however, Recce teaches: adding the user account to a watchlist database for fraud detection (Recce, par. [0143] & [0167]). The cited portion discloses adding an account to a watchlist based on infractions; calculating a number of infractions for the added user account and, when the number of infractions is greater than a selected infraction threshold (Recce, par.
[0102]): when infractions are greater than a threshold number, an alert is generated for further monitoring or review; marking the deposit for further review (Recce, par. [0127]). The cited portion discloses monitoring transactions based on the generation of infractions and alerts. It would have been obvious to one of ordinary skill in the art before the effective filing date to combine the features of adding the user account to a watchlist database and calculating a number of infractions as taught by Recce with the invention disclosed by Calinog to better determine risk and assign risk scores characterizing user accounts (Abstract). Therefore, the combination is obvious. Calinog in view of Recce does not disclose the following; however, Clouthier teaches: wherein calculating the number of infractions includes one or more of analyzing the user account information for a number of authentication infractions, analyzing user deposit records for suspicious activity, analyzing the user account for suspicious activity, analyzing user deposit records for deposits in bad order, and importing one or more third-party fraud risk scores (Clouthier, see at least par. [0046] “. . . the host computer 110 (and, in particular aspects, the authorization application 120) might be configured to evaluate authorization and/or preauthorization requests based both on contemporary data about the requested transaction and historical data about the relevant parties (e.g., the merchant, customer or other check writer, their respective financial institutions, and/or the like). Merely by way of example, a check processor (and, specifically the host computer 110 and/or authorization application 120) might use a variety of information in determining whether to approve or decline a preauthorization request. This information falls into several categories related to both the customer and the bank account on which the check has been written.
Examples of this type of information for the customer can include, without limitation, the number of checks presented in a certain time period, the amount of the check, the customer's personal check writing history, including any instances of suspected fraud, returned checks, and/or the like. Examples of information related to the presentation instrument can include, without limitation, recognizable bank account formatting, amount of the instrument, and/or past fraudulent use of the bank account on which the instrument is drawn.”) The cited portion discloses analyzing instances, or infractions, such as suspected fraud, returned checks, fraudulent use, and more.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of performing one or more of the infraction calculation methods as taught by Clouthier with the invention disclosed by Calinog in view of Recce to better provide assurance that the money transfer or deposited check will be authorized (Abstract). Therefore, the combination is obvious.

Calinog in view of Recce in further view of Clouthier does not disclose the following; however, Paul teaches:

and wherein analyzing the user account for suspicious activity comprises identifying recent failed login attempts, changes to login methods, recent high-risk logins by user, and geographical location of user login (Paul, see par. [0047] “. . . network events (e.g., network event 440), it may be determined which of the SSH failed login attempts are malicious. For example, observing a suspicious file transfer (network event 440) shortly after an SSH failed login (network event 410) may indicate that a hacker has correctly guessed the password and the SSH failed login is therefore malicious . . .” & par. [0049] “The GUI includes graph 910 and chart 920.
Graph 910 indicates detected network event 410 (malicious SSH failed logins) over time, and chart 920 indicates detected network event 440 (suspicious file transfers) from three sources (IP addresses 10.10.2.1, 10.10.2.2, and 10.10.2.3) to three suspect destinations (RogueCountry1, RogueCountry2, and RogueCountry3). A user may take corrective action based on the information conveyed by the GUI in FIG. 9. Additionally/alternatively, the network management device 115 may automatically take corrective action to address the security threat 400.”) Interpretation: a file transfer shortly after a failed login attempt corresponds to changes to login methods.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of analyzing suspicious login activities as taught by Paul with the invention disclosed by Calinog in view of Recce in further view of Clouthier to better activate functions to detect suspicious events (Abstract). Therefore, the combination is obvious.

Claim 2. Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: the computerized method of claim 1. Recce, furthermore, discloses: wherein calculating the number of infractions comprises assigning a weight to each of the infractions (Recce, see at least par. [0127] “Transaction monitor 134 provides facilities to manage the event weights associated with the generation of infractions and alerts. Transaction monitor 134 provides a test facility to allow different combinations of event weights to be run against a sample of events in order to test the impact of changes to the events . . .”).
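The event-weight mechanism cited from Recce par. [0127] can be sketched briefly. This is a hypothetical illustration only; the event names, weights, and threshold below are invented for the example and appear in neither the claims nor the cited reference.

```python
# Hypothetical sketch of weighted infraction scoring: each event type
# carries a configurable weight, and an alert fires when the weighted
# total exceeds a threshold. All names and values are illustrative.

EVENT_WEIGHTS = {
    "failed_login": 1.0,
    "returned_check": 2.5,
    "nsf_instance": 2.0,
    "suspected_fraud": 5.0,
}

def infraction_score(events, weights=EVENT_WEIGHTS):
    """Sum the weights of observed events; unknown events count as 0."""
    return sum(weights.get(e, 0.0) for e in events)

def should_alert(events, threshold=5.0):
    """Generate an alert when the weighted score exceeds the threshold."""
    return infraction_score(events) > threshold

events = ["failed_login", "returned_check", "nsf_instance"]
print(infraction_score(events))  # 5.5
print(should_alert(events))      # True
```

A test facility of the kind the quoted passage describes would simply rerun `should_alert` over a sample of stored events with alternative weight tables to gauge the impact of a change before deploying it.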
It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of calculating a number of infractions as taught by Recce with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better determine risk and assign risk scores characterizing user accounts. Therefore, the combination is obvious.

Claim 3. Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: The computerized method of claim 1. Calinog further teaches: wherein assigning a risk score to the user account comprises validating user-provided information submitted in an account application against independently obtained user information (Calinog, see at least par. [0044] “. . . The information obtained from the payee data source(s) 110 may include any of the information otherwise manually provided by the payee 106 and/or account balance(s), a risk score, etc. The risk score may include, for example, a credit score received in a data feed or by dynamically accessing a record of the payee 106 maintained by a credit reporting agency such as Experian™, TransUnion™, etc. . . .”) The cited portion discloses score evaluation based on information provided by the user and a score obtained from other, independent sources.

Claim 4. Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: The computerized method of claim 1. Recce teaches: wherein the authentication infractions comprise mismatches between user-provided information and corresponding independently obtained user information (Recce, par. [0055] “A CDD check is usually performed following an IDV check. The CDD check is performed by comparing details of a customer (e.g., name, address, date of birth, SSN, etc.) against a database of known individuals. The database of known individuals (or business entities) comprises details of `good` and `bad` individuals.
Details of good individuals may be held as a `cold` list. Details of bad individuals will be held in a `hot` list.”) The hotlist would contain accounts where the data attributes show a mismatch between stored data and presented data.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of comprising mismatches between provided and stored information as taught by Recce with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better determine risk and assign risk scores characterizing user accounts. Therefore, the combination is obvious.

Claim 5. Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: The computerized method of claim 1. Clouthier teaches: wherein analyzing the user deposit records for suspicious activity comprises one or more of identifying past returned checks and identifying failed deposits (Clouthier, see at least par. [0046] “. . . the host computer 110 (and, in particular aspects, the authorization application 120) might be configured to evaluate authorization and/or preauthorization requests based both on contemporary data about the requested transaction and historical data about the relevant parties (e.g., the merchant, customer or other check writer, their respective financial institutions, and/or the like). Merely by way of example, a check processor (and, specifically the host computer 110 and/or authorization application 120) might use a variety of information in determining whether to approve or decline a preauthorization request. This information falls into several categories related to both the customer and the bank account on which the check has been written.
Examples of this type of information for the customer can include, without limitation, the number of checks presented in a certain time period, the amount of the check, the customer's personal check writing history, including any instances of suspected fraud, returned checks, and/or the like. Examples of information related to the presentation instrument can include, without limitation, recognizable bank account formatting, amount of the instrument, and/or past fraudulent use of the bank account on which the instrument is drawn.”) The cited portion discloses analyzing instances, or infractions, such as suspected fraud, returned checks, fraudulent use, and more.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of performing one or more of the infraction calculation methods as taught by Clouthier with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better provide assurance that the money transfer or deposited check will be authorized. Therefore, the combination is obvious.

Claim 6. Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: The computerized method of claim 1. Calinog further teaches: wherein analyzing disbursement records for suspicious activity comprises one or more of identifying recent changes in outgoing payment instructions for the user and identifying disbursements occurring in close proximity to deposits (Calinog, see at least par. [0054] “. . . In some embodiments, the payment terms circuit 124 may be structured to use data analytics to determine if the business is making a payment out of context and send an alert to the business (e.g., because the payment may be fraudulent, etc.) . . .”) Interpretation: making a payment out of context corresponds to a change in outgoing payment instructions.

Claim 7.
Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: The computerized method of claim 1. However, Clouthier teaches: wherein analyzing the user account for suspicious activity comprises identifying recent changes in user account information, recent changes in user contact information, invalid contact information, geographic location of user contact information in a high risk area, geographic location of a user login in a high risk area, differences between geographical location of user contact information (Clouthier, see at least par. [0046] “. . . Other factors to be considered can include the history of the merchant in accepting bad checks, the nature and/or value of the goods/services provided in exchange for the check, the merchant's geographic location (e.g., in comparison with an address associated with the account on which the check is written), the nature or type of merchant and/or a typical amount on checks accepted by the merchant, etc.”) Interpretation: the identifying aspect includes identifying the geographic location of the attempted deposit.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of performing one or more of identifying incidents as taught by Clouthier with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better provide assurance that the money transfer or deposited check will be authorized. Therefore, the combination is obvious.

Claim 9. Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: The computerized method of claim 1.
Recce further teaches: further comprising setting parameters for the evaluating and calculating steps comprising one or more of entering the threshold risk score, entering a time-out period for removing a user account from the watchlist database, entering a threshold deposit amount required before marking the deposit for further review, entering a path to direct deposits for further review, and entering the selected infraction threshold (Recce, par. [0216]-[0218]). The cited paragraphs disclose calculating, entering, or providing the threshold risk score(s).

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of entering one or more of the risk thresholds as taught by Recce with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better determine risk and assign risk scores characterizing user accounts. Therefore, the combination is obvious.

Claims 10 and 18 are grouped together. Claim 10, for instance: Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: The computerized method of claim 1. However, Recce teaches: wherein the calculating step is automatically repeated periodically for each user account in the watchlist database (Recce, see at least par. [0044] “. . . KYC checks can be performed in 2 modes of operation: to perform assessment of a customer or account interactively, based on presented characteristics; or automatically on a batch basis where the presented characteristics of one or more customers or accounts are checked and a risk score, report or both risk score and report is produced for each without the immediate need for user interaction . . .” & see at least par. [0135] “. . . Hotlisting can also be configured to generate a report when a record matches elements of a hotlist, or where the degree of match is beyond a given threshold.
Hotlisting allows retrospective testing of new or changes to watchlists against historically stored reference or transaction data, where this data is held in data store 160.”) Batch analysis updates accounts automatically, and the risk score is part of a process that generates a report matching elements for hotlisting. Hotlisting thus effectuates changes to watchlists.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of performing the calculating steps automatically as taught by Recce with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better determine risk and assign risk scores characterizing user accounts. Therefore, the combination is obvious.

Claims 12 and 19 are grouped together. Claim 12, for instance: Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: The computerized method of claim 10. Recce further teaches: comprising removing the user account from the watchlist database where the number of infractions for the added user account is below a monitoring threshold (Recce, see at least par. [0200] “. . . Although not shown in FIG. 4A had the risk score exceeded a predetermined threshold an alert would automatically have been generated. Otherwise, and where this is not applied, then the risk score is passed to 410 for further processing.”) When the risk score is below the threshold, the user account is removed from the batch and sent for further processing.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of removing an account where the number of infractions is below a monitoring level as taught by Recce with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better determine risk and assign risk scores characterizing user accounts. Therefore, the combination is obvious.

Claim 14.
Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: the computerized method of claim 1. Furthermore, Recce teaches: wherein marking for further review comprises sending one or more of the user account information, the assigned risk score (Recce, par. [0102]; when infractions exceed a threshold number, an alert is generated for further monitoring or review), and the calculated number of infractions to an analyst for manual review (Recce, par. [0127]). The cited portion discloses monitoring transactions based on the generation of infractions and alerts.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of sending one or more of the user account information as taught by Recce with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better determine risk and assign risk scores characterizing user accounts. Therefore, the combination is obvious.

Claim 15. Calinog in view of Recce in further view of Clouthier in further view of Paul teaches: The computerized method of claim 1. Furthermore, Calinog teaches: wherein marking for further review comprises prompting the user for additional information (Calinog, see at least par. [0019] “. . . The GUI conveniently consolidates information on one form and provides an enriched user experience for information drill-down through on-demand expandable forms (such as pop-up calendar controls). On-demand expandable forms allow the user to quickly analyze and/or provide additional information without having to navigate away from the payment arrangement screen.”) The user can provide the additional information using the GUI.

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Calinog et al. (US 2020/0034813 A1) in view of Recce et al. (US 2009/0248465 A1) in further view of Clouthier et al.
(US 2009/0171795 A1) in further view of Paul et al. (US 2020/0014713 A1) in further view of Weinflash et al. (US 2018/0121975 A1).

Claim 11. Calinog in view of Recce in further view of Clouthier in further view of Paul discloses: The computerized method of claim 10. However, Weinflash teaches: wherein the calculating step is automatically repeated at least once per hour for each user account in the watchlist database (Weinflash, see at least par. [0061] “. . . For example, in many embodiments, each of financial institutions (e.g., 131-134) can provide overnight batch data to system 110, which can include information about whether accounts are open and in good status, what balances are available in the open accounts, whether accounts have had recent not sufficient funds (NSF) or other activity, and/or whether accounts have had a stop payment order. In a number of embodiments, system 110 can use the data provided by the financial institutions to provide fraud-prevention services to financial institutions (e.g., 131-134). For example, if a payee attempts to cash a check at financial institution 131 for a check drawn on a payor's account maintained at financial institution 132, financial institution 131 can inquire with system 110 about information regarding the payor's account at financial institution 132. . . .” & see at least [0502] “. . . The method can include receiving at least hourly updated account data comprising current statuses and current available balances of accounts maintained by one or more depository financial institutions . . .”) The batch is updated hourly to check for potentially fraudulent data.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of updating the batch on an hourly basis as taught by Weinflash with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better update the batch for account authentication.
Therefore, the combination is obvious.

Claims 13 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Calinog et al. (US 2020/0034813 A1) in view of Recce et al. (US 2009/0248465 A1) in further view of Clouthier et al. (US 2009/0171795 A1) in further view of Paul et al. (US 2020/0014713 A1) in further view of Smith et al. (US 2019/0394231 A1).

Claims 13 and 20 are grouped together. Claim 13, for instance: Calinog in view of Recce in further view of Clouthier in further view of Paul discloses: The computerized method of claim 1. However, Smith teaches: further comprising removing the user account from the watchlist database after a selected period of user inactivity (Smith, par. [0077]). The cited portion discloses removing inactive accounts, which are unsuspicious accounts; suspicious accounts would be analyzed since there would be activity in those accounts.

It would be obvious to one of ordinary skill in the art before the effective filing date to combine the features of removing an inactive account from the suspicious list as taught by Smith with the invention disclosed by Calinog in view of Recce in further view of Clouthier in further view of Paul to better update the batch analysis to remove unsuspicious accounts. Therefore, the combination is obvious.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TOAN DUC BUI, whose telephone number is (571) 272-0833. The examiner can normally be reached M-F, 8:00 AM-5:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michael W.
Anderson, can be reached at (571) 270-0508. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TOAN DUC BUI/
Examiner, Art Unit 3693

/ELIZABETH H ROSEN/
Primary Examiner, Art Unit 3693
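Stepping back from the reference-by-reference mapping, the claim 1 limitations at issue trace a single pipeline: assign a risk score, compare it to a threshold, add high-risk accounts to a watchlist, count infractions, and mark deposits for review when the count exceeds a threshold. The sketch below illustrates only that flow; the field names, scoring rule, and thresholds are invented for the example and come from neither the application nor the cited references.

```python
# Hypothetical end-to-end sketch of the claimed deposit fraud flow:
# risk score -> threshold check -> watchlist -> infraction count -> review.
# Thresholds, field names, and the scoring rule are illustrative assumptions.

watchlist = {}  # account_id -> list of recorded infractions

def assign_risk_score(account):
    """Toy scoring: each NSF instance and late payment adds risk."""
    return 10 * account.get("nsf_count", 0) + 5 * account.get("late_payments", 0)

def evaluate_account(account, risk_threshold=20):
    """Score the account; watchlist it when the score exceeds the threshold."""
    score = assign_risk_score(account)
    if score > risk_threshold:
        watchlist.setdefault(account["id"], [])
    return score

def record_infraction(account_id, infraction, infraction_threshold=3):
    """Count infractions for watchlisted accounts; flag deposits for review."""
    if account_id not in watchlist:
        return False
    watchlist[account_id].append(infraction)
    return len(watchlist[account_id]) > infraction_threshold

acct = {"id": "A1", "nsf_count": 2, "late_payments": 1}
print(evaluate_account(acct))  # 25 -> exceeds 20, so A1 is watchlisted
for inf in ["failed_login", "returned_check", "bad_order", "geo_mismatch"]:
    flagged = record_infraction("A1", inf)
print(flagged)  # True (4 infractions > 3)
```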

Prosecution Timeline

Nov 03, 2023
Application Filed
Jun 12, 2025
Non-Final Rejection — §101, §103
Sep 16, 2025
Response Filed
Oct 28, 2025
Final Rejection — §101, §103
Feb 02, 2026
Request for Continued Examination
Feb 24, 2026
Response after Non-Final Action
Mar 03, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12400213
TEMPORARY DEBIT CARD SYSTEM AND METHOD
2y 5m to grant Granted Aug 26, 2025
Patent 12361435
REDUCING FALSE POSITIVE FRAUD ALERTS FOR ONLINE FINANCIAL TRANSACTIONS
2y 5m to grant Granted Jul 15, 2025
Patent 12340362
TWO-DIMENSIONAL CODE COMPATIBILITY SYSTEM
2y 5m to grant Granted Jun 24, 2025
Patent 12333519
SECURE QR CODE BASED DATA TRANSFERS
2y 5m to grant Granted Jun 17, 2025
Patent 12314940
CURRENCY MANAGEMENT SYSTEM AND ELECTRONIC SIGNATURE DEVICE
2y 5m to grant Granted May 27, 2025
Based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
60%
Grant Probability
99%
With Interview (+44.6%)
2y 4m
Median Time to Grant
High
PTA Risk
Based on 141 resolved cases by this examiner. Grant probability derived from career allow rate.
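The headline figures in this panel reduce to simple ratios over the examiner's resolved cases; a minimal sketch recomputing the career allow rate from the counts shown above (the with/without-interview split behind the +44.6% lift is not broken out on this page, so it is omitted):

```python
# Recompute the career allow rate from the counts shown on this page:
# 85 granted out of 141 resolved cases.
granted, resolved = 85, 141
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.0%}")  # Career allow rate: 60%
```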
