Prosecution Insights
Last updated: April 19, 2026
Application No. 17/703,107

COMPUTER SYSTEM SECURITY VIA DEVICE NETWORK PARAMETERS

Final Rejection — §101

Filed: Mar 24, 2022
Examiner: OJIAKU, CHIKAODINAKA
Art Unit: 3696
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Paypal Inc.
OA Round: 4 (Final)

Grant Probability: 45% (Moderate); 54% with interview
Expected OA Rounds: 5-6
Time to Grant: 3y 3m

Examiner Intelligence

Career Allow Rate: 45% (207 granted / 456 resolved; -6.6% vs TC avg)
Interview Lift: +8.2% (moderate; resolved cases with interview vs. without)
Typical Timeline: 3y 3m average prosecution; 46 applications currently pending
Career History: 502 total applications across all art units
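The headline examiner metrics above are simple ratios over the docket history. As a sanity check, they can be recomputed from the counts shown on this page; the TC-average value below is an assumption inferred from the reported -6.6% delta, and all variable names are illustrative:

```python
# Illustrative recomputation of the examiner metrics shown above.
# The counts (207 granted, 456 resolved) come from this page itself;
# the TC average is an assumed value implied by the -6.6% delta.

granted = 207
resolved = 456

career_allow_rate = granted / resolved          # 0.4539... -> ~45%
tc_average = 0.52                               # assumed: 45% reported as -6.6% vs TC avg
delta_vs_tc = career_allow_rate - tc_average    # ~ -0.066

interview_lift = 0.082                          # +8.2% reported lift
with_interview = career_allow_rate + interview_lift  # ~0.536 -> ~54%

print(f"Career allow rate: {career_allow_rate:.1%}")
print(f"With interview:    {with_interview:.1%}")
```

The rounded figures match the dashboard: 207/456 rounds to 45%, and adding the 8.2% interview lift lands at roughly 54%.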

Statute-Specific Performance

§101: 35.1% (-4.9% vs TC avg)
§103: 31.7% (-8.3% vs TC avg)
§102: 7.1% (-32.9% vs TC avg)
§112: 17.0% (-23.0% vs TC avg)
Tech Center averages are estimates. Based on career data from 456 resolved cases.
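Each statute row pairs the examiner's rate with a delta against the Tech Center average, so the implied TC baseline is recoverable by subtraction (rate − delta). A small illustrative sketch using the figures above; the data structure is invented for this example:

```python
# Recover the implied Tech Center average for each statute from the
# examiner rate and the reported delta (TC average = rate - delta).
# Percentages are taken from the table above; the dict layout is illustrative.

examiner_stats = {
    "§101": (35.1, -4.9),
    "§103": (31.7, -8.3),
    "§102": (7.1, -32.9),
    "§112": (17.0, -23.0),
}

implied_tc_avg = {
    statute: round(rate - delta, 1)
    for statute, (rate, delta) in examiner_stats.items()
}
# e.g. §101: 35.1 - (-4.9) = 40.0% implied TC average
print(implied_tc_avg)
```

Interestingly, every row implies the same ~40% baseline, suggesting the Tech Center average shown here is a single estimate rather than a per-statute figure.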

Office Action

§101
DETAILED ACTION

Status of the Claims

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is in response to an Amendment dated August 29, 2025. Claim 1 is cancelled. Claims 2, 9 and 16 are amended. Claims 2-21 are pending. All pending claims are examined.

Response to Arguments — 101 Rejection Analysis

In line with the "2019 Revised Patent Subject Matter Eligibility Guidance," which explains how we must analyze patent-eligibility questions under the judicial exception to 35 U.S.C. § 101, 84 Fed. Reg. 50-57 ("Revised Guidance"), the first step of Alice (i.e., Office Step 2A) consists of two prongs. In Prong One, we must determine whether the claim recites a judicial exception, i.e., an abstract idea, a law of nature, or a natural phenomenon. 84 Fed. Reg. at 54 (Section III.A.1.). If it does not, the claim is patent eligible. Id. An abstract idea must fall within one of the enumerated groupings of abstract ideas in the Revised Guidance or be a "tentative abstract idea," with the latter situation predicted to be rare. Id. at 51-52 (Section I, enumerating three groupings of abstract ideas), 54 (Section III.A.1., describing Step 2A Prong One), 56-57 (Section III.D., explaining the identification of claims directed to a tentative abstract idea). If a claim does recite a judicial exception, the next step is Step 2A Prong Two, in which we must determine whether the "claim as a whole integrates the recited judicial exception into a practical application of the exception." Id. at 54 (Section III.A.2.). If it does, the claim is patent eligible. Id. If a claim recites a judicial exception but fails to integrate it into a practical application, we move to the second step of Alice (i.e., Office Step 2B) to evaluate the additional limitations of the claim, both individually and as an ordered combination, to determine whether they provide an inventive concept. Id. at 56 (Section III.B.).
In particular, we look to whether the claim:
• Adds a specific limitation or combination of limitations that are not well-understood, routine, conventional activity in the field, which is indicative that an inventive concept may be present; or
• Simply appends well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, which is indicative that an inventive concept may not be present.

The analysis is in line with current § 101 guidelines. Even if the abstract idea is deemed to be novel, the abstract idea is no less abstract (see Flook — a new mathematical formula was held to be an abstract idea). "In accordance with judicial precedent and in an effort to improve consistency and predictability, the 2019 Revised Patent Subject Matter Eligibility Guidance extracts and synthesizes key concepts identified by the courts as abstract ideas to explain that the abstract idea exception includes the following groupings of subject matter, when recited as such in a claim limitation(s) (that is, when recited on their own or per se): (b) Certain methods of organizing human activity—fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions)" (footnote 1). See Federal Register / Vol. 84, No. 4 / Monday, January 7, 2019, p. 52.

Step 1: The claims are directed to one or more of the following statutory categories: a process, a machine, a manufacture, and a composition of matter.
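The two-step Alice framework the examiner walks through above is, structurally, a three-gate decision procedure. A minimal illustrative sketch of that flow (the boolean inputs stand in for legal conclusions, which are of course not mechanically computable):

```python
# Illustrative decision flow of the Alice/Mayo test as framed in the
# 2019 Revised Guidance (Office Step 2A Prong One, Prong Two, Step 2B).
# The inputs represent legal determinations, not computable facts.

def eligible_under_101(recites_judicial_exception: bool,
                       integrates_practical_application: bool,
                       provides_inventive_concept: bool) -> bool:
    # Step 2A, Prong One: no judicial exception recited -> eligible.
    if not recites_judicial_exception:
        return True
    # Step 2A, Prong Two: integration into a practical application -> eligible.
    if integrates_practical_application:
        return True
    # Step 2B: additional elements must supply an inventive concept.
    return provides_inventive_concept

# The examiner's position in this action, expressed in these terms:
print(eligible_under_101(True, False, False))  # -> False (claims rejected)
```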
Claim 16, which is illustrative of independent claims 2 and 9, recites: “A method, comprising: accessing, by a computer system, first computer network address origin information corresponding to a computer network address of a user computing device; making a determination, by the computer system based on the first computer network address origin information, as to whether the computer network address of the user computing device corresponds to a virtual private network (VPN) on which a digital account is being utilized by the users computing device for one or more VPN transactions; analyzing device data corresponding to the user computing device to determine a user agent identifier corresponding to specific software installed on the user computing device; based on the determination as to whether the computer network address of the user computing device corresponds to a virtual private network (VPN) and based on the user agent identifier corresponding to the specific software installed on the user computing device, determining whether the VPN was used to hide the user computing device as a source of determining that identity information for a user associated with the user computing device was further hidden by the VPN when processing the one or more VPN transactions using the digital account; determining a first likelihood that a fraudulent user utilized the user computing device or the digital account based on the determining that the identity information was hidden; generating, using an intelligent scoring engine, and wherein the intelligent scoring engine comprises a plurality of weights usable to weight a plurality of factors for generating the first tier threshold score; monitoring, over a period of time, a plurality of specific user actions made via the digital account, performing a user identity analysis of the user based on the plurality of specific user actions; generating, using the intelligent scoring engine, a second-tier threshold score for the digital 
account based on the performing the user identity analysis; based on the second-tier threshold score indicating a threshold likelihood of the account engaging in the one or more specific behaviors that are prohibited by the terms of service applicable to the digital account, the computer system restricting at least one or the one or more specific functionalities for the digital account using enabling at least one of the one or more specific functionalities for the digital account based on a verification of the identity information. configuring a plurality of thresholds associated with the intelligent scoring engine and usable to determine at least the first tier threshold score and the second tier threshold score based on the account security restriction and the verification of the identity information; and updating the plurality of weights of the intelligent scoring engine based on a plurality of the account assessments associated with the plurality of thresholds."

Step 2A, Prong One: Taking the broadest reasonable interpretation, the invention is directed to a method of organizing human activity, which is a fundamental economic practice of fraud analysis (see App. Spec. paras. 0010-0013). Determining a risk profile that is provided to the relying party based on predefined conditions, whereby there is an evaluation of information based on the existing data as recited in the claims, is nothing more than gathering data and applying a set of instructions to the data. These limitations describe steps a person would take in escalating the review or evaluation of an account based on certain indicators or thresholds that suggest engagement in fraudulent behavior based on predefined criteria (see App. Spec. paras. 0015-0016 – escalated review tiers of accounts in which a variety of factors are evaluated to determine potential for fraudulent activity).
Beyond the abstract idea, the additional elements recite hardware components such as a user computing device; there does not appear to be any technology being improved. They are described at a high level of generality where each step does no more than require a generic computer to perform generic computer functions. Absent is any support in the specification that the claims as recited require specialized computer hardware or other inventive computer components (see App. Spec. paras. 0055-0056, 0071). The innovation as claimed appears to be directed to the user's objective of evaluating the risk associated with an account based on one or more factors of the account's creation and/or usage (see App. Spec. 0014-0015), rather than the integration of a practical application. Applicant claims the use of a VPN as a way of masking malicious activity; however, the invention as claimed suggests the source of digital account usage activity (transaction processing and other online activities), including use of a VPN, as one of the factors used in the assessment.
Further, the invention as claimed recites: "generating, using an intelligent scoring engine, and wherein the intelligent scoring engine comprises a plurality of weights usable to weight a plurality of factors for generating the first tier threshold score; monitoring, over a period of time, a plurality of specific user actions made via the digital account, performing a user identity analysis of the user based on the plurality of specific user actions; generating, using the intelligent scoring engine, a second-tier threshold score for the digital account based on the performing the user identity analysis; based on the second-tier threshold score indicating a threshold likelihood of the account engaging in the one or more specific behaviors that are prohibited by the terms of service applicable to the digital account, the computer system restricting at least one or the one or more specific functionalities for the digital account using enabling at least one of the one or more specific functionalities for the digital account based on a verification of the identity information. configuring a plurality of thresholds associated with the intelligent scoring engine and usable to determine at least the first tier threshold score and the second tier threshold score based on the account security restriction and the verification of the identity information; and updating the plurality of weights of the intelligent scoring engine based on a plurality of the account assessments associated with the plurality of thresholds." This suggests an evaluation process in the form of scoring using scoring system weights, thresholds or other assessment data, using a baseline based on fraudulently detected accounts and accounts determined to be valid (see App. Spec. para. 0022; Fig. 4). Unlike McRO, the present claims contain improvements to the context in which the assessment for fraud is made, and not to a technology or technological field.
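The scoring-engine limitations quoted above describe, in algorithmic terms, a two-tier weighted scoring flow: weight a set of risk factors into a first-tier score, monitor account actions, fold them into a second-tier score, restrict or enable functionality against configured thresholds, and update the weights from assessment outcomes. A minimal, purely illustrative sketch of that flow (all factor names, weights, and thresholds below are hypothetical, not taken from the application):

```python
# Hypothetical sketch of the claimed two-tier "intelligent scoring engine".
# All weights, factors, and thresholds are invented for illustration only.

weights = {"vpn_detected": 0.5, "identity_hidden": 0.3, "user_agent_mismatch": 0.2}

def first_tier_score(factors: dict) -> float:
    """Weighted sum of binary risk factors (the claimed first-tier score)."""
    return sum(weights[name] * float(present) for name, present in factors.items())

def second_tier_score(tier1: float, flagged_actions: int, total_actions: int) -> float:
    """Fold monitored account activity into the first-tier score."""
    activity_risk = flagged_actions / max(total_actions, 1)
    return 0.6 * tier1 + 0.4 * activity_risk

TIER2_RESTRICT_THRESHOLD = 0.5  # one of the claim's "plurality of thresholds"

factors = {"vpn_detected": True, "identity_hidden": True, "user_agent_mismatch": False}
t1 = first_tier_score(factors)                                   # 0.5 + 0.3 = 0.8
t2 = second_tier_score(t1, flagged_actions=3, total_actions=10)  # 0.6*0.8 + 0.4*0.3 = 0.6

restrict_account = t2 >= TIER2_RESTRICT_THRESHOLD  # True -> restrict functionality

# The claim's final step: update weights based on assessment outcomes
# (here, a trivial nudge when the restriction proved correct).
if restrict_account:
    weights["vpn_detected"] = min(1.0, weights["vpn_detected"] * 1.05)
```

The sketch is only meant to make the claim's structure concrete; the examiner's point is that this flow, however implemented, applies predefined evaluation rules rather than improving the computer itself.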
Evaluating the data from the different sources to identify the type of information and its sources suggests making evaluations based on predefined criteria. In particular, there is a lack of improvement to a computer or technical field of assessing fraud because the data processing performed merely uses a system as a tool to perform an abstract idea (see MPEP 2106.05(f)). Therefore, the claims are directed to an abstract idea. The invention as claimed recites a generic computer component and the claim does not pass Step 2A, Prong Two.

Step 2B: The next step is to identify any additional limitations beyond the judicial exception. The additional element is a user computing device, which is disclosed in the specification at a high degree of generality (see App. Spec. paras. 0055-0056, 0071). Absent is any genuine issue of material fact that this component requires any specialized hardware or inventive computer component. Likewise, the dependent claims 3-8, 10-15 and 17-21 are rejected under 35 U.S.C. § 101. For example, claims 17-18 provide descriptive material of the information related to the digital account activity and analyzing it to make certain determinations based on conditions or rules used in the fraudulent activity scoring process. These claim limitations recite steps at a high level of generality, performed in a traditional manner, and therefore do not integrate the abstract idea into a practical application or provide an inventive concept. Independent claims 2, 9 and 16 are rejected under 35 U.S.C. § 101, including dependent claims 3-8, 10-15 and 17-21 which fall with claims 2-21. Therefore, claims 2-21 are not patent eligible under 35 U.S.C. § 101.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 2-21 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (abstract idea) without significantly more. The claim recites the abstract idea of organizing human activities. This judicial exception is not integrated into a practical application and the claim(s) does/do not include additional elements that are sufficient to amount to significantly more than the judicial exception.

Analysis

The claims are directed to one or more of the following statutory categories: a process, a machine, a manufacture, and a composition of matter. Claim 16, which is illustrative of independent claims 2 and 9, recites: “A method, comprising: accessing, by a computer system, first computer network address origin information corresponding to a computer network address of a user computing device; making a determination, by the computer system based on the first computer network address origin information, as to whether the computer network address of the user computing device corresponds to a virtual private network (VPN) on which a digital account is being utilized by the users computing device for one or more VPN transactions; analyzing device data corresponding to the user computing device to determine a user agent identifier corresponding to specific software installed on the user computing device; based on the determination as to whether the computer network address of the user computing device corresponds to a virtual private network (VPN) and based on the user agent identifier corresponding to the specific software installed on the user computing device, determining whether the VPN was used to hide the user computing device as a source of determining that identity information for a user associated with the user computing device was further hidden by the VPN when processing the one or more VPN transactions using the digital account; determining a first likelihood that a fraudulent user utilized the user computing device or the digital
account based on the determining that the identity information was hidden; generating, using an intelligent scoring engine, and wherein the intelligent scoring engine comprises a plurality of weights usable to weight a plurality of factors for generating the first tier threshold score; monitoring, over a period of time, a plurality of specific user actions made via the digital account, performing a user identity analysis of the user based on the plurality of specific user actions; generating, using the intelligent scoring engine, a second-tier threshold score for the digital account based on the performing the user identity analysis; based on the second-tier threshold score indicating a threshold likelihood of the account engaging in the one or more specific behaviors that are prohibited by the terms of service applicable to the digital account, the computer system restricting at least one or the one or more specific functionalities for the digital account using enabling at least one of the one or more specific functionalities for the digital account based on a verification of the identity information. configuring a plurality of thresholds associated with the intelligent scoring engine and usable to determine at least the first tier threshold score and the second tier threshold score based on the account security restriction and the verification of the identity information; and updating the plurality of weights of the intelligent scoring engine based on a plurality of the account assessments associated with the plurality of thresholds. 
The invention as claimed is a method of organizing human activity that is a fundamental economic practice of fraud analysis, and in particular recites: “…making a determination, by the computer system based on the first computer network address origin information, as to whether the computer network address of the user computing device corresponds to a virtual private network (VPN) on which a digital account is being utilized by the users computing device for one or more VPN transactions; analyzing device data corresponding to the user computing device to determine a user agent identifier corresponding to specific software installed on the user computing device; based on the determination as to whether the computer network address of the user computing device corresponds to a virtual private network (VPN) and based on the user agent identifier corresponding to the specific software installed on the user computing device, determining whether the VPN was used to hide the user computing device as a source of: determining that identity information for a user associated with the user computing device was further hidden by the VPN when processing the one or more VPN transactions using the digital account; determining a first likelihood that a fraudulent user utilized the user computing device or the digital account based on the determining that the identity information was hidden;..” This is an abstract idea wherein an account is evaluated or monitored and scored for fraudulent activity based on pre-defined rules. It can also be considered a mental process practically performable in the human mind, since it entails making comparisons of data, albeit with the help of a computer. Besides reciting the abstract idea, the remaining claim limitations recite generic computer components (e.g., computer network, computing device). This recited abstract idea is not integrated into a practical application because, in particular, the claim only recites generic computer components (e.g.
computing device; see App. Spec. paras. 0055-0056, 0071) to access/determine/analyze based on the activity associated with the account. The additional elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements (e.g., computing device) amount to no more than mere instructions to apply the abstract idea using generic computer components. Dependent claims 3-8, 10-15 and 17-21 provide additional details about the types of data collected. For example, claims 17-18 provide additional descriptions of the type of data or other predefined rules applied to the evaluation of an account for fraudulent activity, and do not address the issues raised in the independent claims; they therefore do not amount to a technical improvement or an integration into a practical application. In conclusion, merely “applying” the exception using generic computer components cannot provide an inventive concept. Therefore, claims 2-21 are not patent eligible under 35 USC 101.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Peram et al., US Application No. 20170098219 (Systems And Methods For Fraudulent Account Detection And Management); Vandezande et al., US Application No. 20190295090 (Method And Apparatuses For Fraud Handling).

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHIKA OJIAKU, whose telephone number is (571) 270-3608. The examiner can normally be reached Monday - Friday, 8:30 AM - 5:00 PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Matthew Gart, can be reached at (571) 272-3955. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHIKAODINAKA OJIAKU/
Primary Examiner, Art Unit 3696

Footnote 1: Interval Licensing, 896 F.3d at 1344–45 (concluding that "[s]tanding alone, the act of providing someone an additional set of information without disrupting the ongoing provision of an initial set of information is an abstract idea," and observing that the district court "pointed to the nontechnical human activity of passing a note to a person who is in the middle of a meeting or conversation as further illustrating the basic, longstanding practice that is the focus of the [patent ineligible] claimed invention."); Voter Verified, Inc. v. Election Systems & Software, LLC, 887 F.3d 1376, 1385 (Fed. Cir. 2018) (finding the concept of "voting, verifying the vote, and submitting the vote for tabulation," a "fundamental activity" that humans have performed for hundreds of years, to be an abstract idea); In re Smith, 815 F.3d 816, 818 (Fed. Cir. 2016) (concluding that "[a]pplicants' claims, directed to rules for conducting a wagering game," are abstract).

Footnote 14: If a claim, under its broadest reasonable interpretation, covers performance in the mind but for the recitation of generic computer components, then it is still in the mental processes category unless the claim cannot practically be performed in the mind. See Intellectual Ventures I LLC v. Symantec Corp., 838 F.3d 1307, 1318 (Fed. Cir. 2016) ("[W]ith the exception of generic computer-implemented steps, there is nothing in the claims themselves that foreclose them from being performed by a human, mentally or with pen and paper."); Mortg. Grader, Inc. v. First Choice Loan Servs. Inc., 811 F.3d 1314, 1324 (Fed. Cir. 2016) (holding that a computer-implemented method for "anonymous loan shopping" was an abstract idea because it could be "performed by humans without a computer"); Versata Dev. Grp. v. SAP Am., Inc., 793 F.3d 1306, 1335 (Fed. Cir. 2015) ("Courts have examined claims that required the use of a computer and still found that the underlying, patent-ineligible invention could be performed via pen and paper or in a person's mind."); CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1375, 1372 (Fed. Cir. 2011) (holding that the incidental use of a "computer" or "computer readable medium" does not make a claim otherwise directed to a process that "can be performed in the human mind, or by a human using a pen and paper" patent eligible); id. at 1376 (distinguishing Research Corp. Techs. v. Microsoft Corp., 627 F.3d 859 (Fed. Cir. 2010), and SiRF Tech., Inc. v. Int'l Trade Comm'n, 601 F.3d 1319 (Fed. Cir. 2010), as directed to inventions that "could not, as a practical matter, be performed entirely in a human's mind"). Likewise, performance of a claim limitation using generic computer components does not necessarily preclude the claim limitation from being in the mathematical concepts grouping, Benson, 409 U.S. at 67, or the certain methods of organizing human activity grouping, Alice, 573 U.S. at 219–20. See Federal Register / Vol. 84, No. 4 / Monday, January 7, 2019.

Prosecution Timeline

Mar 24, 2022
Application Filed
Sep 24, 2024
Non-Final Rejection — §101
Dec 04, 2024
Interview Requested
Dec 24, 2024
Response Filed
Feb 07, 2025
Final Rejection — §101
Mar 14, 2025
Interview Requested
May 12, 2025
Request for Continued Examination
May 14, 2025
Response after Non-Final Action
May 28, 2025
Non-Final Rejection — §101
Aug 05, 2025
Interview Requested
Aug 28, 2025
Response Filed
Dec 11, 2025
Final Rejection — §101 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597078
ASSEMBLING PARAMETERS TO COMPUTE TAXES FOR CROSS-BORDER SALES
2y 5m to grant Granted Apr 07, 2026
Patent 12597079
ASSEMBLING PARAMETERS TO COMPUTE TAXES FOR CROSS-BORDER SALES
2y 5m to grant Granted Apr 07, 2026
Patent 12597029
System, Method, and Computer Program Product for Authenticating a Transaction
2y 5m to grant Granted Apr 07, 2026
Patent 12567109
Lean Level Support for Trading Strategies
2y 5m to grant Granted Mar 03, 2026
Patent 12443940
SYSTEM AND METHOD FOR PROVIDING AN AUGMENTED PERSONAL MESSAGE
2y 5m to grant Granted Oct 14, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 45% (54% with interview, +8.2%)
Median Time to Grant: 3y 3m
PTA Risk: High
Based on 456 resolved cases by this examiner. Grant probability derived from career allow rate.
