Prosecution Insights
Last updated: April 19, 2026
Application No. 17/895,235

SYSTEMS AND METHODS FOR RECOMMENDING INSURANCE

Non-Final OA: §101, §112
Filed: Aug 25, 2022
Examiner: HAMILTON, SARA CHANDLER
Art Unit: 3695
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Royal Bank Of Canada
OA Round: 5 (Non-Final)
Grant Probability: 64% (Moderate)
OA Rounds: 5-6
To Grant: 3y 9m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 64% (grants 64% of resolved cases; 321 granted / 500 resolved; +12.2% vs TC avg)
Interview Lift: +53.3% (strong lift among resolved cases with an interview)
Typical Timeline: 3y 9m avg prosecution; 35 currently pending
Career History: 535 total applications across all art units
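As a sanity check, the headline allow-rate and interview-lift arithmetic above can be reproduced. The with/without-interview allowance rates below are hypothetical stand-ins chosen to produce the reported lift, since the report gives only the lift itself, not the underlying split:

```python
def interview_lift(rate_with: float, rate_without: float) -> float:
    """Relative improvement (in %) in allowance rate when an interview is held."""
    return round((rate_with / rate_without - 1) * 100, 1)

# Career allow rate: 321 granted out of 500 resolved cases.
allow_rate = round(321 / 500 * 100, 1)   # 64.2, shown as 64% in the report

# Hypothetical with/without-interview split yielding the reported +53.3% lift.
lift = interview_lift(0.92, 0.60)
```

Note the lift is a ratio of rates, not a difference: a 0.60 baseline and a 0.92 with-interview rate gives 0.92/0.60 - 1 = +53.3%.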

Statute-Specific Performance

§101: 30.9% (-9.1% vs TC avg)
§103: 27.7% (-12.3% vs TC avg)
§102: 8.7% (-31.3% vs TC avg)
§112: 24.5% (-15.5% vs TC avg)
(Black line = Tech Center average estimate. Based on career data from 500 resolved cases.)
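The "vs TC avg" deltas above are mutually consistent with a uniform Tech Center estimate of roughly 40% per statute. That 40% figure is an inference, not something the report states; assuming it, the column can be reproduced:

```python
# Examiner's per-statute rates from the report, plus an assumed uniform
# Tech Center average of 40% (inferred from the deltas, not stated).
examiner = {"101": 30.9, "103": 27.7, "102": 8.7, "112": 24.5}
TC_AVG = 40.0

# Reproduce the "vs TC avg" column: examiner rate minus Tech Center average.
delta = {statute: round(rate - TC_AVG, 1) for statute, rate in examiner.items()}
```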

Office Action

Rejections: §101, §112
DETAILED ACTION

Response to Amendment

This Office Action is responsive to Applicant’s request for continued examination of application 17/895,235 (08/25/22) filed on 11/10/25 and applicant’s arguments as filed 08/12/25 and 10/10/25.

Claim Objections

Claims 24, 25 and 26 are objected to because of the following informalities:

Claim 24 recites, “A computer implemented method of recommending insurance policies comprising:” This should be -- A [[DELETE computer implemented]] method of recommending insurance policies comprising: -- or something similar. To avoid undue interpretation, consistent terminology should be used.

Claims 24, 25 and 26 recite, “applying at least one of the one or more trained policy recommendation models to the PNA data collected from an insurance customer comprising the first subset of PNA data and the second subset of PNA data to generate a recommendation of one or more insurance policies each of the insurance policies including a policy type and policy amount,” This should be -- applying at least one of the one or more trained policy recommendation models to the PNA data collected from [[DELETE an]] the insurance customer comprising the first subset of PNA data and the second subset of PNA data to generate a recommendation of one or more insurance policies each of the insurance policies including a policy type and policy amount [[DELETE,]]; -- or something similar. NOTE: Amending the limitation “collecting by a computing device personal needs assessment (PNA) data comprising:” to recite -- collecting by a computing device personal needs assessment (PNA) data from an insurance customer comprising: -- provides further clarity to the claim. Terms referenced earlier in the claim should be preceded by terms such as “the” or “said” to clarify the prior reference. Typo.
Claims 24, 25 and 26 recite, “mapping the one or more features identified by the policy explainability model to a human-understandable explanation of the policy recommendation; outputting the generated policy recommendation and the human-understandable explanation of the policy recommendation for presentation to the insurance customer;” This should be -- mapping the one or more features identified by the policy explainability model to a human-understandable explanation of the [[DELETE policy]] recommendation; outputting the generated [[DELETE policy]] recommendation and the human-understandable explanation of the [[DELETE policy]] recommendation for presentation to the insurance customer; -- or something similar. To avoid undue interpretation, consistent terminology should be used.

Claim 25 recites, “collecting by a computing device personal needs assessment (PNA) data comprising”; “receiving at the computing device a first subset of the PNA data in response to the presented first set of questions”; and “receiving at the computing device a second subset of the PNA data in response to the presented second set of questions”. This should be -- collecting [[DELETE by a computing device]] personal needs assessment (PNA) data comprising --; -- receiving [[DELETE at the computing device]] a first subset of the PNA data in response to the presented first set of questions --; and -- receiving [[DELETE at the computing device]] a second subset of the PNA data in response to the presented second set of questions --. The claim is already directed to “A non-transitory computer readable medium storing instructions which when executed by a processor of a computing device configure the computing device to perform …..”. The language “by a computing device” and “at the computing device” is redundant, as all positively recited steps or acts, as claimed, are already required to be performed by the “computing device”.
Claim 26 recites, “collecting by a computing device personal needs assessment (PNA) data comprising”; “receiving at the computing device a first subset of the PNA data in response to the presented first set of questions”; and “receiving at the computing device a second subset of the PNA data in response to the presented second set of questions”. This should be -- collecting [[DELETE by a computing device]] personal needs assessment (PNA) data comprising --; -- receiving [[DELETE at the computing device]] a first subset of the PNA data in response to the presented first set of questions --; and -- receiving [[DELETE at the computing device]] a second subset of the PNA data in response to the presented second set of questions --. The claim is already directed to “A computing device comprising: a processor for executing instructions; and a memory storing instructions which when executed by the processor configure the computing device to perform …..”. The language “by a computing device” and “at the computing device” is redundant, as all positively recited steps or acts, as claimed, are already required to be performed by the “computing device”.

Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 2 - 7, 10 - 20 and 24 - 26 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.

ALICE/MAYO: TWO-PART ANALYSIS

2A. First, a determination whether the claim is directed to a judicial exception (i.e., abstract idea).

Prong 1: A determination whether the claim recites a judicial exception (i.e., abstract idea). Groupings of abstract ideas enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance:

Mathematical concepts: mathematical relationships, mathematical formulas or equations, mathematical calculations.

Certain methods of organizing human activity: fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions).

Mental processes: concepts performed in the human mind (including an observation, evaluation, judgment, opinion).

Prong 2: A determination whether the judicial exception (i.e., abstract idea) is integrated into a practical application. Considerations indicative of integration into a practical application enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance:

An improvement to the functioning of a computer, or an improvement to any other technology or technical field.

Applying or using a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition.

Applying the judicial exception with, or by use of, a particular machine.

Effecting a transformation or reduction of a particular article to a different state or thing.

Applying or using the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.

Considerations that are not indicative of integration into a practical application enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance:

Merely reciting the words “apply it” (or an equivalent) with the judicial exception, merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea.

Adding insignificant extra-solution activity to the judicial exception.

Generally linking the use of the judicial exception to a particular technological environment or field of use.

2B. Second, a determination whether the claim provides an inventive concept (i.e., whether the claim(s) include additional elements, or combinations of elements, that are sufficient to amount to significantly more than the judicial exception (i.e., abstract idea)). Considerations indicative of an inventive concept (aka “significantly more”) enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance:

An improvement to the functioning of a computer, or an improvement to any other technology or technical field.

Applying the judicial exception with, or by use of, a particular machine.

Effecting a transformation or reduction of a particular article to a different state or thing.

Applying or using the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. NOTE: The only consideration that does not overlap with the considerations indicative of integration into a practical application associated with Step 2A, Prong 2.

Considerations that are not indicative of an inventive concept (aka “significantly more”) enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance:

Merely reciting the words “apply it” (or an equivalent) with the judicial exception, merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea.

Adding insignificant extra-solution activity to the judicial exception.

Generally linking the use of the judicial exception to a particular technological environment or field of use.

Simply appending well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception. NOTE: The only consideration that does not overlap with the considerations that are not indicative of integration into a practical application associated with Step 2A, Prong 2.

See also, 2019 Revised Patent Subject Matter Eligibility Guidance; Federal Register, Vol. 84, No. 4; Monday, January 7, 2019.

Claims 2 - 7, 10 - 20 and 24 - 26 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.

1: Statutory Category

Applicant’s claimed invention, as described in independent claim 24, is directed to a process (i.e., a method).

2(A): The claims are directed to a judicial exception (i.e., an abstract idea).

PRONG 1: The claims recite a judicial exception (i.e., an abstract idea).

Certain Methods of Organizing Human Activity

The claim as a whole recites a method of organizing human activity.
The claimed invention involves collecting personal needs assessment (PNA) data comprising: presenting the insurance customer with a first set of questions; receiving a first subset of the PNA data in response to the presented first set of questions; inputting the received first subset of the PNA data to one or more trained policy recommendation models, each of the trained policy recommendation models generating a policy recommendation based on a plurality of respective features in the input PNA data; determining feature importance of the one or more trained policy recommendation models providing the partial -recommendation; determining a second set of questions from a plurality of PNA questions based on the determined feature importance; presenting the insurance customer with the second set of PNA questions; and receiving a second subset of the PNA data in response to the presented second set of questions; applying at least one of the one or more trained policy recommendation models to the PNA data collected from an insurance customer comprising the first subset of PNA data and the second subset of PNA data to generate a recommendation of one or more insurance policies each of the insurance policies including a policy type and policy amount; applying the generated recommendation of the one or more insurance policies to a policy explainability model to identify one or more of the plurality of features of the respective trained policy recommendation models that led to the generated recommendation; mapping the one or more features identified by the policy explainability model to a human-understandable explanation of the policy recommendation; outputting the generated policy recommendation and the human-understandable explanation of the policy recommendation for presentation to the insurance customer; and determining a future action for contacting the insurance customer in the future comprising: predicting a probability that a plurality of lifestage milestones will occur within a given set of time; predicting a persona type of the insurance customer; predicting future insurance needs of the insurance customer based on the predicted probability that the plurality of lifestage milestones will occur and the predicted persona type; determining a difference between current insurance of the insurance customer and future insurance needs; and based on the determined difference, generating a contact action associated with the insurance customer to take at a future time.

This falls within fundamental economic principles or practices (recommending insurance policies); commercial or legal interactions (recommending insurance policies); and managing personal behavior or relationships or interactions between people (collecting, presenting, receiving, inputting, determining, applying, mapping, outputting, predicting, generating). The mere nominal recitation of “computer implemented” (preamble only) and “a computing device” does not take the claim out of the methods of organizing human activity grouping. Thus, the claim recites an abstract idea.
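For orientation, the adaptive-questioning steps recited in the claims (determining feature importance from a trained model, then determining the second set of questions from it) can be sketched roughly as follows. The "model" here is a hypothetical linear scorer whose absolute weights serve as feature importances; the question bank, weights, and feature names are all illustrative assumptions, as the claims do not specify a model type:

```python
# Hypothetical second-round question bank (keys are model feature names).
QUESTION_BANK = {
    "dependents": "How many dependents do you have?",
    "mortgage": "Do you carry a mortgage?",
    "smoker": "Do you smoke?",
}

# Stand-in for a trained policy recommendation model: a linear scorer.
WEIGHTS = {"age": 0.2, "income": 0.5, "dependents": 0.9, "mortgage": 0.7, "smoker": 0.1}

def feature_importance(weights: dict) -> dict:
    """For a linear scorer, |weight| is a simple importance proxy."""
    return {feature: abs(w) for feature, w in weights.items()}

def second_question_set(weights: dict, first_subset: dict, k: int = 2) -> list:
    """Pick the k highest-importance features not already answered."""
    importance = feature_importance(weights)
    unanswered = [f for f in QUESTION_BANK if f not in first_subset]
    return sorted(unanswered, key=lambda f: importance[f], reverse=True)[:k]

first_subset = {"age": 41, "income": 85_000}   # answers to the first question set
next_questions = second_question_set(WEIGHTS, first_subset)
# next_questions == ["dependents", "mortgage"]: the highest-importance gaps
```

The design point the claims turn on is that the second question set is driven by model feature importance rather than a fixed script.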
Mental Processes

The claim recites limitations directed to collecting personal needs assessment (PNA) data comprising: presenting the insurance customer with a first set of questions; receiving a first subset of the PNA data in response to the presented first set of questions; inputting the received first subset of the PNA data to one or more trained policy recommendation models, each of the trained policy recommendation models generating a policy recommendation based on a plurality of respective features in the input PNA data; determining feature importance of the one or more trained policy recommendation models providing the partial -recommendation; determining a second set of questions from a plurality of PNA questions based on the determined feature importance; presenting the insurance customer with the second set of PNA questions; and receiving a second subset of the PNA data in response to the presented second set of questions; applying at least one of the one or more trained policy recommendation models to the PNA data collected from an insurance customer comprising the first subset of PNA data and the second subset of PNA data to generate a recommendation of one or more insurance policies each of the insurance policies including a policy type and policy amount; applying the generated recommendation of the one or more insurance policies to a policy explainability model to identify one or more of the plurality of features of the respective trained policy recommendation models that led to the generated recommendation; mapping the one or more features identified by the policy explainability model to a human-understandable explanation of the policy recommendation; outputting the generated policy recommendation and the human-understandable explanation of the policy recommendation for presentation to the insurance customer; and determining a future action for contacting the insurance customer in the future comprising: predicting a probability that a plurality of lifestage milestones will occur within a given set of time; predicting a persona type of the insurance customer; predicting future insurance needs of the insurance customer based on the predicted probability that the plurality of lifestage milestones will occur and the predicted persona type; determining a difference between current insurance of the insurance customer and future insurance needs; and based on the determined difference, generating a contact action associated with the insurance customer to take at a future time.

The limitations, as drafted, are a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind. That is, other than reciting “computer implemented” (preamble only) and “a computing device”, nothing in the claim elements precludes the steps from practically being performed in the mind. In other words, the claim encompasses the user mentally performing each of the recited collecting, presenting, receiving, inputting, determining, applying, mapping, outputting, predicting and generating steps.

NOTE: The claimed invention is exclusively from the perspective of “a computing device”, but the “computing device” only performs some of the positively recited steps or acts (e.g., “collecting”, “receiving”). There is also a nominal recitation of “computer implemented” (preamble only). The mere nominal recitation of “computer implemented” (preamble only) and “a computing device” does not take the claim limitations out of the mental processes grouping. These limitations recite a mental process. Thus, the claim recites an abstract idea.
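The explanation-mapping and future-action steps the claims recite can likewise be sketched. The templates, coverage amounts, and gap threshold below are hypothetical assumptions introduced for illustration only; the claims specify the steps, not any particular mapping or threshold:

```python
# Hypothetical templates mapping model features to readable explanation text.
EXPLANATION_TEMPLATES = {
    "dependents": "you support {value} dependents",
    "mortgage": "you carry a mortgage",
}

def map_explanation(top_features: list, answers: dict) -> str:
    """Map features flagged by an explainability model to readable text."""
    parts = [EXPLANATION_TEMPLATES[f].format(value=answers.get(f))
             for f in top_features if f in EXPLANATION_TEMPLATES]
    return "Recommended because " + " and ".join(parts) + "."

def future_contact_action(current_coverage: int, predicted_need: int,
                          threshold: int = 50_000) -> dict:
    """Generate a future contact action when the projected coverage gap is material."""
    gap = predicted_need - current_coverage
    return {"schedule_contact": gap > threshold, "coverage_gap": gap}

explanation = map_explanation(["dependents", "mortgage"], {"dependents": 2})
action = future_contact_action(current_coverage=100_000, predicted_need=250_000)
# action flags a 150,000 coverage gap and schedules a future contact
```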
PRONG 2: The judicial exception (i.e., an abstract idea) is not integrated into a practical application.

The claim recites the additional element of “computer implemented” (preamble only), and the additional element of “a computing device” performing some of the positively recited steps or acts required of the claimed invention (e.g., “collecting”, “receiving”). The additional elements are recited at a high level of generality (i.e., as a generic computer performing the generic computer functions of (a) data receipt/transmission (e.g., the “collecting”, “receiving”, “inputting”, “outputting”, etc. steps as claimed); (b) data display (e.g., the “presenting”, etc. steps as claimed); and (c) data processing (e.g., the “determining”, “applying”, “mapping”, “predicting”, “generating”, etc. steps as claimed)). The additional elements are also recited at a high level of generality as general means of gathering information for recommending insurance policies, and amount to mere data gathering, which is a form of insignificant extra-solution activity. The “computing device” that is used to perform the steps or acts is likewise recited at a high level of generality, and merely automates the steps. The “computing device” limitations are no more than mere instructions to apply the exception using generic computer components.

Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limitations on practicing the abstract idea. The claim is directed to an abstract idea. Since the claims recite a judicial exception and fail to integrate the judicial exception into a practical application, the claims are “directed to” the judicial exception. Thus, the claims must be reviewed under the second step of the Alice/Mayo analysis to determine whether the abstract idea has been applied in an eligible manner.
2(B): The claims do not provide an inventive concept (i.e., the claims do not include additional elements, or combinations of elements, that are sufficient to amount to significantly more than the judicial exception (i.e., abstract idea)).

As discussed with respect to Step 2A, Prong Two, the additional elements in the claim amount to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in Step 2B; i.e., mere instructions to apply an exception using a generic computer component cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept at Step 2B.

Furthermore, the additional elements under Step 2A, Prong 2 have been evaluated in Step 2B to determine whether they are more than what is well-understood, routine, conventional activity in the field. Applicant’s specification as filed 08/25/22 does not provide any indication that the “computer” is anything other than a generic, off-the-shelf computer component. Furthermore, the prosecution history of the instant application provides McCormick, US Pat. No. 11,538,076; Shiu, US Pub. No. 2021/0295427; and Jain, US Pat. No. 11,456,080, operating in a similar environment, suggesting that performing tasks such as (a) data receipt/transmission (e.g., the “collecting”, “receiving”, “inputting”, “outputting”, etc. steps as claimed); (b) data display (e.g., the “presenting”, etc. steps as claimed); and (c) data processing (e.g., the “determining”, “applying”, “mapping”, “predicting”, “generating”, etc. steps as claimed) is well-understood, routine and conventional.

Furthermore, the courts have recognized that computer functions or tasks analogous to those claimed, such as (a) data receipt/transmission, (b) data display, and (c) data processing, are well-understood, routine and conventional. The Symantec, TLI, OIP Techs. and buySAFE decisions cited in MPEP § 2106.05(d)(II) indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as here). SAP America Inc. v. InvestPic, LLC, 890 F.3d 1016 (Fed. Cir. 2018) (displaying and disseminating financial information) and Intellectual Ventures I LLC v. Capital One Bank (USA) (advanced internet interface providing user display access of customized web pages) indicate that displaying information is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as here). The Flook and Bancorp decisions cited in MPEP § 2106.05(d)(II) indicate that performing repetitive calculations is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as here). Accordingly, a conclusion that the additional elements are well-understood, routine, conventional activity is supported under Berkheimer.

For these reasons, there is no inventive concept in the claim, and thus the claim is ineligible. Dependent claims 2 - 7 and 10 - 20 are rejected as ineligible subject matter under 35 U.S.C. 101 based on a rationale similar to the claims from which they depend. Alice Corp. also establishes that the same analysis should be used for all categories of claims (e.g., product and process claims). Therefore, independent non-transitory computer readable medium claim 25 and independent computing device claim 26 are also rejected as ineligible subject matter under 35 U.S.C. 101 for substantially the same reasons as the method claims.
The components (i.e., the “non-transitory computer readable medium” and “a processor of a computing device”) described in independent non-transitory computer readable medium claim 25, and the components (i.e., “a processor” and “a memory”) described in independent computing device claim 26, add nothing of substance to the underlying abstract idea. At best, the products (non-transitory computer readable medium, computing device) recited in the claims merely provide an environment in which to implement the abstract idea.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 24, 25 and 26 (and claims 2 - 7 and 10 - 20 based on their dependency) are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 24, 25 and 26 (and claims 2 - 7 and 10 - 20 based on their dependency) recite the limitation “presenting the insurance customer with the second set of PNA questions;”. There is insufficient antecedent basis for “the insurance customer” in the claim. NOTE: Amending the limitation “collecting by a computing device personal needs assessment (PNA) data comprising:” to recite -- collecting by a computing device personal needs assessment (PNA) data from an insurance customer comprising: -- appears to cure this problem.
Claims 24, 25 and 26 (and claims 2 - 7 and 10 - 20 based on their dependency) recite the limitation “determining feature importance of the one or more trained policy recommendation models providing the partial -recommendation;”. There is insufficient antecedent basis for “the one or more trained policy recommendation models providing the partial -recommendation” in the claim. NOTE: Although there appears to be antecedent basis for “the one or more trained policy recommendation models” generally, there is no antecedent basis for “the one or more trained policy recommendation models providing the partial -recommendation” specifically.

NOTE: The specific language used is not required, but is intended as an aid to the applicant in overcoming one or more of the objections and/or rejections noted in this Office Action. Alternative language may be proposed. Please indicate where support may be found in the specification for any amendments made.

Response to Arguments

§ 101

Applicant's arguments have been fully considered but they are not persuasive.

(1) Applicant argues the claimed invention is not directed to a judicial exception (i.e., an abstract idea). Applicant’s claimed invention is directed to an abstract idea.

Certain Methods of Organizing Human Activity

The claimed invention is directed to certain methods of organizing human activity. Fundamental economic principles or practices relate to the economy and commerce. The claimed invention encompasses fundamental economic principles or practices as it relates to insurance (e.g., recommending insurance policies). This interpretation is consistent with the prosecution history of the instant application. For example, para. [0002] of applicant’s specification as filed 08/25/22 states:

[0002] The current disclosure relates to insurance products and in particular to computer implemented systems and methods for providing insurance recommendations.

See also, at least claims 24, 25 and 26 as filed (10/10/25).
The claimed invention encompasses commercial or legal interactions. The claimed invention relates to insurance (e.g., recommending insurance policies). Insurance, in the instant scenario, pertains to agreements in the form of “contracts” (e.g., between parties such as the “insurance customer” as claimed), “legal obligations” (i.e., because it imposes contractual obligations enforceable by law), and “business relations”. The claimed invention also encompasses managing personal behavior or relationships or interactions (e.g., collecting, presenting, receiving, inputting, determining, applying, mapping, outputting, predicting, generating): for example, filtering content (e.g., “determining a second set of questions from a plurality of PNA questions …..”) and considering historical information (e.g., “the first subset of PNA data”). See also, MPEP § 2106.04(a)(2)(II).

Mental Processes

The claimed invention is directed to mental processes. The claimed invention encompasses observations, evaluations, judgments and opinions (e.g., “outputting the generated policy recommendation and the human-understandable explanation of the policy recommendation …..”; “determining a future action for contacting the insurance customer in the future …..”). Contrary to applicant’s arguments, the courts do not distinguish between mental processes that are performed entirely in the human mind and mental processes that require a human to use a physical aid. Similarly, the courts do not distinguish between claims that recite mental processes performed by humans and claims that recite mental processes performed on a computer.
Whether applicant is describing claims 2 - 7, 10 - 20 and 24, in which some of the positively recited steps or acts occur on a computer (e.g., “collecting”, “receiving”), or claims 25 - 26, which suggest the steps or acts occur on a computer (e.g., “computing device”), nothing forecloses applicant’s claimed invention from being performed by a human, and thus applicant’s claimed invention is still directed to a mental process. See also, MPEP §2106.04(a)(2)(III). (2) Applicant argues the judicial exception (i.e., an abstract idea) is integrated into a practical application. Applicant suggests the claimed invention presents a “practical application” because it provides improvements to the functioning of a computer, or to any other technology or technical field (e.g., “improve the data collection process”. See pg. 2 of applicant’s arguments/remarks as filed 08/12/24; “improved question generation process”. See pg. 2 of applicant’s arguments/remarks as filed 08/12/24; “improves the functioning of the computer system as it eliminates the need to train different models for determining next questions”. See pg. 3 of applicant’s arguments/remarks as filed 08/12/24; “improve the accuracy and relevance of insurance recommendations”. See pg. 3 of applicant’s arguments/remarks as filed 08/12/24; “improves a computer-implemented recommendation pipeline by imposing concrete, model-driven controls on data acquisition, explanation generation, and future action selection”. See pg. 11 of applicant’s arguments/remarks as filed 10/10/25; “improving the functioning of the computer-implemented recommendation pipeline”. See pg. 12 of applicant’s arguments/remarks as filed 10/10/25); and provides a technical solution to a technical problem (e.g., “provide a specific technical solution to the problem of efficiently collecting and utilizing PNA data for insurance recommendations.” See pg. 
3 of applicant’s arguments/remarks as filed 08/12/25; “addresses a concrete technical challenge in model interpretability and system-level trust, improving the functioning of the computer-implemented recommendation pipeline.” See pg. 12 of applicant’s arguments/remarks as filed 10/10/25). The Examiner disagrees. Applicant’s arguments suggesting the claimed invention provides improvements to the functioning of a computer, or to any other technology or technical field, and provides a technical solution to a technical problem, suggest the applicant believes the technical aspects of the invention are substantial. There exist alternative perspectives, however. The “improvements” and “solutions” applicant suggests are really just the benefits of automation itself (e.g., “efficiency”, “accuracy”, “relevance”. See pgs. 3 and 12 of applicant’s arguments/remarks as filed 10/10/25). There is nothing in the claimed invention that provides any indication regarding how the alleged technology is used or why it would be necessary. Adding the words “apply it” (or an equivalent) with the judicial exception is not indicative of integration into a practical application. See also, MPEP §2106.05(f). Merely using a computer as a tool to perform an abstract idea, and mere instructions to implement an abstract idea on a computer, are not indicative of integration into a practical application. See also, MPEP §2106.05(f). 
The role of the device is limited to necessary data gathering and outputting (e.g., “collecting by a computing device personal needs assessment (PNA) data comprising: …”; “receiving at the computing device a first subset of the PNA data in response to the presented first set of questions; inputting the received first subset of the PNA data to one or more trained policy recommendation models, each of the trained policy recommendation models generating a policy recommendation based on a plurality of respective features in the input PNA data”; “receiving at the computing device a second subset of the PNA data in response to the presented second set of questions”; and “outputting the generated policy recommendation and the human-understandable explanation of the policy recommendation for presentation to the insurance customer” steps as claimed). Adding insignificant extra-solution activity to the judicial exception is not indicative of integration into a practical application. See also, MPEP §2106.05(g). 
Collecting information (e.g., “collecting by a computing device personal needs assessment (PNA) data comprising: …”; “receiving at the computing device a first subset of the PNA data in response to the presented first set of questions; inputting the received first subset of the PNA data to one or more trained policy recommendation models, each of the trained policy recommendation models generating a policy recommendation based on a plurality of respective features in the input PNA data”; and “receiving at the computing device a second subset of the PNA data in response to the presented second set of questions;”); analyzing it (e.g., “determining feature importance of the one or more trained policy recommendation models providing the partial recommendation; determining a second set of questions from a plurality of PNA questions based on the determined feature importance”; “applying at least one of the one or more trained policy recommendation models to the PNA data collected from an insurance customer comprising the first subset of PNA data and the second subset of PNA data to generate a recommendation of one or more insurance policies each of the insurance policies including a policy type and policy amount; applying the generated recommendation of the one or more insurance policies to a policy explainability model to identify one or more of the plurality of features of the respective trained policy recommendation models that led to the generated recommendation; mapping the one or more features identified by the policy explainability model to a human-understandable explanation of the policy recommendation”; and “determining a future action for contacting the insurance customer in the future comprising: predicting a probability that a plurality of lifestage milestones will occur within a given set of time; predicting a persona type of the insurance customer; predicting future insurance needs of the insurance customer based on the predicted probability that the 
plurality of lifestage milestones will occur and the predicted persona type; determining a difference between current insurance of the insurance customer and future insurance needs; and based on the determined difference, generating a contact action associated with the insurance customer to take at a future time.”); and displaying certain results of the collection and analysis (e.g., “presenting the insurance customer with a first set of questions”; “presenting the insurance customer with the second set of PNA questions”; and “outputting the generated policy recommendation and the human-understandable explanation of the policy recommendation for presentation to the insurance customer;”) merely indicates a field of use or technical environment in which to apply the judicial exception. Generally linking the use of the judicial exception to a particular technological environment or field of use is not indicative of integration into a practical application. See also, MPEP §2106.05(h). (3) Applicant argues the claimed invention provides an inventive concept (i.e., that the claim(s) include additional elements, or combinations of elements, that are sufficient to amount to significantly more than the judicial exception (i.e., abstract idea)). Applicant argues there is a “non-generic, non-conventional use”. As discussed with respect to Step 2A Prong Two, the additional element(s) in the claim amounts to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in Step 2B, i.e., mere instructions to apply an exception using a generic computer component cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B. Furthermore, the additional element(s) identified under Step 2A Prong Two have been evaluated in Step 2B to determine if they are more than what is well-understood, routine, conventional activity in the field. 
Applicant’s specification as filed 08/25/22 does not provide any indication that the “computer” is anything other than a generic, off-the-shelf computer component. Furthermore, the prosecution history of the instant application provides McCormick, US Pat. No. 11,538,076; Shiu, US Pub. No. 2021/0295427; and Jain, US Pat. No. 11,456,080 operating in a similar environment, suggesting that performing tasks such as (a) data receipt/transmission (e.g., “collecting”, “receiving”, “inputting”, “outputting”, etc. step(s) as claimed); (b) data display (e.g., “presenting”, etc. step(s) as claimed); and (c) data processing (e.g., “determining”, “applying”, “mapping”, “predicting”, “generating”, etc. step(s) as claimed) is well understood, routine and conventional. Furthermore, the courts have recognized that computer functions or tasks analogous to those claimed, such as (a) data receipt/transmission, (b) data display, and (c) data processing, are well understood, routine and conventional. The Symantec, TLI, OIP Techs. and buySAFE court decisions cited in MPEP § 2106.05(d)(II) indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as here). SAP America, Inc. v. InvestPic, LLC, 890 F.3d 1016, 126 USPQ2d 1638 (Fed. Cir. 2018) (displaying and disseminating financial information) and Intellectual Ventures I LLC v. Capital One Bank (USA) (advanced internet interface providing user display access of customized web pages) indicate displaying information is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as here). 
The Flook and Bancorp court decisions cited in MPEP § 2106.05(d)(II) indicate performing repetitive calculations is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as here). Accordingly, a conclusion that the additional elements are well-understood, routine, conventional activity is supported under Berkheimer. For these reasons, there is no inventive concept in the claim, and thus the claim is ineligible. Dependent claims 2 - 7 and 10 - 20 are rejected as ineligible subject matter under 35 U.S.C. 101 based on a rationale similar to the claims from which they depend. Alice Corp. also establishes that the same analysis should be used for all categories of claims (e.g., product and process claims). Therefore, independent non-transitory computer readable medium claim 25 and independent computing device claim 26 are also rejected as ineligible subject matter under 35 U.S.C. 101 for substantially the same reasons as the method claims. The component(s) (i.e., “non-transitory computer readable medium” and “a processor of a computing device”) described in independent non-transitory computer readable medium claim 25 and the component(s) (i.e., “a processor” and “a memory”) described in independent computing device claim 26 add nothing of substance to the underlying abstract idea. At best, the product(s) (non-transitory computer readable medium, computing device) recited in the claim(s) are merely providing an environment to implement the abstract idea. (4) Applicant argues Example 47, claim 3. NOTE: Although the claims refer to “one or more trained policy recommendation models” and “policy explainability model”, applicant’s claimed invention is not directed to “machine learning” as referenced in applicant’s specification as filed 08/25/22 or discussed in the July 2024 Subject Matter Eligibility Examples. 
The facts associated with the claimed invention are more aligned with Example 47, claim 2 from the July 2024 Subject Matter Eligibility Examples, which was found to be ineligible. For example, the claimed invention refers to the steps or acts being performed “by a computing device” and “at the computing device”. This language is very similar to the “at a computer” and “by the computer” language recited in Example 47, claim 2. In Example 47, claim 2 this language was considered to be recited at a high level of generality, i.e., as a generic computer performing generic computer functions. For example, the claimed invention refers to “inputting the received first subset of the PNA data to one or more trained policy recommendation models, each of the trained policy recommendation models generating a policy recommendation based on a plurality of respective features in the input PNA data; determining feature importance of the one or more trained policy recommendation models providing the partial recommendation”; and “applying at least one of the one or more trained policy recommendation models to the PNA data collected from an insurance customer comprising the first subset of PNA data and the second subset of PNA data to generate a recommendation of one or more insurance policies each of the insurance policies including a policy type and policy amount; applying the generated recommendation of the one or more insurance policies to a policy explainability model to identify one or more of the plurality of features of the respective trained policy recommendation models that led to the generated recommendation; mapping the one or more features identified by the policy explainability model to a human-understandable explanation of the policy recommendation;”. This language is very similar to the “using the trained ANN” and “outputting the anomaly data from the trained ANN” language recited in Example 47, claim 2. 
In Example 47, claim 2 this language was determined not to provide any details about how the trained artificial neural network (ANN) operates and merely provided a generic output. NOTE: This equivalency is for analysis purposes only. As noted, the examiner does not interpret the “one or more trained policy recommendation models” and the “policy explainability model” as machine learning. Adding insignificant extra-solution activity to the judicial exception is not indicative of integration into a practical application. See also, MPEP §2106.05(g). Mere instructions to implement an abstract idea on a computer, merely using a computer as a tool to perform an abstract idea, or an equivalent of an “apply it” rationale are not indicative of integration into a practical application. See also, MPEP §2106.05(f). Generally linking the use of the judicial exception to a particular technological environment or field of use is not indicative of integration into a practical application. See also, MPEP §2106.05(h). (5) Applicant argues differences with respect to Recentive Analytics. NOTE: Although the claims refer to “one or more trained policy recommendation models” and “policy explainability model”, applicant’s claimed invention is not directed to “machine learning” as referenced in applicant’s specification as filed 08/25/22 or discussed in the July 2024 Subject Matter Eligibility Examples. The feedback applicant references in their arguments/remarks dated 08/12/25 and 10/10/25 is not an improvement to machine learning itself, but the application of machine learning to insurance/insurance recommendations. The court in Recentive Analytics concluded, “patents that do no more than claim the application of generic machine learning to new data environments, without disclosing improvements to the machine learning models to be applied, are patent ineligible under § 101.” See pg. 18 of Recentive Analytics. See Recentive Analytics, Inc. v. Fox Corp., 
United States Court of Appeals for the Federal Circuit, 2023-2437. (6) Applicant argues Enfish and McRO. Applicant appears to suggest the claimed invention is indicative of integration into a practical application because it provides a technical improvement (McRO) and a computer-based improvement (Enfish). See pg. 3 of applicant’s arguments/remarks as filed 08/12/15. Examiner disagrees. Applicant’s claimed invention is not indicative of integration into a practical application. Applicant’s claimed invention amounts to mere instructions to implement an abstract idea on a computer or merely uses the computer as a tool to perform the abstract idea. See also, MPEP § 2106.05(f). Furthermore, the nominal or tangential recitation of technology (e.g., (a) “computer-implemented” (preamble only); (b) “computing device” limited to the receipt/transmission of data. See at least claims 2 - 7, 10 - 20 and 24) is merely adding insignificant extra-solution activity to the judicial exception. See also, MPEP § 2106.05(g). Applicant’s claimed invention generally links the use of the judicial exception to a particular technical environment or field of use. See also, MPEP § 2106.05(h). (7) Applicant argues the claimed invention is not recited at a high level of generality. With regard to preemption, the issue comes down to whether the claim is directed to an abstract idea and whether it fails the Mayo/Alice step one and step two analysis. 
In the instant case, the claims are directed to the concept of collecting by a computing device personal needs assessment (PNA) data comprising: presenting the insurance customer with a first set of questions; receiving at the computing device a first subset of the PNA data in response to the presented first set of questions; inputting the received first subset of the PNA data to one or more trained policy recommendation models, each of the trained policy recommendation models generating a policy recommendation based on a plurality of respective features in the input PNA data; determining feature importance of the one or more trained policy recommendation models providing the partial recommendation; determining a second set of questions from a plurality of PNA questions based on the determined feature importance; presenting the insurance customer with the second set of PNA questions; and receiving at the computing device a second subset of the PNA data in response to the presented second set of questions; applying at least one of the one or more trained policy recommendation models to the PNA data collected from an insurance customer comprising the first subset of PNA data and the second subset of PNA data to generate a recommendation of one or more insurance policies each of the insurance policies including a policy type and policy amount; applying the generated recommendation of the one or more insurance policies to a policy explainability model to identify one or more of the plurality of features of the respective trained policy recommendation models that led to the generated recommendation; mapping the one or more features identified by the policy explainability model to a human-understandable explanation of the policy recommendation; outputting the generated policy recommendation and the human-understandable explanation of the policy recommendation for presentation to the insurance customer; and determining a future action for contacting the insurance 
customer in the future comprising: predicting a probability that a plurality of lifestage milestones will occur within a given set of time; predicting a persona type of the insurance customer; predicting future insurance needs of the insurance customer based on the predicted probability that the plurality of lifestage milestones will occur and the predicted persona type; determining a difference between current insurance of the insurance customer and future insurance needs; and based on the determined difference, generating a contact action associated with the insurance customer to take at a future time, which is similar to other concepts (e.g., certain methods of organizing human activity, mental processes) found to be abstract ideas. The fact that the claims do not preempt all ways of collecting by a computing device personal needs assessment (PNA) data comprising: presenting the insurance customer with a first set of questions; receiving at the computing device a first subset of the PNA data in response to the presented first set of questions; inputting the received first subset of the PNA data to one or more trained policy recommendation models, each of the trained policy recommendation models generating a policy recommendation based on a plurality of respective features in the input PNA data; determining feature importance of the one or more trained policy recommendation models providing the partial recommendation; determining a second set of questions from a plurality of PNA questions based on the determined feature importance; presenting the insurance customer with the second set of PNA questions; and receiving at the computing device a second subset of the PNA data in response to the presented second set of questions; applying at least one of the one or more trained policy recommendation models to the PNA data collected from an insurance customer comprising the first subset of PNA data and the second subset of PNA data to generate a recommendation of one or more 
insurance policies each of the insurance policies including a policy type and policy amount; applying the generated recommendation of the one or more insurance policies to a policy explainability model to identify one or more of the plurality of features of the respective trained policy recommendation models that led to the generated recommendation; mapping the one or more features identified by the policy explainability model to a human-understandable explanation of the policy recommendation; outputting the generated policy recommendation and the human-understandable explanation of the policy recommendation for presentation to the insurance customer; and determining a future action for contacting the insurance customer in the future comprising: predicting a probability that a plurality of lifestage milestones will occur within a given set of time; predicting a persona type of the insurance customer; predicting future insurance needs of the insurance customer based on the predicted probability that the plurality of lifestage milestones will occur and the predicted persona type; determining a difference between current insurance of the insurance customer and future insurance needs; and based on the determined difference, generating a contact action associated with the insurance customer to take at a future time in a particular setting does not make them any less abstract. See buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355 (Fed. Cir. 2014) (collecting cases); Accenture, 728 F.3d at 1345. Therefore, based on the two-part Alice Corp. analysis, there are no meaningful limitations in the claims that transform the exception (i.e., abstract idea) into a patent eligible application. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to SARA C HAMILTON whose telephone number is (571)272-1186. The examiner can normally be reached Monday-Thursday, 8-5, EST. 
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christine Tran can be reached at 571-272-8103. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. SARA CHANDLER HAMILTON Primary Examiner Art Unit 3695 /SARA C HAMILTON/Primary Examiner, Art Unit 3695

Prosecution Timeline

Aug 25, 2022
Application Filed
Oct 30, 2023
Non-Final Rejection — §101, §112
Feb 01, 2024
Response Filed
Feb 26, 2024
Final Rejection — §101, §112
May 31, 2024
Request for Continued Examination
Jun 03, 2024
Response after Non-Final Action
Aug 12, 2024
Non-Final Rejection — §101, §112
Dec 12, 2024
Response after Non-Final Action
Dec 12, 2024
Response Filed
Jan 21, 2025
Examiner Interview (Telephonic)
Jan 21, 2025
Examiner Interview Summary
Apr 01, 2025
Response Filed
May 07, 2025
Final Rejection — §101, §112
Aug 12, 2025
Response after Non-Final Action
Oct 10, 2025
Response after Non-Final Action
Nov 10, 2025
Request for Continued Examination
Nov 18, 2025
Response after Non-Final Action
Jan 26, 2026
Non-Final Rejection — §101, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12524967
Extended Reality Methods and Systems for Processing Vehicle-Related Information
2y 5m to grant Granted Jan 13, 2026
Patent 12513143
VIRTUAL CREDENTIAL AUTHENTICATION BASED ON BROWSING CONTEXT
2y 5m to grant Granted Dec 30, 2025
Patent 12481974
GROUP DATA OBJECTS AND ASSOCIATED FUNCTIONALITY ENABLEMENT
2y 5m to grant Granted Nov 25, 2025
Patent 12469015
SYSTEMS AND METHODS FOR PRIVATE NETWORK ISSUANCE OF DIGITAL CURRENCY
2y 5m to grant Granted Nov 11, 2025
Patent 12456103
High-throughput, low-latency transaction processing
2y 5m to grant Granted Oct 28, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
64%
Grant Probability
99%
With Interview (+53.3%)
3y 9m
Median Time to Grant
High
PTA Risk
Based on 500 resolved cases by this examiner. Grant probability derived from career allow rate.
