Prosecution Insights
Last updated: April 19, 2026
Application No. 18/179,425

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM

Non-Final OA: §101, §102, §103
Filed
Mar 07, 2023
Examiner
PINSKY, DOUGLAS W
Art Unit
3626
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Fujifilm Business Innovation Corp.
OA Round
1 (Non-Final)
26%
Grant Probability
At Risk
1-2
OA Rounds
2y 12m
To Grant
41%
With Interview

Examiner Intelligence

Grants only 26% of cases
26%
Career Allow Rate
29 granted / 112 resolved
-26.1% vs TC avg
Strong +16% interview lift
Without
With
+15.5%
Interview Lift
resolved cases with interview
Typical timeline
2y 12m
Avg Prosecution
39 currently pending
Career history
151
Total Applications
across all art units

Statute-Specific Performance

§101
27.9%
-12.1% vs TC avg
§103
31.2%
-8.8% vs TC avg
§102
9.5%
-30.5% vs TC avg
§112
26.8%
-13.2% vs TC avg
Black line = Tech Center average estimate • Based on career data from 112 resolved cases
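The per-statute bars above can be cross-checked against the raw counts. A minimal sketch follows, assuming the roughly 40% Tech Center average implied by the displayed deltas (the TC figure is inferred from rate minus delta, not published separately here):

```python
# Sketch: recompute the dashboard's per-statute deltas against the
# Tech Center average. Rates are this examiner's career allowance
# rates by rejection statute; the 40.0 TC average is an assumption
# inferred from the displayed deltas (e.g., 27.9 - (-12.1) = 40.0).
examiner_rates = {"101": 27.9, "102": 9.5, "103": 31.2, "112": 26.8}
tc_average = 40.0  # assumed, not an official figure

deltas = {s: round(r - tc_average, 1) for s, r in examiner_rates.items()}
# Career allow rate: 29 granted out of 112 resolved cases
career_allow = round(29 / 112 * 100)

print(deltas)        # {'101': -12.1, '102': -30.5, '103': -8.8, '112': -13.2}
print(career_allow)  # 26
```

Each delta matches the panel, which suggests the black line sits at a single TC-wide estimate rather than per-statute averages.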

Office Action

§101 §102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Acknowledgments

The application filed on 03/07/23 is acknowledged.

Status of Claims

Claims 1-9 are pending. Claims 1-9 are rejected.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed. The following title is suggested: Predicting project failure using machine learning.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-9 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claims 1-9 are directed to an apparatus, non-transitory computer-readable medium, or method, which are/is one of the statutory categories of invention. (Step 1: YES)

Claims 1, 8 and 9 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claims recite an apparatus, non-transitory computer-readable medium, or method, for predicting a project that is to have an outcome of failure and notifying a user of the same (as per the specification, the project may be an attempted sale, see 0026 ("Here, a project means a task implemented to achieve a certain purpose by a set deadline. 
Examples of the project include a development project of developing a certain software system or products, a sales project for receiving an order for certain items, …."), 0028 ("In the present exemplary embodiment, a description will be given of, as an example, a sales project in which plural salespersons perform various activities to receive orders for certain items under a superior.")). For Claims 1, 8 and 9 (claim 1 being deemed representative), the limitations (indicated below in bold) of: extract a feature quantity regarding one or more ongoing projects; input the feature quantity extracted from the one or more ongoing projects to a prediction model, to predict a project that is to have a final outcome of failure among the one or more ongoing projects, the prediction model having been subjected to machine learning using, as teaching data, a feature quantity of a past project having a final outcome of failure and the final outcome of failure; and give a warning for the project predicted to have a final outcome of failure. as drafted, constitute a process that, under the broadest reasonable interpretation, covers "certain methods of organizing human activity," specifically, "fundamental economic practices or principles" and/or "commercial or legal interactions" and/or "managing personal behavior or relationships or interactions between people" but for recitation of generic computer components. 
The Examiner notes that "fundamental economic practices" or "fundamental economic principles" describe concepts relating to the economy and commerce, including hedging, insurance, and mitigating risks, and "commercial interactions" or "legal interactions" include agreements in the form of contracts, legal obligations, advertising, marketing or sales activities or behaviors, and business relations, MPEP 2106.04(a)(2)II.A.,B., and "managing personal behavior or relationships or interactions between people" includes social activities, teaching, and following rules or instructions, MPEP 2106.04(a)(2)II.C. If a claim limitation, under its broadest reasonable interpretation, covers "fundamental economic practices or principles" and/or "commercial or legal interactions" and/or "managing personal behavior or relationships or interactions between people," but for recitation of generic computer components, then it falls within the "certain methods of organizing human activity" grouping of abstract ideas. Accordingly, Claims 1, 8 and 9 recite an abstract idea. (Step 2A - Prong 1: YES. The claims recite an abstract idea.) This judicial exception is not integrated into a practical application. Claims 1, 8 and 9 recite the additional elements of a processor (the foregoing recited by claim 1), a non-transitory computer readable medium storing a program causing a computer to execute a process (the foregoing recited by claim 8), and machine [learning] (the foregoing recited by claims 1, 8 and 9), that implement the abstract idea. These additional elements are not described by the applicant and they are recited at a high level of generality (i.e., one or more generic computer elements performing generic computer functions), such that they amount to no more than mere instructions to apply the exception using generic computer elements. 
Accordingly, even in combination these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. (Step 2A - prong 2: NO. The additional elements do not integrate the abstract idea into a practical application.) The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception itself. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of a processor (the foregoing recited by claim 1), a non-transitory computer readable medium storing a program causing a computer to execute a process (the foregoing recited by claim 8), and machine [learning] (the foregoing recited by claims 1, 8 and 9), to perform the noted steps amount to no more than mere instructions to apply the exception using generic computer elements. Mere instructions to apply an exception using generic computer elements cannot provide an inventive concept ("significantly more"). Accordingly, even in combination, these additional elements do not provide significantly more. As such, Claims 1, 8 and 9 are not patent eligible. (Step 2B: NO. The claims do not provide significantly more.) Dependent claims 2-7 are similarly rejected because they further define/narrow the abstract idea of independent Claims 1, 8 and 9 as discussed above, and/or do not integrate the abstract idea into a practical application or provide an inventive concept such as would render the claims eligible, whether each is considered individually or as an ordered combination. 
As for further defining/narrowing the abstract idea: Dependent claim 2 merely describes performing the … learning of the prediction model using, as the teaching data, not only the feature quantity of the past project having the final outcome of failure but also a feature quantity of a past project having a final outcome of success and the final outcome of success. Dependent claim 3 merely describes storing…, in time series, measures taken to implement a project, a prediction result of the prediction model for the project that is ongoing, and a feature quantity of the project obtained when the prediction result is obtained; in a case where the prediction result stored … changes from failure to success, obtaining a feature quantity obtained when the prediction result of failure is obtained and measures taken while the prediction result changes from failure to success; inputting, to a learning model, a feature quantity extracted from the ongoing project for which the warning indicating that a final outcome is predicted to be failure has been given, to obtain a plan of additional measures for the ongoing project for which the warning has been given, the learning model having been subjected to … learning using, as teaching data, the feature quantity obtained when the prediction result of failure is obtained and the measures taken while the prediction result changes from failure to success; and presenting the obtained plan of additional measures together with the warning for the project predicted to have a final outcome of failure. Dependent claim 4 merely describes performing the … learning of the learning model using, as the teaching data, not only the feature quantity obtained when the prediction result of failure is obtained but also a feature quantity obtained when a prediction result of success is obtained before a date and time on which the prediction result of failure is obtained. 
Dependent claim 5 merely describes extracting, as a feature quantity regarding a project, at least one of information on measures taken to implement the project, information on an action performed by a user to implement the project, information on a medium that has been used, information on a user participating in the project, or information on business support software that is being used. Dependent claim 6 merely describes obtaining, from message information transmitted and received in the business support software for performing communication between a plurality of users, the information on measures taken to implement the project, the information on an action performed by a user to implement the project, or the information on a medium that has been used. Dependent claim 7 merely describes determining, based on a condition set in advance, whether a project completed in a past is a failure or a success.

As for additional elements: Claims 2 and 4 recite "wherein the processor is configured" and "machine" [learning]. This recitation is at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer element. Even in combination these additional elements do not integrate the abstract idea into a practical application and do not amount to significantly more than the abstract idea itself. Claim 3 recites "wherein the processor is configured," "machine" [learning], and "memory." This recitation is at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer element. Even in combination these additional elements do not integrate the abstract idea into a practical application and do not amount to significantly more than the abstract idea itself. Claims 5-7 recite "wherein the processor is configured." 
This recitation is at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer element. Even in combination these additional elements do not integrate the abstract idea into a practical application and do not amount to significantly more than the abstract idea itself. Therefore, dependent claims 2-7 are not patent eligible.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 2, 5 and 7-9 are rejected under 35 U.S.C. 102 as being anticipated by Hsieh et al. (U.S. Patent Application Publication No. 2023/0351283 A1), hereafter Hsieh.

Regarding Claims 1, 8 and 9

Hsieh teaches: (claim 1) An information processing apparatus (Fig. 1, 120) comprising: a processor (Fig. 4, 404) configured to: (Fig. 1, 120, Fig. 4, 404, 0048 "FIG. 3 is a flowchart, generally designated 300, illustrating the operational steps of program 122, on server 120 within distributed data processing environment 100 of FIG. 
1"; 0025 "Server 120 may include internal and external hardware components, as depicted and described in further detail in FIG. 4."; 0063 "FIG. 4 is a block diagram, generally designated 400, illustrating the components of server 120 within distributed data processing environment 100 of FIG. 1"; 0064 "Computing device 400 includes processor(s) 404") (claim 8) A non-transitory computer readable medium storing a program causing a computer to execute a process, the process comprising: (0067, 0073-0074) (claim 9) An information processing method comprising: (0035, 0048, 0073, Figs. 2, 3, methods of Figs. 2 and 3) extract a feature quantity (attribute) regarding one or more ongoing projects; (0049 "extracts one or more key attributes of the project "; for context see next two bullet points / 0047-0048) input the feature quantity extracted from the one or more ongoing projects to a prediction model, to predict a project that is to have a final outcome of failure among the one or more ongoing projects, (0048, 0055, Fig. 3, 330, for context see 0047; note the predicted "outcome" includes a "success factor" which can be success or failure (unsuccessful), as per 0021 ("compare and analyze the plurality of attributes against a database of previous successful and unsuccessful project executions. 
Leveraging input from the database of previous successful and unsuccessful project executions, embodiments of the present invention predict an outcome of the project"), 0026 (like 0021), 0038 ("The one or more key attributes are one or more factors that significantly impact the outcome of each project (i.e., lead to a successful outcome of each project or lead to an unsuccessful outcome of each project)."), and 0048 ("evaluate a plurality of attributes of a project, compare and analyze the plurality of attributes against a database of previous successful and unsuccessful project executions; predict an outcome of the project, ….")) the prediction model having been subjected to machine learning using, as teaching data, a feature quantity of a past project having a final outcome of failure and the final outcome of failure; and (0047 note "The neural network model is trained on a ground truth data set that correlates observations or inputs (e.g., one or more key attributes extracted from one or more project related documents) to outcomes (e.g., the success factor, the quality factor, and the risk factor of the project)." -- the training of the machine learning algorithm on the ground truth data set teaches that the inputs to the machine learning algorithm include not only the attributes (feature quantities) but also the correlated past outcomes, which encompass both success and failure (the final outcome of failure)) give a warning for the project predicted to have a final outcome of failure. (0059 "responsive to receiving an updated assessment of the project that shows a less favorable outcome than initially planned (i.e., an updated assessment that does not exceed a threshold of success), program 122 sends an alert notification.")

Regarding Claim 2

Hsieh teaches the limitations of base claim 1 as set forth above. 
Hsieh further teaches: wherein the processor is configured to perform the machine learning of the prediction model using, as the teaching data, not only the feature quantity of the past project having the final outcome of failure but also a feature quantity of a past project having a final outcome of success and the final outcome of success. (0047 "The one or more key attributes (feature quantity) extracted from the one or more previous project related documents are used to train the machine learning algorithm used by the digital profile to predict an outcome of a project"; 0021, 0026, 0038, 0048 as explained with respect to claim 1 above, the predicted "outcome" includes a "success factor," which encompasses both success and failure)

Regarding Claim 5

Hsieh teaches the limitations of base claim 1 as set forth above. Hsieh further teaches: wherein the processor is configured to extract, as a feature quantity regarding a project, at least one of information on measures taken to implement the project, information on an action performed by a user to implement the project, information on a medium that has been used, information on a user participating in the project, or information on business support software that is being used. (0038 "The one or more key attributes may include, but are not limited to, a set of information about the user (e.g., name of the user, industry, size, etc.), a budget for each project of the user, a timeline for each project of the user, a size of a team working on each project, one or more members of the team working on each project, a level of experience of the one or more members of the team working on each project, a location of each member of the one or more members of the team working on each project, a relationship between at least two members of the one or more members of the team working on each project, and a targeted revenue from each project.")

Regarding Claim 7

Hsieh teaches the limitations of base claim 1 as set forth above. 
Hsieh further teaches: wherein the processor is configured to determine, based on a condition set in advance, whether a project completed in a past is a failure or a success. (0018 "predict team success based on a set of baseline metrics (a condition set in advance) and machine learning techniques.")

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Hsieh et al. (U.S. Patent Application Publication No. 2023/0351283 A1), hereafter Hsieh, in view of Watanabe (U.S. Patent Application Publication No. 2021/0295211 A1).

Regarding Claim 6

Hsieh teaches the limitations of base claim 1 and intervening claim 5 as set forth above. Hsieh does not explicitly disclose but Watanabe teaches: wherein the processor (Fig. 
1, 105) is configured to obtain, from message information transmitted and received in the business support software (chat/collaboration tool) for performing communication between a plurality of users, the information on measures taken to implement the project, the information on an action performed by a user to implement the project, or the information on a medium that has been used. (0010, 0011, 0013, 0061, 0074-0077, 0099, 0102, 0118-0119 (Fig. 4), 0153-0155 (Fig. 12): system extracts information on target case (0055, e.g., an attempted sale) from message collection (e.g., Figs. 8, 11), inputs the extracted information into a machine learning model (Fig. 1, 140, 145), and predicts conclusion (success/failure) of target case based on the extracted information using machine learning; regarding message information transmitted and received in the business support software for performing communication between a plurality of users: 0056-0058, 0080, 0082, 0104, 0134; regarding the information on measures taken to implement the project, the information on an action performed by a user to implement the project, or the information on a medium that has been used: see 0133, Fig. 8, message column 810, which displays contents of messages, e.g., "I presented proposal to Mr. ABC, and then …"; 0134-0142, Fig. 10, 1000 message display screen 1000, which displays message exchange including message titles 1005, which indicate message content, e.g., "visit to customer," etc. 
as per 0136-0142) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Hsieh's systems and methods for predicting an outcome (e.g., success/failure) of a project using machine learning, by incorporating therein these teachings of Watanabe regarding using information extracted from messages to predict an outcome (e.g., success/failure) of a project using machine learning, because messages are an ordinary source of information relevant to predicting the outcome (success/failure) of a project and the proposed combination allows for easy collection and use of information from messages to predict the outcome (success/failure) of a project using machine learning, thereby rendering Hsieh more flexible/more widely applicable in terms of accommodating a greater variety of data sources for collecting and using information to extract attributes (features) therefrom for use as inputs into a machine learning algorithm to predict the outcome of a project.

Subject Matter Distinguishable From Prior Art

The cited prior art of record, either alone or in combination, fails to expressly teach or suggest the features found in dependent claims 3 and 4. In addition to the teachings set forth by the prior art applied in the rejections above:

Hsieh (US-20230351283-A1) (applied in the rejections above) also teaches, as subject matter similar to claim 3, optimizing an outcome of a project, wherein: The digital profile generates the optimization suggestion using the machine learning algorithm that was trained on the project portfolio of the user, including project related documents for one or more previous projects, one or more on-going projects, one or more upcoming projects of the user, and feedback received from the team and the clients. The machine learning algorithm starts with an initial state as defined by the project inputs entered by the user. 
The machine learning algorithm then explores different project configurations that deviate slightly from the original. The machine learning algorithm generates an outcome for each of the project configurations. The project configurations that seem promising are pursued further, while the project configurations that do not seem promising are terminated. The outcome is a set of project configurations that lead to a more desirable outcome than the project configuration originally provided by the user (i.e., a suggestion to adjust one or more factors of the project to mimic one or more key attributes that lead to a successful outcome of the project). … In an embodiment, program 122 stores the optimization suggestion in the knowledge base of the server to be used for future optimizations and to guide the exploration of promising project configurations. See 0057, Fig. 3, 340. Watanabe (US-20210295211-A1) (applied in the rejections above) also teaches, as subject matter similar to claim 3, storing, in a time series, information on measures taken to implement a project/action taken by a user to implement a project/materials presented to a customer (a medium that has been used), obtained from message information transmitted and received in a chat/collaboration tool (business support software) for performing communication between multiple users, see Figs. 8 and 10. Yan ("Sales pipeline win propensity prediction: a regression approach") teaches training and applying/using a machine learning model to predict the outcome (success/failure; win propensity) of a sales opportunity/lead (attempted sale; attempted conversion of lead to sale), where the training dataset includes historical leads, associated profile features of those leads, and labels, which are win or non-win outcome. 
Graham (US-20120072260-A1) teaches predicting the success of a project, based on answers to questions related to how a contract for a proposed project was created between a vendor and a customer, generating a warning of projects predicted to fail, and, as subject matter similar to claim 3, remedial actions to be taken, including comparing a project that has been predicted to be successful with other past projects that are similar and also were predicted to be successful, identifying changes made to those other projects which led to their success or failure, and recommending and implementing, for the current project, the changes that led to success in the other projects, if the current project has an unacceptable risk of failure (0030-0034, 0039).

Venkataraman (US-20180174066-A1) teaches predicting a success rate of a project, including initiating a natural language conversation with a stakeholder associated with a project to determine one or more features relevant for the stakeholder, creating a prediction model with relative weightage values assigned to relevant features, wherein the prediction model is trained based on historic data and predicts a success rate of the project based on the trained prediction model and current state of the project.

Kulkarni (US-20230017316-A1) teaches predicting success of a software product being developed, using a combination of machine learning models, based on project management data, where the success probability is based on a timeliness score, a quality score, and a product readiness score, and, as subject matter similar to claim 3, modifying the software product based on the success probability, and retraining the machine learning models accordingly, see Fig. 1F. 
Nikolaev (US-8626698-B1) teaches developing a model to estimate a probability of project success, including maintaining a database of historical project management performance data including i) task information associated with at least one completed task and ii) member information associated with at least one team member, forming a predictive model based on the historical project management performance data, further including determining variables of the predictive model and dependencies between the variables. Megahed (US-20170161660-A1) teaches a method comprising mapping project attributes for past projects to a first parameter set associated with a first model that models distribution of event types of project events, and a second parameter set associated with a second model that models distribution of the time intervals of project events, where machine learning is applied to a set of historical data for the past projects to obtain a first and a second set of learned weights, and further comprising predicting information relating to a next project event for an ongoing project by generating a first probability distribution for a set of possible event types for the next project event utilizing the first model, and, for each possible event type, generating a corresponding probability distribution for time intervals of occurrence of the possible event type utilizing the first model and the second model in a pipelined fashion. Siebel (US-20220405775-A1) teaches using a first trained machine learning model to predict a probability that a transaction (sale) opportunity will be successfully completed, using a second trained machine learning model to predict a probable closing date for the transaction opportunity, and determining a probability that the transaction opportunity will be successfully completed by the closing date. 
Li (US-20150025931-A1) teaches a method and apparatus to determine (a) the likelihood and timing for a sales opportunity to become a sale based on analytical models that incorporate the history of sales stage evolution and other covariates, and (b) the expected number of sales from invisible opportunities prior to a target date.

In particular, however, the cited prior art of record, either alone or in combination, fails to expressly teach or suggest all of the features in dependent claims 3 and 4 and more specifically the limitations of: store in the memory, in time series, measures taken to implement a project, a prediction result of the prediction model for the project that is ongoing, and a feature quantity of the project obtained when the prediction result is obtained; in a case where the prediction result stored in the memory changes from failure to success, obtain a feature quantity obtained when the prediction result of failure is obtained and measures taken while the prediction result changes from failure to success; and input, to a learning model, a feature quantity extracted from the ongoing project for which the warning indicating that a final outcome is predicted to be failure has been given, to obtain a plan of additional measures for the ongoing project for which the warning has been given, the learning model having been subjected to machine learning using, as teaching data, the feature quantity obtained when the prediction result of failure is obtained and the measures taken while the prediction result changes from failure to success, in combination with the other claim limitations.

Conclusion

The prior art made of record and not relied upon, as set forth in the accompanying Notice of References Cited (PTO-892), is considered pertinent to applicant's disclosure. Description of the cited prior art is provided above ("Subject Matter Distinguishable From Prior Art"). 
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DOUGLAS W PINSKY whose telephone number is (571)272-4131. The examiner can normally be reached 8:30 am - 5:30 pm ET.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jessica Lemieux, can be reached at 571-270-3445. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DWP/ Examiner, Art Unit 3626
/JESSICA LEMIEUX/ Supervisory Patent Examiner, Art Unit 3626
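For orientation, the claim 1 workflow at issue in the rejections above (extract feature quantities from ongoing projects, run them through a model trained on past-project features and final outcomes, and warn on projects predicted to fail) can be sketched roughly as follows. This is a toy illustration of the claim language only, not the applicant's disclosed implementation: the nearest-mean "model" and every name here are hypothetical stand-ins for whatever machine learning the specification contemplates.

```python
# Toy sketch of the claim 1 pipeline (hypothetical; the claim recites
# no specific model, feature set, or threshold).
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    features: list  # "feature quantities" per the claim

def train_prediction_model(past_features, past_outcomes):
    """Teaching data: feature quantities of past projects plus their
    final outcomes (claim 1 requires at least failed past projects;
    claim 2 adds successful ones). Stand-in for any ML fit step:
    here, a trivial distance-to-failure-mean classifier."""
    failed = [f for f, o in zip(past_features, past_outcomes) if o == "failure"]
    mean_failed = [sum(col) / len(col) for col in zip(*failed)]
    def predict(features):
        dist = sum((a - b) ** 2 for a, b in zip(features, mean_failed))
        return "failure" if dist < 1.0 else "success"
    return predict

def warn_on_predicted_failures(model, ongoing):
    """Give a warning for each ongoing project predicted to fail."""
    return [p.name for p in ongoing if model(p.features) == "failure"]

past = [[0.1, 0.2], [0.9, 0.8], [0.2, 0.1]]
outcomes = ["failure", "success", "failure"]
model = train_prediction_model(past, outcomes)
ongoing = [Project("deal-A", [0.15, 0.15]), Project("deal-B", [0.95, 0.9])]
print(warn_on_predicted_failures(model, ongoing))  # ['deal-A']
```

Framed this way, the §101 dispute is whether anything in the pipeline beyond the generic "processor" and "machine learning" wrappers limits the abstract idea of flagging at-risk deals.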

Prosecution Timeline

Mar 07, 2023
Application Filed
Apr 24, 2023
Response after Non-Final Action
Oct 28, 2025
Non-Final Rejection — §101, §102, §103
Jan 15, 2026
Examiner Interview Summary
Jan 15, 2026
Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12481976
ENCODED TRANSFER INSTRUMENTS
2y 5m to grant Granted Nov 25, 2025
Patent 12450588
METHOD FOR PROCESSING A SECURE FINANCIAL TRANSACTION USING A COMMERCIAL OFF-THE-SHELF OR AN INTERNET OF THINGS DEVICE
2y 5m to grant Granted Oct 21, 2025
Patent 12450591
SYSTEMS AND METHODS FOR CONTACTLESS CARD ACTIVATION VIA UNIQUE ACTIVATION CODES
2y 5m to grant Granted Oct 21, 2025
Patent 12406309
Auto Filing of Insurance Claim Via Connected Car
2y 5m to grant Granted Sep 02, 2025
Patent 12254516
NETWORK-BASED JOINT INVESTMENT PLATFORM
2y 5m to grant Granted Mar 18, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
26%
Grant Probability
41%
With Interview (+15.5%)
2y 12m
Median Time to Grant
Low
PTA Risk
Based on 112 resolved cases by this examiner. Grant probability derived from career allow rate.
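The "with interview" projection above appears to be the career allow rate plus the examiner's observed interview lift. A minimal sketch of that arithmetic (whole-percent rounding is an assumption, not stated by the panel):

```python
# Sketch: reproduce the projection panel's figures. Base grant
# probability is the career allow rate (29 of 112 resolved cases);
# adding the +15.5-point interview lift yields the "with interview"
# number. Rounding to whole percent is assumed.
base_grant = 29 / 112 * 100   # career allow rate, about 25.9%
interview_lift = 15.5         # percentage points

print(round(base_grant))                   # 26
print(round(base_grant + interview_lift))  # 41
```

Note the lift is added in percentage points, not applied multiplicatively; both readings of "+15.5%" are possible, and the additive one matches the displayed 41%.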
