Prosecution Insights
Last updated: April 19, 2026
Application No. 18/534,345

Information Technology Framework For Measuring Performance of Business Capabilities of an Enterprise

Non-Final OA: §101, §102

Filed: Dec 08, 2023
Examiner: GURSKI, AMANDA KAREN
Art Unit: 3625
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: T-Mobile Innovations LLC
OA Round: 1 (Non-Final)
Grant Probability: 32% (At Risk)
OA Rounds: 1-2
Time to Grant: 3y 7m
With Interview: 66%

Examiner Intelligence

Career Allow Rate: 32% (129 granted / 398 resolved; -19.6% vs TC avg). Grants only 32% of cases.
Interview Lift: +33.3% for resolved cases with an interview, a strong lift.
Avg Prosecution: 3y 7m typical timeline; 30 applications currently pending.
Total Applications: 428 across all art units (career history).
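For clarity, here is a minimal sketch of how these headline numbers appear to fit together. Only the counts (129 granted of 398 resolved) and the displayed percentages come from this page; the formulas, and the assumption about what "Interview Lift" is measured against, are inferences rather than documented methodology.

    # Minimal sketch (assumptions flagged) of how the examiner metrics above
    # appear to relate. Only 129/398 and the displayed percentages come from
    # the page itself; the formulas are inferred, not documented.

    def allow_rate(granted: int, resolved: int) -> float:
        """Share of an examiner's resolved applications that were granted."""
        return granted / resolved

    career = allow_rate(129, 398)  # ~0.324 -> displayed as 32%

    # Assumption: "Interview Lift" is the percentage-point gap between the
    # allow rate with an interview and a baseline rate. 0.66 matches the
    # displayed "66% With Interview" figure; whether the baseline is the
    # career rate or the no-interview cohort is not stated on the page.
    with_interview = 0.66
    lift = with_interview - career  # ~+0.336 vs. the displayed +33.3%

    print(f"career allow rate {career:.1%}, interview lift {lift:+.1%}")

The small gap between the computed +33.6 points and the displayed +33.3 points suggests the baseline is the no-interview cohort's allow rate rather than the career average, but the page does not say.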

Statute-Specific Performance

§101: 39.4% (-0.6% vs TC avg)
§103: 36.7% (-3.3% vs TC avg)
§102: 11.6% (-28.4% vs TC avg)
§112: 10.3% (-29.7% vs TC avg)
Deltas are measured against a Tech Center average estimate • Based on career data from 398 resolved cases
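A quick arithmetic check on these rows is instructive: subtracting each displayed delta from the examiner's rate yields the same 40.0% for every statute, which suggests the deltas are computed against a single TC-wide estimate rather than per-statute averages. A sketch, with the figures transcribed from the rows above (the back-solving step is my inference, not something the page states):

    # Quick check: the Tech Center average implied by each "vs TC avg" delta.
    # Rates and deltas are transcribed from the rows above; the subtraction
    # is an inference about how the deltas were constructed.
    examiner_rate = {"§101": 39.4, "§103": 36.7, "§102": 11.6, "§112": 10.3}
    delta_vs_tc   = {"§101": -0.6, "§103": -3.3, "§102": -28.4, "§112": -29.7}

    implied_tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1)
                      for s in examiner_rate}
    print(implied_tc_avg)
    # {'§101': 40.0, '§103': 40.0, '§102': 40.0, '§112': 40.0}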

Office Action

§101, §102
DETAILED ACTION

This office action is in response to the communication filed on 8 December 2023. Claims 1-20 are presented for examination.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to the judicial exception of abstract ideas without significantly more. The independent claims recite generating an initiative comprising action steps to be performed for implementing the initiative, transmitting a request for a strategic dashboard scorecard, receiving a strategic dashboard scorecard responsive to transmitting the request, wherein the strategic dashboard scorecard comprises a business capability for a key performance measure (KPM), data associated with the business capability for the KPM, and the initiative, receiving a second request for creating configuration information for the initiative, wherein the configuration information for the initiative comprises a KPM corresponding to at least one business capability, a performance measurement score for the KPM, a scoring factor applied to the initiative, and a scoring value derived for the initiative based on the scoring factor, creating the configuration information for the initiative in response to receiving the request, and transmitting the configuration information, receiving the initiative, transmitting project progress data of the initiative wherein the project progress data comprises performance data that is obtained during implementation of the initiative, obtaining a dashboard template, determining a scoring factor for obtaining a scoring value for a project process data, assigning a scoring value for the initiative based on the scoring factor, inputting the scoring value to the dashboard template to obtain the strategic dashboard scorecard, transmitting the strategic dashboard scorecard in response to the request for it, analyzing at least one of the project process data or the data associated with the business capability for the KPM, and transmitting revised action steps for a second initiative whereby the revised action steps permit the enterprise to avoid delays in implementing the second initiative. This judicial exception is not integrated into a practical application. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The eligibility analysis in support of these findings is provided below, in accordance with section 2106 of the MPEP (hereinafter, MPEP 2106).

With respect to Step 1 of the eligibility inquiry (as explained in MPEP 2106), it is noted that the systems and the method are directed to eligible categories of subject matter. Step 1 is satisfied.

With respect to Step 2A Prong One of MPEP 2106, it is next noted that the claims recite an abstract idea by reciting concepts of scorecard and KPI evaluation; these are commercial interactions, which fall into the “certain methods of organizing human activity” grouping within the enumerated groupings of abstract ideas set forth in MPEP 2106.
The claimed invention also recites an abstract idea that falls within the mental processes grouping, as the independent claims recite receiving, assigning, obtaining, and analyzing steps. The limitations reciting the abstract idea in the independent claims are: generating an initiative comprising action steps to be performed for implementing the initiative, transmitting a request for a strategic dashboard scorecard, receiving a strategic dashboard scorecard responsive to transmitting the request, wherein the strategic dashboard scorecard comprises a business capability for a key performance measure (KPM), data associated with the business capability for the KPM, and the initiative, receiving a second request for creating configuration information for the initiative, wherein the configuration information for the initiative comprises a KPM corresponding to at least one business capability, a performance measurement score for the KPM, a scoring factor applied to the initiative, and a scoring value derived for the initiative based on the scoring factor, creating the configuration information for the initiative in response to receiving the request, and transmitting the configuration information, receiving the initiative, transmitting project progress data of the initiative wherein the project progress data comprises performance data that is obtained during implementation of the initiative, obtaining a dashboard template, determining a scoring factor for obtaining a scoring value for a project process data, assigning a scoring value for the initiative based on the scoring factor, inputting the scoring value to the dashboard template to obtain the strategic dashboard scorecard, transmitting the strategic dashboard scorecard in response to the request for it, analyzing at least one of the project process data or the data associated with the business capability for the KPM, and transmitting revised action steps for a second initiative whereby the revised action steps permit the enterprise to avoid delays in implementing the second initiative.

With respect to Step 2A Prong Two of MPEP 2106, the judicial exception is not integrated into a practical application. The additional elements are directed to user equipment, a graphical user interface, an enterprise tool, servers, and a strategic performance measurement application, used to implement the abstract idea. However, these elements fail to integrate the abstract idea into a practical application because they fail to provide an improvement to the functioning of a computer or to any other technology or technical field, fail to apply the exception with a particular machine, fail to effect a transformation of a particular article to a different state or thing, and fail to apply or use the abstract idea in a meaningful way beyond generally linking the use of the judicial exception to a particular technological environment. Furthermore, these elements have been fully considered; however, they are directed to the use of generic computing elements to perform the abstract idea, which is not sufficient to amount to a practical application (as noted in MPEP 2106) and is tantamount to simply saying “apply it” using a general purpose computer. This merely serves to tie the abstract idea to a particular technological environment by using the computer as a tool to perform the abstract idea, which is not sufficient to amount to a practical application.
Accordingly, because the Step 2A Prong One and Prong Two analysis resulted in the conclusion that the claims are directed to an abstract idea, additional analysis under Step 2B of the eligibility inquiry must be conducted in order to determine whether any claim element or combination of elements amounts to significantly more than the judicial exception.

With respect to Step 2B of the eligibility inquiry, it has been determined that the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The additional limitations are directed to: user equipment, a graphical user interface, an enterprise tool, servers, and a strategic performance measurement application. These elements have been considered, but merely serve to tie the invention to a particular operating environment, though at a very high level of generality and without imposing a meaningful limitation on the scope of the claim. This does not amount to significantly more than the abstract idea, and it is not enough to transform an abstract idea into eligible subject matter. Such generic, high-level, and nominal involvement of a computer or computer-based elements for carrying out the invention merely serves to tie the abstract idea to a particular technological environment, which is not enough to render the claims patent-eligible, as noted at page 74624 of Federal Register Vol. 79, No. 241, citing Alice, which in turn cites Mayo. In addition, when taken as an ordered combination, the ordered combination adds nothing that is not already present when the elements are taken individually. There is no indication that the combination of elements integrates the abstract idea into a practical application. Their collective functions merely provide conventional computer implementation. Therefore, when viewed as a whole, these additional claim elements do not provide meaningful limitations to transform the abstract idea into a practical application, nor does the ordered combination amount to significantly more than the abstract idea itself.

The dependent claims have been fully considered as well; however, similar to the findings for the claims above, these claims are likewise directed to the abstract idea (concepts of overriding scoring values, receiving user input for selecting indicator flags when an initiative is completed, and obtaining progress data, templates, and scoring rules, by way of example) without integrating it into a practical application and with, at most, a general purpose computer that serves to tie the idea to a particular technological environment, which does not add significantly more to the claims. The ordered combination of elements in the dependent claims (including the limitations inherited from the parent claims) adds nothing that is not already present when the elements are taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation. Accordingly, the subject matter encompassed by the dependent claims fails to amount to significantly more than the abstract idea.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. P.G. Pub. 2019/0268233 (hereinafter, Singh).

Regarding claim 1, Singh teaches a system for an information technology framework for measuring performance of business capabilities for a key performance measure (KPM) of an enterprise according to a business initiative (¶ 5, “The enterprise may utilize a variety of SaaS and/or PaaS based software applications that are deployed on an instance hosted in the cloud for the enterprise. Various users (e.g., process owners, service managers, helpdesk managers, IT staff, analysts, development or project managers, management staff) associated with the enterprise may utilize the instance to access, provide or manage various services, processes, or functions related to the enterprise. The users may wish to submit various improvement initiatives (i.e., improvement processes) to continuously improve one or more of the services, processes, or functions of the enterprise. The improvement processes may be of different types (e.g., solve identified problems, close process gaps, provide coaching opportunities) and may align with different high-level goals and objectives of the enterprise (e.g., improve customer satisfaction, reduce operation cost, increase revenue).”), comprising:

a first user equipment configured to: generate, via a first graphical user interface (GUI), an initiative in an enterprise tool of an application server, wherein the initiative comprises action steps that are to be performed for implementing the initiative (¶ 51, “FIG. 6 shows an illustrative screen shot of GUI 600 for setting a monitored metric with a predetermined target in accordance with one or more embodiments. In cases where improvement of the action (i.e., service/process/function 360), that is to be improved and that is associated with the record of the CIP created at block 405, is to be tracked using a KPI or breakdown of the KPI, and analytical data of the KPI is available for consumption to integrated CIM application 315 (e.g., from PA application on client instance 310), the user may decide to set the KPI as the monitored metric and the score of the KPI as the predetermined target.”); transmit, via the first GUI, a request for a strategic dashboard scorecard to a strategic performance measurement server (¶ 53, “FIG. 7 shows a screen shot of GUI 700 illustrating embedded visualization data associated with a KPI as a monitored metric in accordance with one or more embodiments. As shown in FIG. 7, when the user specifies a KPI (e.g., Incident backlog growth) as a monitored metric, goal setting module 335 may embed real-time analytical data of, for example, a scorecard widget of the KPI in CIM form list/view of the record associated with the monitored metric set at block 410 and visualization engine 330 may present the scorecard widget of the KPI in association with the monitored metric within the form to the user.”); and receive, via the first GUI, the strategic dashboard scorecard responsive to transmitting the request, wherein the strategic dashboard scorecard comprises a business capability for a KPM, data associated with the business capability for the KPM, and the initiative (¶ 28, “The solution may leverage capabilities of other existing applications and solutions of the enterprise in managing continual improvement and drive continual improvement at all levels by aligning continual improvement processes or initiatives (CIPs), enterprise data, subject-matter experts, and enterprise objectives and goals to achieve continual improvement of any process, service, or function of the enterprise. Further, the solution may provide visibility into progress data of the CIPs via workbenches that provide simplified access to users and value realization dashboards that provide visibility into outcomes achieved through completion of tasks associated with the CIPs. Techniques disclosed herein enable a user to set measurable goals (i.e., monitored metrics) with predetermined targets for the CIPs; identify and monitor completion of one or more tasks associated with the CIPs; embed real-time analytics and continuously monitor progress data associated with the monitored metrics over time; and take course correction measures depending on the progress data. For example, an improvement KPI with a current base metric and a predetermined target metric to be achieved within a predetermined time period may be set as the monitored metric, and real-time analytical data (e.g., a scorecard widget) associated with the KPI may be embedded into the CIP so the user can track changes to the real-time scorecard of the KPI as tasks associated with the CIP are completed.”);

a second user equipment configured to: receive, via a second GUI, a request for creating configuration information for the initiative from the enterprise tool (¶ 28, “The solution may leverage capabilities of other existing applications and solutions of the enterprise in managing continual improvement and drive continual improvement at all levels by aligning continual improvement processes or initiatives (CIPs), enterprise data, subject-matter experts, and enterprise objectives and goals to achieve continual improvement of any process, service, or function of the enterprise. Further, the solution may provide visibility into progress data of the CIPs via workbenches that provide simplified access to users and value realization dashboards that provide visibility into outcomes achieved through completion of tasks associated with the CIPs. Techniques disclosed herein enable a user to set measurable goals (i.e., monitored metrics) with predetermined targets for the CIPs; identify and monitor completion of one or more tasks associated with the CIPs; embed real-time analytics and continuously monitor progress data associated with the monitored metrics over time; and take course correction measures depending on the progress data. For example, an improvement KPI with a current base metric and a predetermined target metric to be achieved within a predetermined time period may be set as the monitored metric, and real-time analytical data (e.g., a scorecard widget) associated with the KPI may be embedded into the CIP so the user can track changes to the real-time scorecard of the KPI as tasks associated with the CIP are completed.”); create, via the second GUI, the configuration information for the initiative in response to receiving the request (¶ 45, “discovery and trend finder module 325 may automatically submit requests for creation of new CIPs based on the breached thresholds without any user operation.”); and transmit, via the second GUI, the configuration information to the strategic performance measurement server (¶ 30, “MID server 107 may be configured to assist functions such as, but not necessarily limited to, discovery, orchestration, service mapping, service analytics, and event management.”);

the application server configured to: receive the initiative from the first GUI (¶ 5, “The users may wish to submit various improvement initiatives (i.e., improvement processes) to continuously improve one or more of the services, processes, or functions of the enterprise.”); and transmit project progress data of the initiative to a strategic performance measurement server, wherein the project progress data comprises performance data that is obtained during implementation of the initiative (¶ 28, “identify and monitor completion of one or more tasks associated with the CIPs; embed real-time analytics and continuously monitor progress data associated with the monitored metrics over time”); and

the strategic performance measurement server configured to: assign a scoring value to the initiative (¶ 28, “real-time analytical data (e.g., a scorecard widget) associated with the KPI may be embedded into the CIP so the user can track changes to the real-time scorecard of the KPI as tasks associated with the CIP are completed”); obtain the strategic dashboard scorecard responsive to assigning the scoring value (¶ 46, “visualization engine 330 may include logic to visualize indicators, scorecards, dashboards, workbenches, and/or widgets on a client device. Scorecards refer to a graphical visualization of the scores of an indicator”); transmit, to the first user equipment, the strategic dashboard scorecard in response to the request for the strategic dashboard scorecard (¶ 46, “In a scorecard, the scores of an indicator may be analyzed further by viewing the scores by breakdowns (scores per group), aggregates (counts, sums, and maximums), time series (totals and averages applied to different time periods) and (if available) drilling down to the records on which the scores are based.”); analyze at least one of the project progress data and the data associated with the business capability for the KPM (¶ 51, “The monitored metric may measure progress of the CIP over time based on analytical data associated with the monitored metric. The analytical data may be KPI data, survey and assessment data, or external analytical data. FIG. 6 shows an illustrative screen shot of GUI 600 for setting a monitored metric with a predetermined target in accordance with one or more embodiments. In cases where improvement of the action (i.e., service/process/function 360), that is to be improved and that is associated with the record of the CIP created at block 405, is to be tracked using a KPI or breakdown of the KPI, and analytical data of the KPI is available for consumption to integrated CIM application 315 (e.g., from PA application on client instance 310), the user may decide to set the KPI as the monitored metric and the score of the KPI as the predetermined target.”); and automatically transmit, to the enterprise tool, one or more revised action steps for a second initiative whereby the one or more revised action steps permit the enterprise to avoid delays in implementing the second initiative (¶ 49, “Exemplary integration points may include: an integration point in the Benchmarks application when a best practice recommendation to improve a KPI against the peer benchmark involves creating a new CIP (or changing an existing CIP) to improve the benchmark KPI score; an integration point in the Survey and Assessment application to create a new CIP or change an existing CIP when a customer satisfaction survey score is determined to be low; an integration point in the PA application”).

Regarding claim 2, Singh teaches the system of claim 1, wherein the strategic performance measurement server is configured to: obtain a dashboard template; determine a scoring factor for obtaining a scoring value for a project process data; and input the data into the dashboard template to obtain the strategic dashboard scorecard (¶ 46, “Dashboards may refer to a visualization (e.g., collection of lists, graphs, charts, or other content items that automatically refresh) presented to a user of client instance 310 based on CIM data. (See FIGS. 9-10 illustrating presented progress data including CIM workbench and CIM value realization dashboard. FIGS. 9-10 are explained in detail later). A dashboard may have multiple tabs to analyze and interact with visualizations of indicator scores, called widgets. Each tab of the dashboard may hold one or more widgets. A user may have one or more dashboards assigned for viewing. Widgets determine how data is presented on dashboards and are visible only when added to a dashboard. Widgets allow visualizations of multiple indicators on a single dashboard in order to visualize multiple score sources. A widget can be configured to have different visualization types to display data as a time series, score, list, or breakdown. For example, a widget can be configured as a chart, latest score, speedometer, dial, scorecard, or column.”).

Regarding claim 3, Singh teaches the system of claim 2, wherein the strategic performance measurement server is configured to: derive the scoring value from the scoring factor to obtain a derived scoring value when the initiative includes an indicator; and assign the derived scoring value to the initiative in the dashboard template (¶ 54, “The impacted KPIs may include indicators that have an impact on the primary KPI score or vice-versa. In one embodiment, goal setting module 335 may automatically set the impacted KPIs associated to the primary KPI based on data of various interdependencies and regressions between different KPIs and associated the processes, services, or functions. By associating the impacted KPIs to the primary KPI, when monitoring improvement of the monitored metric primary KPI of the CIP, the user may easily monitor impact on scores of other KPIs to ensure that improvement in one process or service related to the primary KPI does not lead to a decline in another process or service related to the impacted KPIs.”).

Regarding claim 4, Singh teaches the system of claim 2, wherein the strategic performance measurement server is configured to: override a scoring value for the initiative that is received from an administrative user associated with the second user equipment to obtain an override scoring value when the initiative does not include an indicator; and assign the override scoring value to the initiative in the dashboard template (¶ 49, “an integration point in the Survey and Assessment application to create a new CIP or change an existing CIP when a customer satisfaction survey score is determined to be low; an integration point in the PA application to create/change a CIP when a user (or a system) determines a given KPI score to be outside an acceptable range; an integration point in the PPM application to create/change a CIP responsive to an idea/demand request, an integration point in integrated CIM application 315 to create/change a CIP responsive to discovery of new CIP candidates by discovery and trend finder module 325. Alternately, the CIP record may be directly created manually by the user in integrated CIM application 315”).

Regarding claim 5, Singh teaches the system of claim 1, wherein the configuration information for the initiative comprises a KPM corresponding to at least one business capability of the enterprise, a performance measurement score for the KPM; a scoring factor applied to the initiative, and a scoring value derived for the initiative based on the scoring factor (¶ 46, “the scores of an indicator may be analyzed further by viewing the scores by breakdowns (scores per group), aggregates (counts, sums, and maximums), time series (totals and averages applied to different time periods) and (if available) drilling down to the records on which the scores are based.”).

Regarding claim 6, Singh teaches the system of claim 1, wherein the first user equipment is configured to: receive user input for selecting an indicator flag in the enterprise tool when the initiative is completed; obtain a scoring value in response to the indicator flag; and send a trigger notification message to a strategic performance measurement (SPM) application from the enterprise tool instructing the SPM application to obtain the data for the strategic dashboard scorecard (¶ 28, “For example, an improvement KPI with a current base metric and a predetermined target metric to be achieved within a predetermined time period may be set as the monitored metric, and real-time analytical data (e.g., a scorecard widget) associated with the KPI may be embedded into the CIP so the user can track changes to the real-time scorecard of the KPI as tasks associated with the CIP are completed. The user can thus view progress of the monitored metric and determine whether the CIP is on track to achieve the improvement goal.”).
Regarding claim 7, Singh teaches the system of claim 6, wherein the SPM application is configured to execute a performance measurement algorithm to obtain the data for the strategic dashboard scorecard in response to the trigger notification message (¶ 44, “KPIs or indicators (also known enterprise metrics) are a type of performance measurement used by enterprises to measure current conditions and forecast future trends. Indicators are commonly used to evaluate success or the success of a particular activity. Success may be defined as making progress toward strategic goals, or as the repeated achievement of some level of operational goal (e.g., zero defects, or 10/10 customer satisfaction). Indicators may be associated with performance improvement processes such as the CIPs of integrated CIM application 315. Scores associated with indicators are usually presented in graphs to make them easier to read and understand. Breakdowns (also known as dimensions or drill-downs) divide data of indicators in different ways. For example, incident data of a number of open incidents indicator can be divided by breakdowns including priority, category, assignment group, state or age. Client instance 310 may provide “out of the box” indicators and breakdowns that may be utilized by applications (e.g., Benchmarks application, PA application, integrated CIM application 315) deployed on client instance 310. In addition, the user of client instance 310 may create additional KPIs using existing indicators or other analytical data (e.g., survey and assessment data, external analytical data) associated with client instance 310.”).

Regarding claim 8, Singh teaches the system of claim 7, wherein the SPM application is configured to: in response to the trigger notification message: obtain project progress data for the initiative; obtain a dashboard template for one or more KPMs; obtain scoring rules for the scoring value; execute the performance measurement algorithm to obtain performance measurement scores of the one or more KPMs; and input the performance measurement scores into a dashboard template (¶ 53, “FIG. 7 shows a screen shot of GUI 700 illustrating embedded visualization data associated with a KPI as a monitored metric in accordance with one or more embodiments. As shown in FIG. 7, when the user specifies a KPI (e.g., Incident backlog growth) as a monitored metric, goal setting module 335 may embed real-time analytical data of, for example, a scorecard widget of the KPI in CIM form list/view of the record associated with the monitored metric set at block 410 and visualization engine 330 may present the scorecard widget of the KPI in association with the monitored metric within the form to the user.”) (¶ 54, “The impacted KPIs may include indicators that have an impact on the primary KPI score or vice-versa. In one embodiment, goal setting module 335 may automatically set the impacted KPIs associated to the primary KPI based on data of various interdependencies and regressions between different KPIs and associated the processes, services, or functions. By associating the impacted KPIs to the primary KPI, when monitoring improvement of the monitored metric primary KPI of the CIP, the user may easily monitor impact on scores of other KPIs to ensure that improvement in one process or service related to the primary KPI does not lead to a decline in another process or service related to the impacted KPIs.”).

Regarding claims 9 and 17, the claims recite substantially similar limitations to claim 1. Therefore, claims 9 and 17 are similarly rejected for the reasons set forth above with respect to claim 1.

Regarding claim 10, the claim recites substantially similar limitations to claim 2. Therefore, claim 10 is similarly rejected for the reasons set forth above with respect to claim 2.

Regarding claims 11 and 18, the claims recite substantially similar limitations to claim 3. Therefore, claims 11 and 18 are similarly rejected for the reasons set forth above with respect to claim 3.

Regarding claims 12 and 19, the claims recite substantially similar limitations to claim 4. Therefore, claims 12 and 19 are similarly rejected for the reasons set forth above with respect to claim 4.

Regarding claim 13, the claim recites substantially similar limitations to claim 5. Therefore, claim 13 is similarly rejected for the reasons set forth above with respect to claim 5.

Regarding claims 14 and 20, the claims recite substantially similar limitations to claim 6. Therefore, claims 14 and 20 are similarly rejected for the reasons set forth above with respect to claim 6.

Regarding claim 15, the claim recites substantially similar limitations to claim 7. Therefore, claim 15 is similarly rejected for the reasons set forth above with respect to claim 7.

Regarding claim 16, the claim recites substantially similar limitations to claim 8. Therefore, claim 16 is similarly rejected for the reasons set forth above with respect to claim 8.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMANDA GURSKI, whose telephone number is (571) 270-5961. The examiner can normally be reached Monday to Thursday, 7am to 5pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Brian Epstein, can be reached at 571-270-5389. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AMANDA GURSKI/
Primary Examiner, Art Unit 3625
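When checking the examiner's element-by-element mapping against Singh, it can help to restate the claim 1 architecture as a compact data model. The sketch below is purely illustrative: every class, field, and method name is hypothetical shorthand for a claim element, and the aggregation rule is assumed; the record discloses no actual source code.

    # Hypothetical restatement of the claim 1 system, for claim-mapping only.
    # All names are invented shorthand for claim elements; nothing here comes
    # from the application's actual specification or source code.
    from dataclasses import dataclass

    @dataclass
    class Initiative:
        # "action steps that are to be performed for implementing the initiative"
        action_steps: list[str]

    @dataclass
    class ConfigurationInfo:
        # claim 5: a KPM tied to at least one business capability, a performance
        # measurement score, a scoring factor, and a derived scoring value
        kpm: str
        performance_score: float
        scoring_factor: float
        scoring_value: float

    @dataclass
    class StrategicDashboardScorecard:
        # "a business capability for a KPM, data associated with the business
        # capability for the KPM, and the initiative"
        business_capability: str
        kpm_data: dict[str, float]
        initiative: Initiative

    class StrategicPerformanceMeasurementServer:
        """Claimed server-side behavior: score the initiative, then build the
        scorecard from a dashboard template."""

        def __init__(self, dashboard_template: dict[str, float]) -> None:
            self.template = dashboard_template

        def assign_scoring_value(self, progress: dict[str, float],
                                 scoring_factor: float) -> float:
            # "assign a scoring value for the initiative based on the scoring
            # factor"; the weighted sum is an assumed aggregation rule
            return scoring_factor * sum(progress.values())

        def build_scorecard(self, initiative: Initiative, capability: str,
                            score: float) -> StrategicDashboardScorecard:
            # "inputting the scoring value to the dashboard template to obtain
            # the strategic dashboard scorecard"
            return StrategicDashboardScorecard(
                business_capability=capability,
                kpm_data={**self.template, "scoring_value": score},
                initiative=initiative,
            )

Each class corresponds to a claim element that the Office Action reads onto a quoted Singh paragraph (for example, ConfigurationInfo onto ¶ 45 and ¶ 30), which makes it easier to probe the anticipation mapping element by element.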

Prosecution Timeline

Dec 08, 2023 — Application Filed
Feb 24, 2026 — Non-Final Rejection, §101, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596982 — SUSTAINABILITY RECOMMENDATIONS FOR HYDROCARBON OPERATIONS — granted Apr 07, 2026 (2y 5m to grant)
Patent 12572865 — Automatic and Dynamic Adaptation of Hierarchical Reconciliation for Time Series Forecasting — granted Mar 10, 2026 (2y 5m to grant)
Patent 12541734 — SYSTEMS AND METHODS FOR BOOTSTRAP SCHEDULING — granted Feb 03, 2026 (2y 5m to grant)
Patent 12481963 — PROACTIVE SCHEDULING OF SHARED RESOURCES OR RESPONSIBILITIES — granted Nov 25, 2025 (2y 5m to grant)
Patent 12387284 — UTILIZING DIGITAL SIGNALS TO INTELLIGENTLY MONITOR CLIENT DEVICE TRANSIT PROGRESS AND GENERATE DYNAMIC PUBLIC TRANSIT INTERFACES — granted Aug 12, 2025 (2y 5m to grant)

Study what changed to get past this examiner. Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 32%
With Interview: 66% (+33.3%)
Median Time to Grant: 3y 7m
PTA Risk: Low
Based on 398 resolved cases by this examiner. Grant probability derived from career allow rate.

Free tier: 3 strategy analyses per month