Prosecution Insights
Last updated: April 19, 2026
Application No. 17/859,333

DATA RELIABILITY INDEX
Status: Non-Final OA (§101, §102)

Filed: Jul 07, 2022
Examiner: SULLIVAN, JESSICA E
Art Unit: 3627
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: SAP SE
OA Round: 5 (Non-Final)

Grant Probability: 15% (At Risk)
Projected OA Rounds: 5-6
Time to Grant: 3y 7m
Grant Probability With Interview: 36%

Examiner Intelligence

Career Allow Rate: 15% (16 granted / 108 resolved; -37.2% vs TC avg)
Interview Lift: +21.4% among resolved cases with interview (strong lift)
Avg Prosecution: 3y 7m (typical timeline; 29 currently pending)
Total Applications: 137 (career history, across all art units)
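The allow-rate and interview-lift figures above follow directly from the resolved-case counts. A minimal sketch of the arithmetic, for illustration only; the with/without-interview split below is hypothetical, since the report states only the combined +21.4% lift:

```python
# Career allow rate from the counts stated in the report.
granted, resolved = 16, 108
allow_rate = granted / resolved
print(f"Career Allow Rate: {allow_rate:.1%}")  # 14.8%, displayed as ~15%

# Interview lift = allowance rate among resolved cases with an examiner
# interview minus the rate among those without one. The split below is
# hypothetical (chosen only to sum to 16 granted / 108 resolved).
granted_with, resolved_with = 10, 30       # hypothetical
granted_without, resolved_without = 6, 78  # hypothetical
lift = granted_with / resolved_with - granted_without / resolved_without
print(f"Interview Lift: {lift:+.1%}")
```

The sketch only shows where the two headline percentages come from; the dashboard's actual with/without split is not published in this report.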

Statute-Specific Performance

§101: 30.7% (-9.3% vs TC avg)
§103: 40.3% (+0.3% vs TC avg)
§102: 21.9% (-18.1% vs TC avg)
§112: 4.6% (-35.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 108 resolved cases.
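Each "vs TC avg" delta is the examiner's statute-specific rate minus the estimated Tech Center average. Working backward from the report's own numbers, all four deltas imply a TC average of about 40.0%; that baseline is inferred here, not stated in the report. A minimal sketch of the check:

```python
# Statute-specific examiner rates and reported deltas, in percent.
examiner = {"§101": 30.7, "§103": 40.3, "§102": 21.9, "§112": 4.6}
reported_delta = {"§101": -9.3, "§103": +0.3, "§102": -18.1, "§112": -35.4}

for statute, rate in examiner.items():
    # Implied Tech Center average: examiner rate minus reported delta.
    tc_avg = rate - reported_delta[statute]
    print(f"{statute}: {rate:.1f}% examiner vs {tc_avg:.1f}% TC avg "
          f"({rate - tc_avg:+.1f}%)")
```

Running this reproduces each parenthetical delta and shows the implied ~40.0% baseline for every statute.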

Office Action

§101 §102
DETAILED ACTION

This Non-Final Office action is in response to the claims filed on 01/26/2026. Claims 1-20 are pending. The effective filing date of the claimed invention is 07/07/2022.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/26/2026 has been entered.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 - Claims 1-7 are directed to a system, which is a machine that is deemed appropriate subject matter for a patent. Claims 8-14 are directed to a method, which is a process that is deemed appropriate subject matter for a patent. Claims 15-20 are directed to a non-transient, computer-readable medium storing instructions, which is an article of manufacture that is deemed appropriate subject matter for a patent. Accordingly, claims 1-20 pass Step 1.
Step 2A, Prong 1 - The independent claims 1, 8 and 15 recite the following abstract ideas: receive, via a graphical user interface (UI) (additional element), a specification that identifies one or more influence factors related to a set of reporting data, the set of reporting data including at least one reporting attribute (receiving information found within a specification is an action that may be accomplished by a human, mentally or with pen and paper; See MPEP 2106.04(a)(2)(III)(C)(2) Performing a mental process in a computer environment. An example of a case identifying a mental process performed in a computer environment as an abstract idea is Symantec Corp., 838 F.3d at 1316-18, 120 USPQ2d at 1360. In this case, the Federal Circuit relied upon the specification when explaining that the claimed electronic post office, which recited limitations describing how the system would receive, screen and distribute email on a computer network, was analogous to how a person decides whether to read or dispose of a particular piece of mail and that “with the exception of generic computer-implemented steps, there is nothing in the claims themselves that foreclose them from being performed by a human, mentally or with pen and paper”. 838 F.3d at 1318, 120 USPQ2d at 1360.
(emphasis added)) and the specification identifying the at least one reporting attribute of the set of reporting data particularly associated with the one or more influence factors (identifying attributes is a method of making a determination, which is grouped as a mental process under MPEP 2106.04(a)(2)); receive an input to designate a weight factor to each of the one or more influence factors for each of the at least one reporting attribute of the set of reporting data (receiving information that would be required to complete an equation is an action that can be performed by a human, whether mentally or with paper and pen; See MPEP 2106.04(a)(2)(III)(C)(2) Performing a mental process in a computer environment. An example of a case identifying a mental process performed in a computer environment as an abstract idea is Symantec Corp., 838 F.3d at 1316-18, 120 USPQ2d at 1360. In this case, the Federal Circuit relied upon the specification when explaining that the claimed electronic post office, which recited limitations describing how the system would receive, screen and distribute email on a computer network, was analogous to how a person decides whether to read or dispose of a particular piece of mail and that “with the exception of generic computer-implemented steps, there is nothing in the claims themselves that foreclose them from being performed by a human, mentally or with pen and paper”. 838 F.3d at 1318, 120 USPQ2d at 1360. (emphasis added)); generate, based on the specification of the one or more influence factors and the weight factor designated to each of the one or more influence factors, a reliability index indicative of a data reliability for the at least one reporting attribute of the set of reporting data (generating a reliability index is done by using information to create a value that would designate importance, which is a mathematical concept; See MPEP 2106.04(a)(2)(I)(A) iv.
organizing information and manipulating information through mathematical correlations, Digitech Image Techs., LLC v. Electronics for Imaging, Inc., 758 F.3d 1344, 1350, 111 USPQ2d 1717, 1721 (Fed. Cir. 2014). The patentee in Digitech claimed methods of generating first and second data by taking existing information, manipulating the data using mathematical functions, and organizing this information into a new form. The court explained that such claims were directed to an abstract idea because they described a process of organizing information through mathematical correlations, like Flook's method of calculating using a mathematical formula. 758 F.3d at 1350, 111 USPQ2d at 1721); store a record of the generated reliability index in a data storage device (storing received information is an action that humans perform in their minds; under MPEP 2106.04(a)(2)(III)(C), when a mental process is performed on a computer, it remains an abstract idea because the limitation is a concept that is performed in the human mind, and the claims are drafted so the concept is performed on a computer, and only as a tool to perform the mental process); generate a visualization including a representation of a combination of the set of reporting data and the generated reliability index associated therewith, the generated reliability index being spatially associated with the representation of the value for the at least one reporting attribute of the set of reporting data within the generated visualization (generating a visualization consists of obtaining data, creating an index and values and spatially placing them to graphically indicate their relation, which is an action that a person may perform when comparing financial data or performance metrics within a business organization, and is therefore grouped as a business relation, under the heading of a certain method of organizing human activity; See MPEP 2106.04(a)(2)(II)(B) the business relation at issue in Credit Acceptance is the relationship
between a customer and dealer when processing a credit application to purchase a vehicle. The patentee claimed a "system for maintaining a database of information about the items in a dealer’s inventory, obtaining financial information about a customer from a user, combining these two sources of information to create a financing package for each of the inventoried items, and presenting the financing packages to the user." 859 F.3d at 1054, 123 USPQ2d at 1108. The Federal Circuit described the claims as directed to the abstract idea of "processing an application for financing a loan" and found "no meaningful distinction between this type of financial industry practice" and the concept of intermediated settlement in Alice or the hedging concept in Bilski. 859 F.3d at 1054, 123 USPQ2d at 1108); receive, via a transmission from a remote alert system monitoring the generated reliability index, an alert in response to the value for the at least one reporting attribute of the set of reporting data associated with the generated reliability index deviating in excess of a predefined threshold value from at least one of a set value and a range of values (receiving an alert, which is in response to a deviation in business practice, is a manner in which someone may manage business relations within a company, and is therefore grouped as a certain method of organizing human activity; See MPEP 2106.04(a)(2)(II)(B) the business relation at issue in Credit Acceptance is the relationship between a customer and dealer when processing a credit application to purchase a vehicle. The patentee claimed a "system for maintaining a database of information about the items in a dealer’s inventory, obtaining financial information about a customer from a user, combining these two sources of information to create a financing package for each of the inventoried items, and presenting the financing packages to the user." 859 F.3d at 1054, 123 USPQ2d at 1108.
The Federal Circuit described the claims as directed to the abstract idea of "processing an application for financing a loan" and found "no meaningful distinction between this type of financial industry practice" and the concept of intermediated settlement in Alice or the hedging concept in Bilski. 859 F.3d at 1054, 123 USPQ2d at 1108)); receive, via a user interface, an indication of a selection of the generated reliability index included in association with the set of reporting data including the at least one reporting attribute of the set of reporting data within a presentation of the generated visualization (receiving information, such as selection of an index presented, is an action that may be accomplished in the human mind, because a person may receive another party's selection without the use of a computer; See MPEP 2106.04(a)(2)(III)(C)(2) Performing a mental process in a computer environment. An example of a case identifying a mental process performed in a computer environment as an abstract idea is Symantec Corp., 838 F.3d at 1316-18, 120 USPQ2d at 1360. In this case, the Federal Circuit relied upon the specification when explaining that the claimed electronic post office, which recited limitations describing how the system would receive, screen and distribute email on a computer network, was analogous to how a person decides whether to read or dispose of a particular piece of mail and that “with the exception of generic computer-implemented steps, there is nothing in the claims themselves that foreclose them from being performed by a human, mentally or with pen and paper”. 838 F.3d at 1318, 120 USPQ2d at 1360.
(emphasis added)); navigate, in response to receiving the indication of the selection of the generated reliability index within the generated visualization, to at least one of an application and a service that provides an explanation of the generation of the reliability index (navigating between an application and a service is the action of relating between two parties; this is an action of following rules, the rule being how to navigate, see MPEP 2106.04(a)(2)(II)(C) ii. a series of instructions of how to hedge risk, Bilski v. Kappos, 561 U.S. 593, 595, 95 USPQ2d 1001, 1004 (2010)); and present, in a second user interface, a message from the at least one of the application and the service that provides the explanation of the generation of the reliability index including a description of the one or more influence factors used to generate the reliability index when the generated reliability index deviates in excess of the predefined threshold value from at least one of a set value and a range of values (presenting a message to a user with descriptive information about what is used to generate the reliability index is an action that can be accomplished by a business, because a business needs to let users know how the business is being operated; See MPEP 2106.04(a)(2)(II)(B) the business relation at issue in Credit Acceptance is the relationship between a customer and dealer when processing a credit application to purchase a vehicle. The patentee claimed a "system for maintaining a database of information about the items in a dealer’s inventory, obtaining financial information about a customer from a user, combining these two sources of information to create a financing package for each of the inventoried items, and presenting the financing packages to the user." 859 F.3d at 1054, 123 USPQ2d at 1108.
The Federal Circuit described the claims as directed to the abstract idea of "processing an application for financing a loan" and found "no meaningful distinction between this type of financial industry practice" and the concept of intermediated settlement in Alice or the hedging concept in Bilski. 859 F.3d at 1054, 123 USPQ2d at 1108)). The analysis of data for use within a business is considered a part of business relations, and therefore the claims are directed to the enumerated grouping of economic principles, under the umbrella category of a certain method of organizing human activity. When viewed alone and in ordered combination, these abstract idea limitations of claims 1, 8 and 15 recite an abstract idea.

Step 2A, Prong 2 - The additional elements include a memory, a processor, a user interface, a second user interface, a non-transient, computer-readable medium and a data storage device. This judicial exception is not integrated into a practical application because the claims fail to transform the nature of the claims from the abstract into a patent eligible application. MPEP 2106.05(a) describes ways in which additional elements improve the functioning of a computer. See MPEP 2106.05(a)(I): “Similarly, a claimed process covering embodiments that can be performed on a computer, as well as embodiments that can be practiced verbally or with a telephone, cannot improve computer technology. See RecogniCorp, LLC v. Nintendo Co., 855 F.3d 1322, 1328, 122 USPQ2d 1377, 1381 (Fed. Cir. 2017) (process for encoding/decoding facial data using image codes assigned to particular facial features held ineligible because the process did not require a computer)”.
The memory, processors and storage devices are used to store the instructions to implement the abstract idea, but the claims are directed to sending and receiving information using the processor, not to ways in which the method would improve the functionality of the computer by using a connection that differs from standard sending and receiving. The claims describe receiving information, using weighting factors to make a decision and storing the generated information, but the claims fail to describe how the computer would store, generate or retrieve information in a way that would improve the functionality of the computer, beyond the process being faster by its application on a computer. The user interface and second user interface are both tools to display and receive the information, and to implement the abstract idea of presenting and navigating, and therefore under MPEP 2106.05(f) are not more than a tool. Additionally, MPEP 2106.05(f) describes when the claim recites mere instructions to apply an exception, in which the claims recite only the idea of a solution. The solution is application on a computer, but the claims do not recite a problem or solution that is more than an improvement to an analysis of information beyond its application on a computer. Therefore, the claims fail to be more than application on a computer. See MPEP 2106.05(f), mere instructions to apply the abstract idea. A network and a network interface are two elements that are used to send and receive the data between users and the company, and do no more than moderate the information. The network is being used as a tool to exchange information, and when additional elements are used in their ordinary capacity, such as using a network to exchange information, courts have found that the additional element did not add more to the abstract idea.
See also MPEP 2106.05(d)(II): The courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity. i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); but see DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245, 1258, 113 USPQ2d 1097, 1106 (Fed. Cir. 2014) (“Unlike the claims in Ultramercial, the claims at issue here specify how interactions with the Internet are manipulated to yield a desired result--a result that overrides the routine and conventional sequence of events ordinarily triggered by the click of a hyperlink.” (emphasis added)). When these additional elements are viewed alone and in ordered combination, the examiner finds that claims 1, 8 and 15 are directed to an abstract idea.

Step 2B - Claims 1, 8 and 15 do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the computer elements, either alone or in combination, fail to provide more than a tool to execute the abstract idea, and do not provide significantly more than the abstract idea of analyzing business information for economic reasons. The additional elements are presented as a means to implement the abstract idea.
Overall, the abstract idea is to analyze, receive and generate information about business data. All of the actions completed by the computer elements are actions that may also be accomplished mentally for the purpose of business relations. When the additional elements are configured to perform the abstract idea, they are simply being used as tools to implement the abstract idea (MPEP 2106.05(f)). Additionally, as a whole the claims do not provide a technical solution to a technical problem, but rather use the computer elements to make the entire process faster. Without details on how the abstract actions would affect the functionality of a computer, the claims fail to show that there would be a technical improvement under MPEP 2106.05(a).

Dependent Claims - Dependent claims 2-3, 9-10 and 16-17 describe what information is being formatted, and labels for the information. Formatting attributes to aid the process of analyzing data is a way to make the mathematical processing of the information easier, and is grouped within a mathematical process. The actions are being performed on a processor. The type of information that is being used does not alter how the computer would function, or provide more than the abstract idea of analysis of data. MPEP 2106.05(a). Dependent claims 4, 11 and 18 describe updating information based on an input from the user to trigger an association. Regeneration of data based on new factors is how a user would analyze new business relations, and therefore is considered a certain method of organizing human activity. Performance on a processor does not alone integrate the abstract idea into a practical application. The creation of a reliability index is the manipulation of information, but the information is not used to improve computer functionality, and therefore does not provide more than the abstract idea. MPEP 2106.05(a). Dependent claims 5, 12 and 19 describe additional receiving, retrieving and storing of information.
The exchange of information is something that may happen in the human mind, and therefore is grouped as a mental process. The actions taking place on a processor do not provide more than the abstract idea, since the exchange of information on a computer does no more than use the computer as an instrument to apply the abstract idea. MPEP 2106.05(f). Dependent claims 6-7, 13-14 and 20 describe using the information to create a visual representation of the information. Generating a visualization consists of obtaining data, creating an index and values and spatially placing them to graphically indicate their relation, which is an action that a person may perform when comparing financial data or performance metrics within a business organization, and is therefore grouped as a business relation, under the heading of a certain method of organizing human activity. A visual representation of information does not provide an improvement to the functionality of a computer, or provide more than the use of a computer to implement the visualization, and therefore does not provide more than the abstract idea of presenting information. MPEP 2106.05(f).

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US 2019/0362397 A1 to Psota et al. ("Psota").

Regarding claim 1, Psota teaches a system comprising: a memory storing processor-executable program code (Psota [0186] the records may be stored in a computer memory; [0530] processor stores the particular application on an internal or external memory, Claim 1); and a processor to execute the processor-executable program code (Psota [0530] processor of the computer system) to cause the system to: receive, via a graphical user interface (UI) (Psota [0081] Fig. 12 depicts a user interface), a specification that identifies one or more influence factors related to a set of reporting data, the set of reporting data including at least one reporting attribute (Psota [0025] method and system to collect and store a plurality of data that is aggregated and analyzed, the analysis is accomplished using prioritization factors chosen by the user, with a specific rating system) and the specification identifying the at least one reporting attribute of the set of reporting data particularly associated with the one or more influence factors (Psota [0153] the attributes of records and the value determined based on the attributes is assigned to the specific record, and stored and associated); receive, in the UI (Psota [0081] Fig. 12 depicts a user interface), an input to designate a weight factor to each of the one or more influence factors for each of the at least one reporting attribute of the set of reporting data (Psota [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight; [0171] a user may prioritize, and therefore give more weight, to specific factors); generate, based on the specification of the one or more influence factors and the weight factor designated to each of the one or more influence factors, a reliability index indicative of a data reliability for the at least one reporting attribute of the set of reporting data (Psota [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight, the rating score is the reliability index, because it uses weighted factors to create a final score; [0171] a user may prioritize, and therefore give more weight, to specific factors); store a record of the generated reliability index in a data storage device (Psota [0025] the information obtained and analyzed is collected and stored); generate, in the UI (Psota [0081] Fig. 12 depicts a user interface), a visualization including a representation of a combination of the set of reporting data and the generated reliability index associated therewith, the generated reliability index being spatially associated with the representation of the value for the at least one reporting attribute of the set of reporting data within the generated visualization (Psota [0498] the information can be displayed in a visualization that can use a map which associates the information spatially with its location); receive, via a transmission from a remote alert system monitoring the generated reliability index, an alert in response to the value for the at least one reporting attribute of the set of reporting data associated with the generated reliability index deviating in excess of a predefined threshold value from at least one of a set value and a range of values (Psota [0331] an alert may be sent based on a change in data; [0025] the information obtained and analyzed is collected and stored; [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight, the rating score is the reliability index, because it uses weighted factors to create a final score; the alert may be a negative alert, which would only be triggered if the attribute it is monitoring is outside of the threshold value, and an attribute that is monitored can be reliability by calculations from the database; [0171] a user may prioritize, and therefore give more weight, to specific factors); receive, via the UI (Psota [0081] Fig. 12 depicts a user interface), an indication of a selection of the generated reliability index included in association with the set of reporting data including the at least one reporting attribute of the set of reporting data within a presentation of the generated visualization (Psota [0139-141] the entity score is determined based on the selected transactional data, or may be based on a set rule to make the determination, the entity score is presented to users, and the user may choose to select different factors to change the entity score originally presented); navigate, in response to receiving via the UI the indication of the selection of the generated reliability index within the generated visualization (Psota [0081] Fig. 12 depicts a user interface), to at least one of an application and a service that provides an explanation of the generation of the reliability index (Psota [0141-142] the user interface is used to present the user ratings, scores and the factors used to make the ratings and score.
The user selects the factors, which is the user indicating selection of factors to create the reliability index, making the user informed about how the reliability index is created); and present, in a second user interface, a message from the at least one of the application and the service that provides the explanation of the generation of the reliability index including a description of the one or more influence factors used to generate the reliability index when the generated reliability index deviates in excess of the predefined threshold value from at least one of a set value and a range of values (Psota [0139-142] the entity score is determined based on the selected transactional data, or may be based on a set rule to make the determination, the entity score is presented to users, and the user may choose to select different factors to change the entity score originally presented; in order for a user to change or select the factors, the details of how that particular entity score is calculated are presented to the user, and the user may change the factors).

Regarding claim 2, Psota teaches the system of claim 1, wherein the generated reliability index is formatted as at least one of a percentage value, a color-coded index, a ranked value, a positive or negative index, and combinations thereof (Psota [0266] the rating may be in the form of a value, integer, percentage, and then the rating may be used to rank).

Regarding claim 3, Psota teaches the system of claim 1, further comprising the processor to execute the processor-executable program code to cause the system to: designate the one or more influence factors to specific attributes of the at least one reporting attribute of the set of reporting data (Psota [0141] the user interface is able to select different factors for specific attributes).
Regarding claim 4, Psota teaches the system of claim 1, wherein the reliability index for the at least one reporting attribute of the set of reporting data is regenerated to update the reliability index, the reliability index being updated in response to at least one of a manual input and a trigger from an autonomous system associated with the set of reporting data (Psota [0194] continuous updating is accomplished, since new data is regularly added, and may be automatic, or in batches). Regarding claim 5, Psota teaches the system of claim 1, further comprising the processor to execute the processor-executable program code to cause the system to: receive a record of the generated reliability index for the set of reporting data (Psota [0025] the information obtained and analyzed is collected and stored; [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight, the rating score is the reliability index, because it uses weighted factors to create a final score; [0171] a user may prioritize, and therefore give more weight, to specific factors); retrieve a record including a value for at least one reporting attribute of the set of reporting data (Psota [0025] method and system to collect and store a plurality of data that is aggregated and analyzed, the analysis is accomplished using prioritization factors chosen by the user, with a specific rating system); associate the value for the at least one reporting attribute for the set of reporting data and the generated reliability index corresponding to the at least one reporting attribute for the set of reporting data (Psota [0153] the attributes of records and the value determined based on the attributes is assigned to the specific record, and stored and associated); and store a record of the value for the at least one reporting attribute for the set of reporting data associated with the generated 
reliability index corresponding to the at least one reporting attribute of the set of reporting data (Psota [0025]; [0141]; [0153] the information received from the individual reports are assigned weighted attributes, and those attributes are used to generate a value based on important attributes assigned by the user, and that value is associated with a specific record which is stored in a database, and may be retrieved and later referenced; [0171] a user may prioritize, and therefore give more weight, to specific factors). Regarding claim 6, Psota teaches the system of claim 5 further comprising the processor to execute the processor-executable program code to cause the system to: generate a visualization including a representation of the value for the at least one reporting attribute for the set of reporting data and the generated reliability index corresponding to the at least one reporting attribute associated therewith (Psota [0498] the information can be displayed in a visualization, that can use a map which associates the information spatially with its location). Regarding claim 7, Psota teaches the system of claim 6, wherein the generated reliability index corresponding to the at least one reporting attribute of the set of reporting data is, within the generated visualization, spatially associated with the representation of the value for the at least one reporting attribute of the set of reporting data (Psota [0498] the information can be displayed in a visualization, that can use a map which associates the information spatially with its location). Regarding claim 8, Psota teaches a computer-implemented method (Psota [0186] the records may be stored in a computer memory; [0530] processor stores the particular application on an internal or external memory, Claim 1), the method comprising: receiving, by a processor (Psota [0530] processor of the computer system), via a graphical user interface (UI) (Psota [0081] Fig. 
12, depicts a user interface), a specification that identifies one or more influence factors related to a set of reporting data, the set of reporting data including at least one reporting attribute (Psota [0025] method and system to collect and store a plurality of data that is aggregated and analyzed, the analysis is accomplished using prioritization factors chosen by the user, with a specific rating system) and the specification identifying the at least one reporting attribute of the set of reporting data particularly associated with the one or more influence factors (Psota [0153] the attributes of records and the value determined based on the attributes is assigned to the specific record, and stored and associated); receiving, via the UI (Psota [0081] Fig. 12, depicts a user interface), an input to designate, by the processor via the graphical user interface, a weight factor to each of the one or more influence factors for each of the at least one reporting attribute of the set of reporting data (Psota [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight; [0171] a user may prioritize, and therefore give more weight, to specific factors); generating, by the processor based on the specification of the one or more influence factors and the weight factor designated to each of the one or more influence factors, a reliability index indicative of a data reliability for the at least one reporting attribute of the set of reporting data (Psota [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight, the rating score is the reliability index, because it uses weighted factors to create a final score; [0171] a user may prioritize, and therefore give more weight, to specific factors); storing, by the processor, a record of the generated 
reliability index in a data storage device (Psota [0025] the information obtained and analyzed is collected and stored); generating, in the UI (Psota [0081] Fig. 12, depicts a user interface), a visualization including a representation of a combination of the set of reporting data and the generated reliability index associated therewith, the generated reliability index being spatially associated with the representation of the value for the at least one reporting attribute of the set of reporting data within the generated visualization (Psota [0498] the information can be displayed in a visualization, that can use a map which associates the information spatially with its location); receiving, via a transmission from a remote alert system monitoring the generated reliability index, an alert in response to the value for the at least one reporting attribute of the set of reporting data associated with the generated reliability index deviating in excess of a predefined threshold value from at least one of a set value and a range of values (Psota [0331] an alert may be sent based on a change in data; [0025] the information obtained and analyzed is collected and stored; [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight, the rating score is the reliability index, because it uses weighted factors to create a final score; the alert may be a negative alert, which would only be triggered if the attribute it is monitoring is outside of the threshold value, and an attribute that is monitored can be the reliability calculated from the database; [0171] a user may prioritize, and therefore give more weight, to specific factors); receiving, via a UI (Psota [0081] Fig. 
12, depicts a user interface), an indication of a selection of the generated reliability index included in association with the set of reporting data including the at least one reporting attribute within a presentation of the generated visualization (Psota [0139-141] the entity score is determined based on the selected transactional data, or may be based on a set rule to make the determination, the entity score is presented to users, and the user may choose to select different factors to change the entity score originally presented); and navigate, in response to receiving, via the UI (Psota [0081] Fig. 12, depicts a user interface), the indication of the selection of the generated reliability index within the generated visualization, to at least one of an application and a service that provides an explanation of the generation of the reliability index (Psota [0141-142] the user interface is used to present the user ratings, scores and the factors used to make the ratings and score. The user selects the factors, which is the user indicating selection of factors to create the reliability index, making the user informed about how the reliability index is created); and present, in a second user interface, a message from the at least one of the application and the service that provides the explanation of the generation of the reliability index including a description of the one or more influence factors used to generate the reliability index when the generated reliability index deviates in excess of the predefined threshold value from at least one of a set value and a range of values (Psota [0139-142] the entity score is determined based on the selected transactional data, or may be based on a set rule to make the determination, the entity score is presented to users, and the user may choose to select different factors to change the entity score originally presented; in order for a user to change or select the factors, the details of how that particular entity score is 
calculated is presented to the user, and the user may change the factors). Regarding claim 9, Psota teaches the method of claim 8, wherein the generated reliability index is formatted as at least one of a percentage value, a color-coded index, a ranked value, a positive or negative index, and combinations thereof (Psota [0266] the rating may be in the form of a value, integer, percentage, and then the rating may be used to rank). Regarding claim 10, Psota teaches the method of claim 8, further comprising designating the one or more influence factors to specific attributes of the at least one reporting attribute of the set of reporting data (Psota [0141] the user interface is able to select different factors for specific attributes). Regarding claim 11, Psota teaches the method of claim 8, wherein the reliability index for the at least one reporting attribute of the set of reporting data is regenerated to update the reliability index, the reliability index being updated in response to at least one of a manual input and a trigger from an autonomous system associated with the set of reporting data (Psota [0194] continuous updating is accomplished, since new data is regularly added, and may be automatic, or in batches). 
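The weighted-factor scoring that the mapping above repeatedly attributes to Psota [0141-0142] and [0171] (user-weighted influence factors combined into a single rating) can be illustrated with a short sketch. Everything here is a hypothetical assumption for illustration only: the factor names, weights, the normalized weighted average, and the percentage formatting (cf. claim 9) are not drawn from Psota or the instant claims.

```python
# Hypothetical sketch: a reliability index as a weight-normalized average of
# influence-factor scores, where the user assigns each factor's weight.

def reliability_index(factor_scores, weights):
    """Return a reliability index in [0, 1] from weighted influence factors."""
    total_weight = sum(weights[f] for f in factor_scores)
    if total_weight == 0:
        raise ValueError("at least one influence factor must carry weight")
    return sum(factor_scores[f] * weights[f] for f in factor_scores) / total_weight

# Example: a user prioritizes (gives more weight to) "source_freshness".
scores = {"source_freshness": 0.9, "completeness": 0.6, "consistency": 0.8}
weights = {"source_freshness": 3.0, "completeness": 1.0, "consistency": 1.0}

index = reliability_index(scores, weights)
print(f"{index:.0%}")  # prints "82%" -- the index formatted as a percentage
```

Raising a factor's weight pulls the index toward that factor's score, which is the sense in which user prioritization "makes some factors more important than others" in the rejection's reading of Psota.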
Regarding claim 12, Psota teaches the method of claim 8, further comprising: receiving, by the processor, a record of the generated reliability index for the set of reporting data (Psota [0025] the information obtained and analyzed is collected and stored; [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight, the rating score is the reliability index, because it uses weighted factors to create a final score; [0171] a user may prioritize, and therefore give more weight, to specific factors); retrieving, by the processor, a record including a value for at least one reporting attribute of the set of reporting data (Psota [0025] method and system to collect and store a plurality of data that is aggregated and analyzed, the analysis is accomplished using prioritization factors chosen by the user, with a specific rating system); associating, by the processor, the value for the at least one reporting attribute for the set of reporting data and the generated reliability index corresponding to the at least one reporting attribute for the set of reporting data (Psota [0153] the attributes of records and the value determined based on the attributes is assigned to the specific record, and stored and associated); and storing a record of the value for the at least one reporting attribute for the set of reporting data associated with the generated reliability index corresponding to the at least one reporting attribute of the set of reporting data (Psota [0025]; [0141]; [0153] the information received from the individual reports is assigned weighted attributes, and those attributes are used to generate a value based on important attributes assigned by the user, and that value is associated with a specific record which is stored in a database, and may be retrieved and later referenced; [0171] a user may prioritize, and therefore give more weight, to specific 
factors). Regarding claim 13, Psota teaches the method of claim 12, further comprising generating a visualization including a representation of the value for the at least one reporting attribute for the set of reporting data and the generated reliability index corresponding to the at least one reporting attribute associated therewith (Psota [0498] the information can be displayed in a visualization, that can use a map which associates the information spatially with its location). Regarding claim 14, Psota teaches the method of claim 13, wherein the generated reliability index corresponding to the at least one reporting attribute of the set of reporting data is, within the generated visualization, spatially associated with the representation of the value for the at least one reporting attribute for the set of reporting data (Psota [0498] the information can be displayed in a visualization, that can use a map which associates the information spatially with its location). Regarding claim 15, Psota teaches a non-transient, computer-readable medium storing instructions to be executed by a processor to perform a method (Psota [0186] the records may be stored in a computer memory; [0530] processor stores the particular application on an internal or external memory, Claim 1), the method comprising: receiving, via the graphical user interface (UI) (Psota [0081] Fig. 
12, depicts a user interface), a specification that identifies one or more influence factors related to a set of reporting data, the set of reporting data including at least one reporting attribute (Psota [0025] method and system to collect and store a plurality of data that is aggregated and analyzed, the analysis is accomplished using prioritization factors chosen by the user, with a specific rating system) and the specification identifying the at least one reporting attribute of the set of reporting data particularly associated with the one or more influence factors (Psota [0153] the attributes of records and the value determined based on the attributes is assigned to the specific record, and stored and associated); receiving, via the UI (Psota [0081] Fig. 12, depicts a user interface), an input to designate a weight factor to each of the one or more influence factors for each of the at least one reporting attribute of the set of reporting data (Psota [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight; [0171] a user may prioritize, and therefore give more weight, to specific factors); generating, based on the specification of the one or more influence factors and the weight factor designated to each of the one or more influence factors, a reliability index indicative of a data reliability for the at least one reporting attribute of the set of reporting data (Psota [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight, the rating score is the reliability index, because it uses weighted factors to create a final score; [0171] a user may prioritize, and therefore give more weight, to specific factors); storing a record of the generated reliability index in a data storage device (Psota [0025] the information 
obtained and analyzed is collected and stored); generating, in the UI (Psota [0081] Fig. 12, depicts a user interface), a visualization including a representation of a combination of the set of reporting data and the generated reliability index associated therewith, the generated reliability index being spatially associated with the representation of the value for the at least one reporting attribute of the set of reporting data within the generated visualization (Psota [0498] the information can be displayed in a visualization, that can use a map which associates the information spatially with its location; the information being the attributes used to make an analysis); receiving, via a transmission from a remote alert system monitoring the generated reliability index, an alert in response to the value for the at least one reporting attribute of the set of reporting data associated with the generated reliability index deviating in excess of a predefined threshold value from at least one of a set value and a range of values (Psota [0331] an alert may be sent based on a change in data; [0025] the information obtained and analyzed is collected and stored; [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight, the rating score is the reliability index, because it uses weighted factors to create a final score; the alert may be a negative alert, which would only be triggered if the attribute it is monitoring is outside of the threshold value, and an attribute that is monitored can be the reliability calculated from the database; [0171] a user may prioritize, and therefore give more weight, to specific factors); receiving, via the UI (Psota [0081] Fig. 
12, depicts a user interface), an indication of a selection of the generated reliability index included in association with the set of reporting data including the at least one reporting attribute of the set of reporting data within a presentation of the generated visualization (Psota [0139-141] the entity score is determined based on the selected transactional data, or may be based on a set rule to make the determination, the entity score is presented to users, and the user may choose to select different factors to change the entity score originally presented); navigate, in response to receiving, via the UI (Psota [0081] Fig. 12, depicts a user interface), the indication of the selection of the generated reliability index within the generated visualization, to at least one of an application and a service that provides an explanation of the generation of the reliability index (Psota [0141-142] the user interface is used to present the user ratings, scores and the factors used to make the ratings and score. 
The user selects the factors, which is the user indicating selection of factors to create the reliability index, making the user informed about how the reliability index is created); and present, in a second user interface, a message from the at least one of the application and the service that provides the explanation of the generation of the reliability index including a description of the one or more influence factors used to generate the reliability index when the generated reliability index deviates in excess of the predefined threshold value from at least one of a set value and a range of values (Psota [0139-142] the entity score is determined based on the selected transactional data, or may be based on a set rule to make the determination, the entity score is presented to users, and the user may choose to select different factors to change the entity score originally presented; in order for a user to change or select the factors, the details of how that particular entity score is calculated is presented to the user, and the user may change the factors). Regarding claim 16, Psota teaches the medium of claim 15, wherein the generated reliability index is formatted as at least one of a percentage value, a color-coded index, a ranked value, a positive or negative index, and combinations thereof (Psota [0266] the rating may be in the form of a value, integer, percentage, and then the rating may be used to rank). Regarding claim 17, Psota teaches the medium of claim 15, wherein the method to be performed by the processor executing the stored instructions thereon further comprises designating the one or more influence factors to specific attributes of the at least one reporting attribute of the set of reporting data (Psota [0141] the user interface is able to select different factors for specific attributes). 
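The alert limitation mapped above (an alert when the monitored value deviates in excess of a predefined threshold from a set value or a range of values) can be sketched as a simple check. This is a hypothetical illustration of the claimed condition, not an implementation from Psota or the instant application; the function name, parameters, and threshold values are all assumptions.

```python
# Hypothetical sketch: trigger an alert when a monitored value deviates from a
# set value, or falls outside a range of values, by more than a threshold.

def deviation_alert(value, set_value=None, value_range=None, threshold=0.0):
    """Return True when `value` deviates in excess of `threshold`."""
    if set_value is not None:
        return abs(value - set_value) > threshold
    if value_range is not None:
        low, high = value_range
        if low <= value <= high:
            return False  # inside the acceptable range: no deviation
        # Deviation is measured to the nearest end of the range.
        return min(abs(value - low), abs(value - high)) > threshold
    raise ValueError("a set value or a range of values is required")

# Example: a reliability index well below the acceptable range triggers the
# (negative) alert; a value just below it does not exceed the threshold.
print(deviation_alert(0.45, value_range=(0.7, 1.0), threshold=0.1))  # True
print(deviation_alert(0.65, value_range=(0.7, 1.0), threshold=0.1))  # False
```

This matches the rejection's reading of Psota [0331] that a "negative alert" fires only when the monitored attribute is outside the threshold value.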
Regarding claim 18, Psota teaches the medium of claim 15, wherein the reliability index for the at least one reporting attribute of the set of reporting data is regenerated to update the reliability index, the reliability index being updated in response to at least one of a manual input and a trigger from an autonomous system associated with the set of reporting data (Psota [0194] continuous updating is accomplished, since new data is regularly added, and may be automatic, or in batches). Regarding claim 19, Psota teaches the medium of claim 15, wherein the method to be performed by the processor executing the stored instructions thereon further comprises: receiving a record of the generated reliability index for the set of reporting data (Psota [0025] the information obtained and analyzed is collected and stored; [0141-0142] generating an overall score based on different factors, each factor having a different prioritization weight, making each influence factor having a specific weight, the rating score is the reliability index, because it uses weighted factors to create a final score; [0171] a user may prioritize, and therefore give more weight, to specific factors); retrieving a record including a value for at least one reporting attribute of the set of reporting data (Psota [0025] method and system to collect and store a plurality of data that is aggregated and analyzed, the analysis is accomplished using prioritization factors chosen by the user, with a specific rating system); associating the value for the at least one reporting attribute for the set of reporting data and the generated reliability index corresponding to the at least one reporting attribute for the set of reporting data (Psota [0153] the attributes of records and the value determined based on the attributes is assigned to the specific record, and stored and associated); and storing a record of the value for the at least one reporting attribute for the set of reporting data associated with the 
generated reliability index corresponding to the at least one reporting attribute of the set of reporting data (Psota [0025]; [0141]; [0153] the information received from the individual reports is assigned weighted attributes, and those attributes are used to generate a value based on important attributes assigned by the user, and that value is associated with a specific record which is stored in a database, and may be retrieved and later referenced; [0171] a user may prioritize, and therefore give more weight, to specific factors). Regarding claim 20, Psota teaches the medium of claim 19, further comprising generating a visualization including a representation of the value for the at least one reporting attribute for the set of reporting data and the generated reliability index corresponding to the at least one reporting attribute associated therewith (Psota [0498] the information can be displayed in a visualization, that can use a map which associates the information spatially with its location). Response to Arguments Applicant's arguments filed 01/26/2026 have been fully considered but they are not persuasive. Regarding § 101, Step 2A, Prong 1 (Abstract Ideas). Applicant describes how the specification of the instant invention discloses that a reliability index is generated using reporting attributes from a set of reporting data, and that the reliability index is an indicator of, for example, trustworthiness, soundness, and veracity. The information is received from a user interface, and Applicant specifically points to Fig. 6, which shows multiple boxes connected to each other, those boxes representing calculations and determinations performed by the machine. Applicant also discusses multiple different embodiments found within the specification that include generating a visualization and navigating an application. 
Examiner notes that no mathematical formulas are recited in the claim limitations, but the focus of Applicant's arguments has been on the specification's support for "reliability factors," which are equations and relationships between data points. Therefore, if the reliability factors are not relationships between data points, they are data points that a person may use to make decisions, and those decisions are based on relationships. As demonstrated by the specification, the reliability index factors are calculations, and therefore Examiner's analysis that the generation steps are to be grouped as a mathematical calculation is accurate. Additionally, the generation is done for business purposes, and making decisions based on sales information can be grouped as a certain method of organizing human activity; finally, when using calculations to make decisions, the act of making a choice can be performed mentally, without the need of a computer, and can therefore be grouped as a mental process. Step 2A, Prong 2 (Integration into a Practical Application). Applicant contends the calculations are a technical solution that would integrate the abstract idea into a practical application. However, under MPEP 2106.05(f) the claim must recite details as to how a solution to a problem is accomplished in order to indicate that the claim recites more than the abstract idea. Applicant is relying on the abstract idea itself, the calculation, to showcase the technical solution. The mathematical calculation is not a technical solution; it is the abstract idea. In order to integrate the calculation into a practical application, there must be an element so integrated with the abstract idea that it amounts to more than just a tool to perform the abstract idea. However, the claims as recited merely state that the user interface and computer process the calculations, and using computers as tools to implement the abstract idea does not indicate integration into a practical application. 
The claims do not need to recite "business" or "finance" in exact words to fall within certain methods of organizing human activity, but as recited by the instant claims, the idea is to create a reliability index based on influence factors and reporting data. Examiner notes that these terms are broad and can mean any variety of things, and therefore looked to the Specification for insight. Specification [0001] discusses how an organization may want to gain insights into its process, organization, system, or other entity, which may include snapshots of key performance indicators, and continues in [0002], where the invention might be used to analyze financial postings; therefore, the problem and goals of the instant claims are to make an analysis of how an organization (a business) operates. For the reasons set forth above, claims 1-20 remain ineligible under § 101. Regarding § 102, Applicant points specifically to a user making a selection of various parameters to generate a rating score as not being found within Psota. However, Examiner points to Psota [0141-142], which teaches that an entity rating, correlated to the reporting attribute, is calculated using factors, correlating to the influence factors, and that a user can select a degree of specialization that makes some factors more important than others, which correlates to the weight factor. Additionally, Psota [0171] teaches that a user may prioritize, and therefore give more weight to, specific factors. Therefore, Psota teaches that a user may designate a weight factor (a specific weight assigned) to an influence factor (attribute), yielding an overall reporting attribute (entity rating), and that the user makes the decision to give specific factors more weight. Examiner does not see where in the claim language Applicant argues that the "other entity" must be opposing buyer and seller. 
The claim limitation specifically relating to navigation responds to a user selection by navigating to a service that provides an explanation of the generated reliability index. Examiner cited Psota [0141-142], where the user interface presents ratings, scores, and details about the index, and therefore teaches the limitation of giving the user an explanation of the reliability index. Therefore, Psota teaches the claimed invention, and claims 1-20 remain rejected under § 102. Prior Art The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US 2018/0150777 A1 Han et al. teaches management of plural documents over a network (Abstract); US 11,126,635 B2 Behzadi et al. teaches processing data (Abstract); US 2012/0221370 A1 Ostertag et al. teaches exchanging information (Abstract); US 2018/0168028 A1 Kruger et al. teaches that the sources are checked for accuracy (Abstract); US 2021/0326392 A1 Andrus et al. teaches assigning attributes (Abstract). Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to JESSICA E SULLIVAN whose telephone number is (571)272-9501. The examiner can normally be reached M-Th, 9:00 AM-5:00 PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, FAHD OBEID, can be reached at (571) 270-3324. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. 
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /JESSICA E SULLIVAN/ Examiner, Art Unit 3627 /AARON TUTOR/Primary Examiner, Art Unit 3627

Prosecution Timeline

Jul 07, 2022
Application Filed
May 29, 2024
Non-Final Rejection — §101, §102
Aug 20, 2024
Examiner Interview Summary
Aug 20, 2024
Applicant Interview (Telephonic)
Aug 30, 2024
Response Filed
Dec 01, 2024
Final Rejection — §101, §102
Jan 24, 2025
Applicant Interview (Telephonic)
Jan 28, 2025
Examiner Interview Summary
Feb 05, 2025
Response after Non-Final Action
Feb 26, 2025
Request for Continued Examination
Feb 27, 2025
Response after Non-Final Action
May 22, 2025
Non-Final Rejection — §101, §102
Aug 11, 2025
Applicant Interview (Telephonic)
Aug 12, 2025
Examiner Interview Summary
Aug 22, 2025
Response Filed
Nov 20, 2025
Final Rejection — §101, §102
Jan 26, 2026
Request for Continued Examination
Feb 20, 2026
Response after Non-Final Action
Mar 17, 2026
Non-Final Rejection — §101, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12548088
Transaction data processing systems and methods
2y 5m to grant Granted Feb 10, 2026
Patent 12524817
Transaction data processing systems and methods
2y 5m to grant Granted Jan 13, 2026
Patent 12511635
NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM, NOTIFICATION METHOD, AND INFORMATION PROCESSING DEVICE
2y 5m to grant Granted Dec 30, 2025
Patent 12499491
INTELLIGENT PLATFORM FOR AUDIT RESPONSE USING A METAVERSE-DRIVEN APPROACH FOR REGULATOR REPORTING REQUIREMENTS
2y 5m to grant Granted Dec 16, 2025
Patent 12462236
LOTTERY TICKET DATA INTERCEPTOR FOR A POINT-OF-SALE SYSTEM
2y 5m to grant Granted Nov 04, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
15%
Grant Probability
36%
With Interview (+21.4%)
3y 7m
Median Time to Grant
High
PTA Risk
Based on 108 resolved cases by this examiner. Grant probability derived from career allow rate.
