Prosecution Insights
Last updated: April 19, 2026
Application No. 18/323,930

APPLICATION PROGRAMMING INTERFACE SIMULATION SYSTEM AND METHOD

Status: Non-Final OA (§103)
Filed: May 25, 2023
Examiner: GONZALES, VINCENT
Art Unit: 2124
Tech Center: 2100 — Computer Architecture & Software
Assignee: Capital One Services LLC
OA Round: 1 (Non-Final)

Grant Probability: 78% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 6m
With Interview: 89%

Examiner Intelligence

Career Allow Rate: 78% — above average (410 granted / 522 resolved; +23.5% vs TC avg)
Interview Lift: +10.5% among resolved cases with interview (moderate lift)
Avg Prosecution: 3y 6m (26 applications currently pending)
Total Applications: 548 across all art units

Statute-Specific Performance

§101: 21.2% (-18.8% vs TC avg)
§103: 39.9% (-0.1% vs TC avg)
§102: 13.2% (-26.8% vs TC avg)
§112: 14.6% (-25.4% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 522 resolved cases

Office Action

§103
DETAILED ACTION

This action is written in response to the application filed 5/25/2023. The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Subject Matter Eligibility

In determining whether the claims are subject matter eligible, the examiner has considered and applied the 2019 USPTO Patent Eligibility Guidelines, as well as guidance in MPEP chapter 2106. The examiner finds that the independent claims are directed to the practical application of simulating an Application Programming Interface (API) on a computer.

Claim Objections

The Examiner objects to dependent claim 16, which recites "herein the instructions…". This should instead recite "wherein the instructions…". Appropriate correction is required.

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C. 103(a), which forms the basis for all obviousness rejections set forth in this Office action:

(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made.

The following are the references relied upon in the rejections below:

Jha (US 10,705,942 B1)
Muttik (US 2018/0097829 A1)

Claims 1-6, 9-14, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Jha.

Regarding claims 1, 10, and 18, Jha discloses a computer-implemented method comprising:

receiving, by a simulator generator, an application programming interface (API) specification defining one or more parameters of an API;

Col. 10, lines 17 et seq.: "At 202, a design of an API is received.
In some embodiments, receiving the design includes receiving an API definition. Examples of the API definition include a specification (e.g., Swagger specification), a schema, and other identification of one or more properties and/or capabilities of an API."

automatically generating, by the simulator generator, simulated data based on the API specification;

Col. 10: "At 204, the API is tested. In some embodiments, testing the API includes verifying whether the API will function as desired, e.g., provide expected responses when an API call is made. In some embodiments, testing the API includes modeling the API definition of the API and simulating the API without requiring a developer to provide server implementation code of the API. For example, a simulated response to a call to an operation defined in the API definition is provided to a developer to allow the developer to test and verify the design of the operation." (Emphasis added.)

determining, by a simulator manager, an API type based on the API specification and the simulated data, the API type defining a statefulness of the API;

Col. 10, lines 17 et seq.: "In some embodiments, receiving the design includes receiving an API definition. The API definition defines how an underlying resource can be accessed. Examples of the API definition include a specification (e.g., Swagger specification), a schema, and other identification of one or more properties and/or capabilities of an API. The API definition may expose content and/or services (e.g., API resources) to internal or external audiences. The design of the API definition may be performed by a developer in a development environment. For example, using the development interface shown in FIG. 3, a developer creates, imports, and/or edits the API definition. In some embodiments, the received API definition is validated automatically and API documentation is automatically generated." (Emphasis added.)

Col. 18, lines 18 et seq.: "At 506, a simulation state is stored.
For example, a simulated response to the simulated execution is stored in storage 106 shown in FIG. 1A. By storing the simulation state, the simulation/testing state can be referenced later. Additionally, multiple developers may access the simulation state, and the stored simulation state allows the developers to collaborate to develop the API. For example, a link such as a universal resource identifier (URI) linked to a simulated state/response of the API development environment enables a developer to access the API in the associated state. That is, a link to the development environment in a saved state is accessible by users with knowledge of the URI, allowing for easy collaboration." (Emphasis added.)

automatically generating, by the simulator generator, a simulation of the API based on the API specification and the simulated data; and

Col. 10, lines 32 et seq.: "At 204, the API is tested. In some embodiments, testing the API includes verifying whether the API will function as desired, e.g., provide expected responses when an API call is made. In some embodiments, testing the API includes modeling the API definition of the API and simulating the API without requiring a developer to provide server implementation code of the API. For example, a simulated response to a call to an operation defined in the API definition is provided to a developer to allow the developer to test and verify the design of the operation." (Emphasis added.)

deploying, by a simulator engine, the simulated API based on the determined API type to facilitate access to the simulated API by an external agent.

Col. 10, lines 41 et seq.: "In some embodiments, a test server is automatically created to host a simulated version of the API that returns simulated/mock responses to a call to the API.
Not only does this simplify testing of the design of the API definition, it allows parallel development of user applications that will be utilizing the API before the implementation code of the API has been fully developed." (Emphasis added.)

Although Jha clearly discloses the ability to simulate stateful/dynamic APIs (see the excerpt from col. 18, supra), Jha does not clearly describe simulating both stateful/dynamic and stateless/static APIs. However, at the time of filing, it would have been obvious to a person of ordinary skill to use the received "specification… schema, and other identification of one or more properties and/or capabilities" to simulate either type of API. Static APIs are simpler in the sense that they do not require a stored state. Whether static or dynamic is preferable depends on the intended use.

Regarding independent claim 10, Jha also discloses its further limitations comprising "one or more processors" and "a memory storing instructions" (see col. 2, lines 25 et seq.).

Regarding independent claim 18, Jha discloses its further limitation comprising:

… generate, by the simulator engine using the simulated API and based on an API request received from the external computing device, an API response; and

Col. 1, lines 60 et seq.: "FIG. 6D is an example of a GUI for an API development environment including a simulation function with an example response." (cont.) "FIG. 6E is an example of a GUI for an API development environment including a simulation function with an example response."

provide, by the simulator engine, the API response to the external computing device. Id.

Regarding claims 2, 11, and 19, Jha discloses the further limitation comprising creating, based on the determined API type, a tenancy for the simulated API, wherein the tenancy comprises one or more dedicated resources for the simulated API.

Col. 4, lines 61 et seq.:
"API service platform 102 provides its services using one or more API edge servers and/or one or more test servers that each handles requests. In some embodiments, the API service platform includes a plurality of edge servers. A plurality of API edge servers may assist fault tolerance, load balancing, and geographical distribution."

Regarding claims 3 and 12, Jha discloses the further limitation wherein the tenancy is a dynamic tenancy configured to support a stateful API simulation.

Col. 18, lines 18 et seq.: "At 506, a simulation state is stored. For example, a simulated response to the simulated execution is stored in storage 106 shown in FIG. 1A. By storing the simulation state, the simulation/testing state can be referenced later. Additionally, multiple developers may access the simulation state, and the stored simulation state allows the developers to collaborate to develop the API. For example, a link such as a universal resource identifier (URI) linked to a simulated state/response of the API development environment enables a developer to access the API in the associated state. That is, a link to the development environment in a saved state is accessible by users with knowledge of the URI, allowing for easy collaboration." (Emphasis added.)

Col. 4, lines 61 et seq.: "API service platform 102 provides its services using one or more API edge servers and/or one or more test servers that each handles requests. In some embodiments, the API service platform includes a plurality of edge servers. A plurality of API edge servers may assist fault tolerance, load balancing, and geographical distribution."

Regarding claims 4 and 13, Jha discloses the further limitation comprising:

receiving, by the simulator engine, an API request from an external agent; and

Figs. 6A and 6B. Col. 4, lines 61 et seq.: "API service platform 102 provides its services using one or more API edge servers and/or one or more test servers that each handles requests.
In some embodiments, the API service platform includes a plurality of edge servers. A plurality of API edge servers may assist fault tolerance, load balancing, and geographical distribution."

providing, based on the API type, a stateful API response corresponding to the received API request using a state manager engine.

Col. 18, lines 18 et seq.: "At 506, a simulation state is stored. For example, a simulated response to the simulated execution is stored in storage 106 shown in FIG. 1A. By storing the simulation state, the simulation/testing state can be referenced later. Additionally, multiple developers may access the simulation state, and the stored simulation state allows the developers to collaborate to develop the API. For example, a link such as a universal resource identifier (URI) linked to a simulated state/response of the API development environment enables a developer to access the API in the associated state. That is, a link to the development environment in a saved state is accessible by users with knowledge of the URI, allowing for easy collaboration." (Emphasis added.) See also col. 5, lines 10 et seq., discussing API responses.

Regarding claim 5, Jha discloses the further limitation comprising retrieving the stateful API response from a datastore using the state manager engine.

Col. 18, lines 18 et seq.: "At 506, a simulation state is stored. For example, a simulated response to the simulated execution is stored in storage 106 shown in FIG. 1A. By storing the simulation state, the simulation/testing state can be referenced later. Additionally, multiple developers may access the simulation state, and the stored simulation state allows the developers to collaborate to develop the API. For example, a link such as a universal resource identifier (URI) linked to a simulated state/response of the API development environment enables a developer to access the API in the associated state.
That is, a link to the development environment in a saved state is accessible by users with knowledge of the URI, allowing for easy collaboration." (Emphasis added.)

Col. 8: "For example, processor 152 can also directly and very rapidly retrieve and store frequently needed data in a cache memory included in memory 160."

Regarding claims 6, 14, and 20, Jha discloses the further limitation comprising generating, based on the API specification and the simulated data, a simulated frontend interface configured for the simulated API. Fig. 6A, reproduced below.

[Fig. 6A of Jha, reproduced in the original Office Action.] See also Figs. 6-10.

Regarding claims 9 and 17, Jha discloses the further limitation wherein generating the simulated data is based on at least one of: one or more data parameters defining one or more data requirements, or one or more data-field parameters defining field constraints of a corresponding frontend interface.

Col. 10, lines 17 et seq.: "At 202, a design of an API is received. In some embodiments, receiving the design includes receiving an API definition. Examples of the API definition include a specification (e.g., Swagger specification), a schema, and other identification of one or more properties and/or capabilities of an API."

Claims 7-8 and 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over Jha and Muttik.

Regarding claims 7 and 15, Muttik discloses the further limitation, which Jha does not disclose, wherein generating the simulated data comprises anonymizing a dataset by removing non-public information from the dataset.

[0082]: "Data aggregator 410 aggregates the data provided by data collector 408. This may include categorizing, compressing, packaging, sorting, organizing, or otherwise processing data to make it suitable for export to and consumption by a third party. Note that data aggregator 410 may also have the task of masking, obfuscating, or rejecting certain portions of the data collected by data collector 408.
For example, if the data include emails, data aggregator 410 may redact specific email addresses and other personally identifying information. Other types of data that may be obfuscated or redacted include names, addresses, phone number, social security numbers, account numbers, credit card data, billing data, personal preferences, locations, or any other sensitive data. This enables secured enterprise 100 to export certain documents that contain sensitive data without compromising those sensitive data." (Emphasis added.)

At the time of filing, it would have been obvious to a person of ordinary skill to apply the redaction techniques of Muttik to the API simulation system of Jha because this would protect confidential, private, or sensitive data from public disclosure. In some cases, this may be required by law. Both disclosures pertain to APIs.

Regarding claims 8 and 16, Muttik discloses the further limitation, which Jha does not disclose, wherein generating the simulated data comprises generating a dummy dataset free of non-public information.

[0082]: "Data aggregator 410 aggregates the data provided by data collector 408. This may include categorizing, compressing, packaging, sorting, organizing, or otherwise processing data to make it suitable for export to and consumption by a third party. Note that data aggregator 410 may also have the task of masking, obfuscating, or rejecting certain portions of the data collected by data collector 408. For example, if the data include emails, data aggregator 410 may redact specific email addresses and other personally identifying information. Other types of data that may be obfuscated or redacted include names, addresses, phone number, social security numbers, account numbers, credit card data, billing data, personal preferences, locations, or any other sensitive data. This enables secured enterprise 100 to export certain documents that contain sensitive data without compromising those sensitive data." (Emphasis added.)
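The masking/redaction step Muttik describes in [0082] can be illustrated with a minimal sketch. The patterns, placeholder strings, and the `redact` function are illustrative assumptions, not code from either reference:

```python
import re

# Illustrative patterns for two of the data types Muttik lists:
# email addresses and US social security numbers.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact(text: str) -> str:
    """Mask personally identifying information before export."""
    text = EMAIL_RE.sub("[REDACTED EMAIL]", text)
    text = SSN_RE.sub("[REDACTED SSN]", text)
    return text

print(redact("Contact jane.doe@example.com, SSN 123-45-6789."))
# -> Contact [REDACTED EMAIL], SSN [REDACTED SSN].
```

A production system would extend this to the other categories Muttik enumerates (names, account numbers, locations), typically with dedicated PII-detection tooling rather than ad hoc regexes.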
Additional Relevant Prior Art

The following reference was identified by the Examiner as being relevant to the disclosed invention, but is not relied upon in any particular prior art rejection: Sarid (US 10,216,554 B2) discloses an API notebook tool, facilitating the creation, testing, and documentation of APIs.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Vincent Gonzales, whose telephone number is (571) 270-3837. The examiner can normally be reached Monday-Friday, 7 a.m. to 4 p.m. MT. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Miranda Huang, can be reached at (571) 270-7092.

Information regarding the status of an application may be obtained from the USPTO Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.

/Vincent Gonzales/
Primary Examiner, Art Unit 2124
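For readers outside the software arts, the flow recited in independent claim 1 (receive an API specification, generate simulated data, classify the API as stateful or stateless, deploy a simulation) can be sketched as below. All names and the statefulness heuristic are hypothetical illustrations; neither the application as filed nor Jha discloses this code:

```python
from dataclasses import dataclass, field

@dataclass
class ApiSpec:
    # Hypothetical stand-in for an API specification (e.g., a
    # Swagger/OpenAPI document): operation name -> example payload.
    operations: dict

def generate_simulated_data(spec: ApiSpec) -> dict:
    """Derive mock responses from the specification's example payloads."""
    return {op: {"example": payload} for op, payload in spec.operations.items()}

def determine_api_type(spec: ApiSpec) -> str:
    """Toy heuristic: write-style operations imply a stateful API."""
    writes = ("create", "update", "delete")
    stateful = any(op.startswith(writes) for op in spec.operations)
    return "stateful" if stateful else "stateless"

@dataclass
class SimulatedApi:
    api_type: str
    responses: dict
    state: dict = field(default_factory=dict)

    def handle(self, request: str) -> dict:
        # A stateful simulation tracks how often each operation was called.
        if self.api_type == "stateful":
            self.state[request] = self.state.get(request, 0) + 1
        return self.responses.get(request, {"error": "unknown operation"})

spec = ApiSpec({"get_account": {"id": 1}, "update_account": {"ok": True}})
sim = SimulatedApi(determine_api_type(spec), generate_simulated_data(spec))
print(sim.api_type)               # stateful (because of update_account)
print(sim.handle("get_account"))  # {'example': {'id': 1}}
```

The dispute framed by the rejection is narrower than this sketch: Jha's col. 18 state-storage passage covers the stateful branch, and the examiner argues the stateless branch is an obvious simplification.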

Prosecution Timeline

May 25, 2023
Application Filed
Jan 23, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585920
PREDICTING OPTIMAL PARAMETERS FOR PHYSICAL DESIGN SYNTHESIS
2y 5m to grant Granted Mar 24, 2026
Patent 12580040
DIFFUSION MODEL FOR GENERATIVE PROTEIN DESIGN
2y 5m to grant Granted Mar 17, 2026
Patent 12566984
METHODS AND SYSTEMS FOR EXPLAINING ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
2y 5m to grant Granted Mar 03, 2026
Patent 12561402
IDENTIFICATION OF A SECTION OF BODILY TISSUE FOR PATHOLOGY TESTS
2y 5m to grant Granted Feb 24, 2026
Patent 12547647
Unsupervised Machine Learning System to Automate Functions On a Graph Structure
2y 5m to grant Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 78%
With Interview: 89% (+10.5%)
Median Time to Grant: 3y 6m
PTA Risk: Low
Based on 522 resolved cases by this examiner. Grant probability derived from career allow rate.
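The "With Interview" figure appears to be the career allow rate plus the interview lift, rounded; a back-of-the-envelope check (the additive model is an assumption about how the dashboard computes it):

```python
base = 78.0   # career allow rate, in percent
lift = 10.5   # observed interview lift, in percentage points
projected = min(base + lift, 100.0)  # cap at 100%
print(f"{projected:.1f}%")  # 88.5%, displayed on the dashboard as 89%
```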
