Prosecution Insights
Last updated: April 19, 2026
Application No. 19/179,499

Estimating Query Execution Performance Using A Sampled Counter

Non-Final OA (§DP)
Filed: Apr 15, 2025
Examiner: OBISESAN, AUGUSTINE KUNLE
Art Unit: 2156
Tech Center: 2100 — Computer Architecture & Software
Assignee: Oracle International Corporation
OA Round: 1 (Non-Final)
Grant Probability: 64% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 8m
Grant Probability With Interview: 86%

Examiner Intelligence

Grants 64% of resolved cases.
Career Allow Rate: 64% (480 granted / 755 resolved; +8.6% vs TC avg)
Interview Lift: +22.5% on resolved cases with an interview (strong)
Typical Timeline: 3y 8m avg prosecution; 34 currently pending
Career History: 789 total applications across all art units

Statute-Specific Performance

§101: 15.0% (-25.0% vs TC avg)
§103: 58.8% (+18.8% vs TC avg)
§102: 13.3% (-26.7% vs TC avg)
§112: 5.9% (-34.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 755 resolved cases.
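Each examiner rate and its delta together imply the Tech Center baseline. A quick check of the table's arithmetic (figures copied from the table above; the variable names are mine):

```python
# Recover the implied Tech Center average from each examiner rate and
# its delta vs TC avg (all figures from the statute table on this page).
examiner_rate = {"101": 15.0, "103": 58.8, "102": 13.3, "112": 5.9}
delta_vs_tc   = {"101": -25.0, "103": 18.8, "102": -26.7, "112": -34.1}

tc_average = {s: round(examiner_rate[s] - delta_vs_tc[s], 1)
              for s in examiner_rate}
print(tc_average)  # every statute backs out to the same 40.0% baseline
```

All four statutes back out to the same 40.0% figure, consistent with the page using a single Tech Center average estimate across statutes.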

Office Action

§DP
DETAILED ACTION

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

2. This action is in response to the application filed on 4/15/2025, in which claims 1-20 were presented for examination.

3. Claims 1-20 are pending in the application.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines which form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

4. Claims 1, 10, and 19 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 10, and 19 of U.S. Patent No. US 12,292,886 B2. Although the claims at issue are not identical, they are not patentably distinct from each other because both applications disclose some similar features. Additionally, both applications disclose features that are not similar to each other, as enumerated in the detailed comparison below.

Instant Application: 19/179,499 | Patent: US 12,292,886 B2
Instant Application 19/179,499, Claim 1:
A method comprising: monitoring database activity by generating sample data for a set of one or more active database sessions at sample intervals, wherein the sample data is generated for a fraction of total database queries that are executed within a sample interval and are part of the database activity; identifying, in the sample data, a set of active database queries and sample counter values associated with the set of active database queries at different sample interval times, wherein the sample counter values include a first sample counter value that tracks executions of an individual database query at a first sample time and a second sample counter value that tracks executions of the individual database query at a second sample time; estimating, based at least on the sample counter values associated with the set of active database queries at the different sample interval times, a set of probabilistic performance metrics for the set of active database queries, wherein estimating the set of probabilistic performance metrics for the set of active database queries includes computing a probabilistic performance metric for the individual database query using at least the first sample counter value and the second sample counter value; selecting at least one active database query from the set of active database queries for non-probabilistic monitoring based on the set of probabilistic performance metrics; and responsive to selecting the at least one active database query, tracking a set of one or more non-sampled performance metrics associated with the at least one active database query.

US 12,292,886 B2, Claim 1:
A method comprising: probabilistically monitoring database activity by generating sample data for a set of one or more active database sessions at sample intervals, wherein the sample data is generated for a fraction of total database queries that are executed within a sample interval and are part of the database activity; identifying, in the sample data, a set of active database queries and sample counter values associated with the set of active database queries at different sample interval times, wherein the sample counter values include a first sample counter value that tracks executions of an individual database query at a first sample time and a second sample counter value that tracks executions of the individual database query at a second sample time; estimating, based at least on the sample counter values associated with the set of active database queries at the different sample interval times, one or more performance metrics for the set of active database queries, wherein estimating the one or more performance metrics for the set of active database queries includes computing a probabilistic performance metric for the individual database query using at least the first sample counter value and the second sample counter value, wherein the probabilistic performance metric is at least one of an estimated execution frequency or latency of the individual database query computed as a function of the first sample counter value and the second sample counter value; and generating, based at least in part on the one or more performance metrics for the set of active database queries, an alert indicative of performance degradation associated with at least one active database query in the set of active database queries.

Instant Application 19/179,499, Claim 10:
One or more non-transitory computer-readable media storing instruction which, when executed by one or more hardware processors, cause: monitoring database activity by generating sample data for a set of one or more active database sessions at sample intervals, wherein the sample data is generated for a fraction of total database queries that are executed within a sample interval and are part of the database activity; identifying, in the sample data, a set of active database queries and sample counter values associated with the set of active database queries at different sample interval times, wherein the sample counter values include a first sample counter value that tracks executions of an individual database query at a first sample time and a second sample counter value that tracks executions of the individual database query at a second sample time; estimating, based at least on the sample counter values associated with the set of active database queries at the different sample interval times, a set of probabilistic performance metrics for the set of active database queries, wherein estimating the set of probabilistic performance metrics for the set of active database queries includes computing a probabilistic performance metric for the individual database query using at least the first sample counter value and the second sample counter value; selecting at least one active database query from the set of active database queries for non-probabilistic monitoring based on the set of probabilistic performance metrics; and responsive to selecting the at least one active database query, tracking a set of one or more non-sampled performance metrics associated with the at least one active database query.

US 12,292,886 B2, Claim 10:
One or more non-transitory computer-readable media storing instruction which, when executed by one or more hardware processors, cause: probabilistically monitoring database activity by generating sample data for a set of one or more active database sessions at sample intervals, wherein the sample data is generated for a fraction of total database queries that are executed within a sample interval and are part of the database activity; identifying, in the sample data, a set of active database queries and sample counter values associated with the set of active database queries at different sample interval times, wherein the sample counter values include a first sample counter value that tracks executions of an individual database query at a first sample time and a second sample counter value that tracks executions of the individual database query at a second sample time; estimating, based at least on the sample counter values associated with the set of active database queries at the different sample interval times, one or more performance metrics for the set of active database queries, wherein estimating the one or more performance metrics for the set of active database queries includes computing a probabilistic performance metric for the individual database query using at least the first sample counter value and the second sample counter value, wherein the probabilistic performance metric is at least one of an estimated execution frequency or latency of the individual database query computed as a function of the first sample counter value and the second sample counter value; and generating, based at least in part on the one or more performance metrics for the set of active database queries, an alert indicative of performance degradation associated with at least one active database query in the set of active database queries.

Instant Application 19/179,499, Claim 19:
A system comprising: one or more hardware processors; one or more non-transitory computer-readable media storing instructions which, when executed by the one or more hardware processors cause: monitoring database activity by generating sample data for a set of one or more active database sessions at sample intervals, wherein the sample data is generated for a fraction of total database queries that are executed within a sample interval and are part of the database activity; identifying, in the sample data, a set of active database queries and sample counter values associated with the set of active database queries at different sample interval times, wherein the sample counter values include a first sample counter value that tracks executions of an individual database query at a first sample time and a second sample counter value that tracks executions of the individual database query at a second sample time; estimating, based at least on the sample counter values associated with the set of active database queries at the different sample interval times, a set of probabilistic performance metrics for the set of active database queries, wherein estimating the set of probabilistic performance metrics for the set of active database queries includes computing a probabilistic performance metric for the individual database query using at least the first sample counter value and the second sample counter value; selecting at least one active database query from the set of active database queries for non-probabilistic monitoring based on the set of probabilistic performance metrics; and responsive to selecting the at least one active database query, tracking a set of one or more non-sampled performance metrics associated with the at least one active database query.

US 12,292,886 B2, Claim 19:
A system comprising: one or more hardware processors; one or more non-transitory computer-readable media storing instructions which, when executed by the one or more hardware processors cause: probabilistically monitoring database activity by generating sample data for a set of one or more active database sessions at sample intervals, wherein the sample data is generated for a fraction of total database queries that are executed within a sample interval and are part of the database activity; identifying, in the sample data, a set of active database queries and sample counter values associated with the set of active database queries at different sample interval times, wherein the sample counter values include a first sample counter value that tracks executions of an individual database query at a first sample time and a second sample counter value that tracks executions of the individual database query at a second sample time; estimating, based at least on the sample counter values associated with the set of active database queries at the different sample interval times, one or more performance metrics for the set of active database queries, wherein estimating the one or more performance metrics for the set of active database queries includes computing a probabilistic performance metric for the individual database query using at least the first sample counter value and the second sample counter value, wherein the probabilistic performance metric is at least one of an estimated execution frequency or latency of the individual database query computed as a function of the first sample counter value and the second sample counter value; and generating, based at least in part on the one or more performance metrics for the set of active database queries, an alert indicative of performance degradation associated with at least one active database query in the set of active database queries.
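Both claim sets turn on the same core idea: deriving a per-query performance estimate from counter values captured at two sample times, then escalating hot queries to exact monitoring. A minimal sketch of that arithmetic (the function name, threshold, and constant-rate assumption are mine, not from the claims):

```python
# Illustrative sketch of the claimed sampled-counter estimation:
# estimate a query's execution rate from two counter readings taken
# at different sample times, assuming a roughly constant rate between them.

def estimated_execution_rate(count_t1: int, count_t2: int,
                             t1: float, t2: float) -> float:
    """Estimate executions/second for one query from sample counter
    values observed at sample times t1 and t2 (seconds, t2 > t1)."""
    if t2 <= t1:
        raise ValueError("t2 must be after t1")
    return (count_t2 - count_t1) / (t2 - t1)

# Example: counter reads 120 at t=10s and 180 at t=40s.
rate = estimated_execution_rate(120, 180, 10.0, 40.0)  # 2.0 executions/sec

# Per the final step of the instant claims, a query whose estimate
# crosses a threshold could be promoted to exact (non-sampled) tracking.
HOT_QUERY_THRESHOLD = 1.5  # hypothetical cutoff, executions/sec
promote_to_exact_monitoring = rate > HOT_QUERY_THRESHOLD
```

The patent claims route the same estimate into an alert on performance degradation instead of a monitoring-mode switch; the counter arithmetic is shared.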
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AUGUSTINE KUNLE OBISESAN, whose telephone number is (571) 272-2020. The examiner can normally be reached 9:00am - 5:00pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ajay Bhatia, can be reached at (571) 272-3906. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AUGUSTINE K. OBISESAN/
Primary Examiner
Art Unit 2156
3/6/2026

Prosecution Timeline

Apr 15, 2025: Application Filed
Mar 07, 2026: Non-Final Rejection, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602616: SECURE MACHINE LEARNING MODEL TRAINING USING ENCRYPTION (granted Apr 14, 2026; 2y 5m to grant)
Patent 12591573: AUTOMATIC ERROR MITIGATION IN DATABASE STATEMENTS USING ALTERNATE PLANS (granted Mar 31, 2026; 2y 5m to grant)
Patent 12566784: PREDICTIVE QUERY COMPLETION AND PREDICTIVE SEARCH RESULTS (granted Mar 03, 2026; 2y 5m to grant)
Patent 12566788: Conversation Graphs (granted Mar 03, 2026; 2y 5m to grant)
Patent 12566738: Methods and Apparatus to Estimate Audience Sizes of Media Using Deduplication Based on Vector of Counts Sketch Data (granted Mar 03, 2026; 2y 5m to grant)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 64% (86% with interview; +22.5%)
Median Time to Grant: 3y 8m
PTA Risk: Low

Based on 755 resolved cases by this examiner. Grant probability is derived from the career allow rate.
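The headline projections follow directly from the examiner's career counts reported above; a quick sanity check (the rounding behavior is my assumption):

```python
# Reproduce the headline projections from the raw career counts
# (480 granted / 755 resolved, per the Examiner Intelligence section).
granted, resolved = 480, 755
allow_rate_pct = granted / resolved * 100      # 63.57...
interview_lift = 22.5                          # percentage points

print(round(allow_rate_pct))                   # 64  (Grant Probability)
print(round(allow_rate_pct + interview_lift))  # 86  (With Interview)
```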
