Prosecution Insights
Last updated: April 19, 2026
Application No. 18/083,073

LABELING PLATFORM DECLARATIVE MODEL

Non-Final OA: §101, §103
Filed: Dec 16, 2022
Examiner: KEATON, SHERROD L
Art Unit: 2148
Tech Center: 2100 — Computer Architecture & Software
Assignee: Alegion Inc.
OA Round: 1 (Non-Final)
Grant Probability: 52% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 4y 6m
With Interview: 88%

Examiner Intelligence

Career Allow Rate: 52% (grants 52% of resolved cases; 295 granted / 563 resolved; -2.6% vs TC avg)
Interview Lift: +36.1% for resolved cases with interview (strong)
Avg Prosecution: 4y 6m typical timeline; 32 applications currently pending
Career History: 595 total applications across all art units

Statute-Specific Performance

§101: 14.9% (-25.1% vs TC avg)
§103: 62.0% (+22.0% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 8.0% (-32.0% vs TC avg)
Tech Center averages are estimates • Based on career data from 563 resolved cases

Office Action

Rejections: §101, §103
DETAILED ACTION

This action is in response to the original filing of 12-16-2022. Claims 1-22 are pending and have been considered below.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-22 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1: Claims 1-22 represent method and medium type claims. Therefore claims 1-22 are directed to either a process, machine, manufacture or composition of matter.

Regarding claims 1 and 12:

2A Prong 1: interpreting the declarative model to implement the processing graph of labelers. As drafted, under the broadest reasonable interpretation, the claim covers mental processes (concepts performed in the human mind, including an observation, evaluation, judgment, or opinion; a user can interpret a model for labeling).

2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: Computer program product comprising a non-transitory, computer-readable medium storing thereon a set of computer-executable instructions, the set of computer-executable instructions comprising instructions (mere instructions to apply the exception using a generic computer component) storing a declarative model describing a processing graph of labelers for a use case at logical level, the declarative model defining a configuration for each labeler in the processing graph of labelers in a declarative language, wherein each labeler in the processing graph of labelers is a wrapper on executable code; and executing the processing graph of labelers to label a set of records.
(Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: Computer program product comprising a non-transitory, computer-readable medium storing thereon a set of computer-executable instructions, the set of computer-executable instructions comprising instructions (mere instructions to apply the exception using a generic computer component) storing a declarative model describing a processing graph of labelers for a use case at logical level, the declarative model defining a configuration for each labeler in the processing graph of labelers in a declarative language, wherein each labeler in the processing graph of labelers is a wrapper on executable code; and executing the processing graph of labelers to label a set of records. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 2 and 13: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the configuration for each labeler in the processing graph of labelers is specified as a collection of key-value pairs. 
(Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the configuration for each labeler in the processing graph of labelers is specified as a collection of key-value pairs. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 3 and 14: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the declarative model has a canonical structure and wherein interpreting the declarative model to implement the processing graph of labelers comprises interpreting names in the collection of key-value pairs, in context of the canonical structure of the declarative model to configure the processing graph of labelers. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. 
Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the declarative model has a canonical structure and wherein interpreting the declarative model to implement the processing graph of labelers comprises interpreting names in the collection of key-value pairs, in context of the canonical structure of the declarative model to configure the processing graph of labelers. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 4 and 15: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the declarative model includes configuration assumptions for the use case and a user-provided configuration for the use case. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the declarative model includes configuration assumptions for the use case and a user-provided configuration for the use case. 
(Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 5 and 16: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) receiving, for the use case, a selection of a use case template from plural templates, the use case template comprising the configuration assumptions for the use case; based on the use case template, allowing a user to input the user-provided configuration for the use case; and populating the declarative model with the configuration assumptions from the use case template and the user-provided configuration input by the user. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) receiving, for the use case, a selection of a use case template from plural templates, the use case template comprising the configuration assumptions for the use case; based on the use case template, allowing a user to input the user-provided configuration for the use case; and populating the declarative model with the configuration assumptions from the use case template and the user-provided configuration input by the user. 
(Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 6 and 17: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the configuration for each labeler in the processing graph of labelers includes a general labeler configuration and a labeler type-specific configuration. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the configuration for each labeler in the processing graph of labelers includes a general labeler configuration and a labeler type-specific configuration. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 7 and 18: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. 
Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the general labeler configuration for each labeler in the processing graph of labelers includes: a labeler name; a labeler type; a request pipe configuration for a request pipe of the labeler; a result pipe configuration for a result pipe of the labeler; and an exception pipe configuration for an exception pipe of the labeler. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the general labeler configuration for each labeler in the processing graph of labelers includes: a labeler name; a labeler type; a request pipe configuration for a request pipe of the labeler; a result pipe configuration for a result pipe of the labeler; and an exception pipe configuration for an exception pipe of the labeler. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 8 and 19: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the request pipe configuration includes a request pipe schema, wherein the result pipe configuration includes a result pipe schema, and wherein the exception pipe configuration includes an exception pipe schema. 
(Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the request pipe configuration includes a request pipe schema, wherein the result pipe configuration includes a result pipe schema, and wherein the exception pipe configuration includes an exception pipe schema. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 9 and 20: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) configuration for at least one labeler in the processing graph of labelers defines a conditioning pipeline for at least one of: the request pipe, the result pipe, or the exception pipe of the labeler. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. 
Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) configuration for at least one labeler in the processing graph of labelers defines a conditioning pipeline for at least one of: the request pipe, the result pipe, or the exception pipe of the labeler. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 10 and 21: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the declarative model defines a machine learning (ML) labeler configuration and a human labeler configuration. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the declarative model defines a machine learning (ML) labeler configuration and a human labeler configuration. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Regarding claims 11 and 22: 2A Prong 1: No abstract ideas 2A Prong 2: This judicial exception is not integrated into a practical application. 
Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the ML labeler configuration defines a label space for the ML labeler and comprises one or more of: a training pipe declaration for a training pipe of the ML labeler, an input conditioning declaration, an output conditioning declaration, a target conditioning declaration, a target de-conditioning declaration, a machine learning (ML) algorithm declaration, or a training configuration declaration. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: Computer program product (mere instructions to apply the exception using a generic computer component) wherein the ML labeler configuration defines a label space for the ML labeler and comprises one or more of: a training pipe declaration for a training pipe of the ML labeler, an input conditioning declaration, an output conditioning declaration, a target conditioning declaration, a target de-conditioning declaration, a machine learning (ML) algorithm declaration, or a training configuration declaration. (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)) Claim Rejections - 35 USC § 103 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 
102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 4-6, 11-13, 15-17 and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Edwards et al. (“Edwards” 10916241 B1) in view of “Wrapper methods to correct mislabeled training data” Young et al. © 2013 pages 170-173 (“Young”).

Claim 1: Edwards discloses a computer-implemented method for configuring a labeling platform, the method comprising: storing a declarative model describing a processing graph of labelers for a use case at logical level (Column 10, Lines 27-54; deterministic mapping with stored parameters), the declarative model defining a configuration for each labeler in the processing graph of labelers in a declarative language (Column 10, Lines 27-54; deterministic mapping); interpreting the declarative model to implement the processing graph of labelers; and executing the processing graph of labelers to label a set of records (Column 9, Line 44-Column 10, Line 8 and Column 10, Lines 27-54; deterministic model provides processing of theme to apply label).
Edwards may not explicitly disclose wherein each labeler in the processing graph of labelers is a wrapper on executable code; Young is disclosed because it provides a functionality where wrapper methods are utilized for labeling data (abstract). Additionally, each wrapper method utilizes classification algorithms for performing labeling of data (Page 170, Column 2, Paragraph 2, investigation of wrapper methods and Page 171, Column 3; wrapper methods for labeling functionality). Therefore it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize a known technique to improve a similar device and provide wrapper methods for the labeling found in Edwards. One would have been motivated to provide the functionality to allow different methods for labeling of data, offering a more robust and comprehensive system.

Claim 2: Edwards and Young disclose a computer-implemented method of claim 1, wherein the configuration for each labeler in the processing graph of labelers is specified as a collection of key-value pairs (Edwards: Column 10, Lines 27-54 and Column 23, Lines 1-14; deterministic mapping and/or key-value pairs).

Claim 4: Edwards and Young disclose a computer-implemented method of claim 1, wherein the declarative model includes configuration assumptions for the use case and a user-provided configuration for the use case (Edwards: Column 8, Line 59-Column 9, Line 20; system adjusted configurations, also user indicated/provided configurations).
Claim 5: Edwards and Young disclose a computer-implemented method of claim 4, further comprising: receiving, for the use case, a selection of a use case template from plural templates, the use case template comprising the configuration assumptions for the use case; based on the use case template, allowing a user to input the user-provided configuration for the use case; and populating the declarative model with the configuration assumptions from the use case template and the user-provided configuration input by the user (Edwards: Column 6, Line 50-Column 7, Line 7; theme provides “template” for deterministic mapping, Column 10, Lines 27-54). The system allows the user to provide feedback regarding the theme (configuration), which would be considered when utilizing the mapping.

Claim 6: Edwards and Young disclose a computer-implemented method of claim 1, wherein the configuration for each labeler in the processing graph of labelers includes a general labeler configuration and a labeler type-specific configuration (Edwards: Column 7, Lines 21-35 and Column 19, Lines 30-35; general and specific training for theme and labeling).

Claim 11: Edwards and Young disclose a computer-implemented method of claim 10, wherein the ML labeler configuration defines a label space for the ML labeler and comprises one or more of: a training pipe declaration for a training pipe of the ML labeler (Edwards: Column 6, Lines 20-50), an input conditioning declaration (Edwards: Column 6, Lines 20-50; input encoded for conditioning), an output conditioning declaration, a target conditioning declaration, a target de-conditioning declaration, a machine learning (ML) algorithm declaration, or a training configuration declaration (Edwards: Column 19, Line 30-Column 20, Line 15; encoded data for training condition).

Claim 12 is similar in scope to claim 1 and therefore rejected under the same rationale.

Claim 13 is similar in scope to claim 2 and therefore rejected under the same rationale.
Claim 15 is similar in scope to claim 4 and therefore rejected under the same rationale. Claim 16 is similar in scope to claim 5 and therefore rejected under the same rationale. Claim 17 is similar in scope to claim 6 and therefore rejected under the same rationale. Claim 22 is similar in scope to claim 11 and therefore rejected under the same rationale.

Claims 3 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Edwards et al. (“Edwards” 10916241 B1) and “Wrapper methods to correct mislabeled training data” Young et al. © 2013 pages 170-173 (“Young”) in further view of Borthakur et al. (“Borthakur” 11327962 B1).

Claim 3: Edwards and Young disclose a computer-implemented method of claim 2, wherein the declarative model has a canonical structure and wherein interpreting the declarative model to implement the processing graph of labelers comprises interpreting names in the collection of key-value pairs, in context of the canonical structure of the declarative model to configure the processing graph of labelers (Edwards: Column 19, Lines 5-22, key-value pair for labeling). However, to capture the canonical structure of the key-value pair, Borthakur is disclosed. Borthakur also provides a functionality where a document/data is converted and the transformation is to a canonical format using key-value pairs (Column 11, Lines 12-36). This key-value pair format could be incorporated with the mapping key-value pair of Edwards. Therefore it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize a known technique to improve a similar device and provide canonical formatting for the key-value pair labeling found in Edwards. One would have been motivated to provide the functionality as a way to simplify operations when indexing data (Borthakur: Column 11, Lines 33-35).

Claim 14 is similar in scope to claim 3 and therefore rejected under the same rationale.
Claims 7-9 and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Edwards et al. (“Edwards” 10916241 B1) and “Wrapper methods to correct mislabeled training data” Young et al. © 2013 pages 170-173 (“Young”) in further view of Hsu et al. (“Hsu” 20190205794 A1).

Claim 7: Edwards and Young disclose a computer-implemented method of claim 6, wherein the general labeler configuration for each labeler in the processing graph of labelers includes: a labeler name; a labeler type (Column 9, Line 45-Column 10, Line 10; theme and sub-set of labels provide name and type, i.e. shopping); a request pipe configuration for a request pipe of the labeler; a result pipe configuration for a result pipe of the labeler (Column 12, Line 49-Column 13, Line 22 and Column 21, Line 50-Column 22, Line 8; provides rules for mapping/labeling objects for output). Edwards may not explicitly disclose an exception pipe configuration for an exception pipe of the labeler. Hsu is incorporated to disclose a labeling functionality for documents/data (abstract); further, the system determines if a data record cannot be attributed to a cluster (labeling) (Paragraph 89) and provides a correction unit (exception pipe), which determines a method to address the error (Paragraphs 28, 89 and 95-98). Therefore it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize a known technique to improve a similar device and provide an exception/correction method for the label functionality found in Edwards. One would have been motivated to provide the functionality as a way to ensure accurate labels for improved performance of a model (Hsu: Paragraph 28).
Claim 8: Edwards, Young and Hsu disclose a computer-implemented method of claim 7, wherein the request pipe configuration includes a request pipe schema, wherein the result pipe configuration includes a result pipe schema (Edwards: Column 12, Line 49-Column 13, Line 22; provides rules/configuration of parameters for processing input and providing output (results)), and wherein the exception pipe configuration includes an exception pipe schema (Hsu: Paragraphs 95-98; instructions and processes for implementing the correction).

Claim 9: Edwards, Young and Hsu disclose a computer-implemented method of claim 7, wherein the configuration for at least one labeler in the processing graph of labelers defines a conditioning pipeline for at least one of: the request pipe, the result pipe, or the exception pipe of the labeler (Edwards: Column 19, Line 30-Column 20, Line 15) and (Hsu: Paragraphs 95-98; instructions and processes for implementing the correction).

Claim 18 is similar in scope to claim 7 and therefore rejected under the same rationale. Claim 19 is similar in scope to claim 8 and therefore rejected under the same rationale. Claim 20 is similar in scope to claim 9 and therefore rejected under the same rationale.

Claims 10 and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Edwards et al. (“Edwards” 10916241 B1) and “Wrapper methods to correct mislabeled training data” Young et al. © 2013 pages 170-173 (“Young”) in further view of Oshinaike et al. (“Oshinaike” 20220044298 A1).

Claim 10: Edwards and Young disclose a computer-implemented method of claim 1, wherein the declarative model defines a machine learning (ML) labeler configuration (Edwards: Column 10, Lines 27-54); however, they may not explicitly disclose a human labeler configuration. Oshinaike is incorporated to disclose a labeling functionality; further, the system provides a human review pipeline (human labeler configuration).
The system determines a human annotation configuration is to be utilized based on the system pipeline (Paragraphs 36-37). Therefore it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize a known technique to improve a similar device and provide the human label pipeline configuration with the labeling capabilities found in Edwards. One would have been motivated to provide the functionality as a way to trigger actions for error-prone data, thereby ensuring accurate labels for improved performance of a model.

Claim 21 is similar in scope to claim 10 and therefore rejected under the same rationale.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: 20040013304 A1 VIOLA ET AL., Figure 1. Applicant is required under 37 C.F.R. § 1.111(c) to consider these references fully when responding to this action. It is noted that any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 U.S.P.Q. 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 U.S.P.Q. 275, 277 (C.C.P.A. 1968)).

In the interests of compact prosecution, Applicant is invited to contact the examiner via electronic media pursuant to USPTO policy outlined in MPEP § 502.03. All electronic communication must be authorized in writing. Applicant may wish to file an Internet Communications Authorization Form PTO/SB/439. Applicant may wish to request an interview using the Interview Practice website: http://www.uspto.gov/patent/laws-and-regulations/interview-practice.
Applicant is reminded Internet e-mail may not be used for communication for matters under 35 U.S.C. § 132 or which otherwise require a signature. A reply to an Office action may NOT be communicated by Applicant to the USPTO via Internet e-mail. If such a reply is submitted by Applicant via Internet e-mail, a paper copy will be placed in the appropriate patent application file with an indication that the reply is NOT ENTERED. See MPEP § 502.03(II).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHERROD KEATON whose telephone number is 571-270-1697. The examiner can normally be reached 9:30am to 5:00pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, Applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor MICHELLE BECHTOLD can be reached at 571-431-0762. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SHERROD L KEATON/ Primary Examiner, Art Unit 2148 11-2-2025
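For readers less familiar with the claim language at issue, the following toy sketch illustrates the kind of declarative labeler-graph model the claims describe: a configuration of key-value pairs per labeler (name, type, request/result/exception pipe schemas) that is interpreted into executable wrappers. All names, schemas, and the graph wiring here are illustrative assumptions, not taken from the application's actual specification.

```python
# Hypothetical declarative model: key-value configuration per labeler.
# Labeler names ("ml_classifier", "human_review"), pipe schemas, and the
# edge wiring are invented for illustration only.
declarative_model = {
    "use_case": "document_classification",  # assumed use case
    "labelers": [
        {
            "name": "ml_classifier",  # general labeler configuration
            "type": "ml",             # labeler type
            "request_pipe": {"schema": {"text": "string"}},
            "result_pipe": {"schema": {"label": "string", "confidence": "float"}},
            "exception_pipe": {"schema": {"error": "string"}},
        },
        {
            "name": "human_review",
            "type": "human",
            "request_pipe": {"schema": {"text": "string", "ml_label": "string"}},
            "result_pipe": {"schema": {"label": "string"}},
            "exception_pipe": {"schema": {"error": "string"}},
        },
    ],
    # Processing-graph wiring: ML labeler feeds the human labeler.
    "edges": [("ml_classifier", "human_review")],
}

def interpret(model):
    """Interpret the declarative model into an executable graph (toy version).

    Each labeler config becomes a wrapper around a callable stub; a real
    platform would dispatch on the labeler type to actual executable code.
    """
    def make_wrapper(cfg):
        def run(record):
            # Stub: tag the record with the labeler that processed it.
            return {**record, "labeled_by": cfg["name"]}
        return {"config": cfg, "run": run}
    return {l["name"]: make_wrapper(l) for l in model["labelers"]}

graph = interpret(declarative_model)
out = graph["ml_classifier"]["run"]({"text": "hello"})
```

Executing the graph over a set of records would then amount to walking the declared edges and invoking each wrapper's `run` in order; the point of the declarative form is that the same interpreter can realize any such configuration.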

Prosecution Timeline

Dec 16, 2022
Application Filed
Nov 08, 2025
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12566823: SYSTEMS AND METHODS FOR INTERPOLATIVE CENTROID CONTRASTIVE LEARNING
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12547820: Automated Generation Of Commentator-Specific Scripts
Granted Feb 10, 2026 (2y 5m to grant)
Patent 12530587: SYSTEMS AND METHODS FOR CONTRASTIVE LEARNING WITH SELF-LABELING REFINEMENT
Granted Jan 20, 2026 (2y 5m to grant)
Patent 12524147: Modality Learning on Mobile Devices
Granted Jan 13, 2026 (2y 5m to grant)
Patent 12524603: METHODS FOR RECOGNIZING AND INTERPRETING GRAPHIC ELEMENTS
Granted Jan 13, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 52%
With Interview: 88% (+36.1%)
Median Time to Grant: 4y 6m
PTA Risk: Low
Based on 563 resolved cases by this examiner. Grant probability derived from career allow rate.
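The headline figures above appear to follow from the examiner's career statistics: 295 grants out of 563 resolved cases gives the 52% allow rate, and adding the +36.1% interview lift yields the 88% with-interview figure. Whether the lift is actually applied additively is an assumption inferred from the displayed numbers, not the vendor's stated formula; this sketch just checks the arithmetic.

```python
# Reconstructing the dashboard's headline numbers from its stated inputs.
# The additive treatment of the interview lift is an assumption.
granted, resolved = 295, 563
career_allow_rate = granted / resolved        # ~0.524, shown as 52%
interview_lift = 0.361                        # +36.1% shown for interviews
with_interview = career_allow_rate + interview_lift  # ~0.885, shown as 88%
```

If the lift were instead multiplicative (52% x 1.361 ~ 71%), the displayed 88% would not be reproduced, which is why the additive reading is assumed here.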
