Prosecution Insights
Last updated: April 19, 2026
Application No. 17/891,796

INFORMATION PROCESSING APPARATUS, NON-TRANSITORY COMPUTER READABLE MEDIUM, AND METHOD FOR DEFINING AN ARITHMETIC EXPRESSION USING FEATURE AND ARRANGEMENT INFORMATION

Non-Final OA: §101, §103
Filed
Aug 19, 2022
Examiner
MERCADO VARGAS, ARIEL
Art Unit
2118
Tech Center
2100 — Computer Architecture & Software
Assignee
Fujifilm Business Innovation Corp.
OA Round
3 (Non-Final)
Grant Probability: 71% (Favorable)
OA Rounds: 3-4
To Grant: 3y 6m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 71% (322 granted / 454 resolved), above average (+15.9% vs TC avg)
Interview Lift: +30.2% among resolved cases with interview
Typical Timeline: 3y 6m average prosecution; 23 applications currently pending
Career History: 477 total applications across all art units
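The headline figures above are internally consistent and can be reproduced from the raw counts; a quick sanity check (the simple-rounding convention is an assumption):

```python
# Reproduce the dashboard's headline allow rate from the raw counts above.
granted, resolved = 322, 454
allow_rate = granted / resolved * 100   # 70.9..., displayed as 71%
print(round(allow_rate))                # 71

# The "+15.9% vs TC avg" delta implies a Tech Center average near 55%.
implied_tc_avg = allow_rate - 15.9
print(round(implied_tc_avg))            # 55
```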

Statute-Specific Performance

§101: 12.9% (-27.1% vs TC avg)
§103: 46.9% (+6.9% vs TC avg)
§102: 14.4% (-25.6% vs TC avg)
§112: 16.1% (-23.9% vs TC avg)
Tech Center averages are estimates • Based on career data from 454 resolved cases

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/05/2026 has been entered.

Applicant's Response

In Applicant's response dated 01/05/2026, Applicant amended Claims 1, 3, 4, 6, 10, 19 and 20; canceled Claims 7, 9, 12, 13, 15, 16 and 18; and argued against all objections and rejections previously set forth in the Office Action dated 10/14/2025. Accordingly, Claims 1, 3, 4, 6, 10, 19 and 20 remain pending for examination. In light of Applicant's amendments and remarks, the previously set forth objection to the claims is withdrawn, and the previously set forth rejection under 35 U.S.C. 102 is withdrawn.

Status of the Claims

Claims 1, 3, 4, 6, 10, 19 and 20 are rejected under 35 U.S.C. 101 and under 35 U.S.C. 103.

Examiner Note

The Examiner cites particular columns, line numbers and/or paragraph numbers in the references as applied to the claims below for the convenience of the Applicant(s). Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well.
It is respectfully requested that, in preparing responses, the Applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1, 3, 4, 6, 10, 19 and 20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Claim 1 recites an information processing apparatus, thus a machine, one of the four statutory categories of patentable subject matter. However, Claim 1 further recites [a memory; and a processor] configured to:

identify two set strings each comprising a unit and numerical information, located in adjacent columns within a same row of a table included in the image data (which is an observation, evaluation or judgment that can be performed in the human mind or with the help of pen and paper, thus falling within the mental process grouping of abstract ideas);

determine an operator corresponding to the identified two set strings based on information, [which is stored in the memory], that associates a relationship between the two set strings with operators (which is an evaluation or judgment that can be performed in the human mind or with the help of pen and paper, thus falling within the mental process grouping of abstract ideas); and

provide an arithmetic expression using the operator applicable to the two set strings in the same row of the table (which is an evaluation or judgment that can be performed in the human mind or with the help of pen and paper, thus falling within the mental process grouping of abstract ideas).
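As a concrete illustration of the three recited steps, they can be sketched roughly as follows. This is a hypothetical sketch only: the table layout, the operator rules, and every name below are invented for illustration and are not taken from the application or the cited art.

```python
# Hypothetical sketch of the three claimed steps: identify two set strings
# (each a unit plus numerical information) in adjacent columns of a row,
# look up an operator for that pair of units, and emit an expression.
# OPERATOR_RULES and all names here are invented for illustration.
OPERATOR_RULES = {
    # (unit of left cell, unit of right cell) -> operator relating them
    ("pcs", "yen"): "*",   # e.g. quantity times unit price
    ("yen", "yen"): "+",
}

def parse_set_string(cell):
    """Split a set string like '3 pcs' into (numerical information, unit)."""
    value, unit = cell.split()
    return float(value), unit

def expression_for_row(row):
    """Identify two set strings in adjacent columns of one row and
    provide an arithmetic expression using the looked-up operator."""
    (a, unit_a), (b, unit_b) = parse_set_string(row[0]), parse_set_string(row[1])
    op = OPERATOR_RULES.get((unit_a, unit_b))
    return f"{a:g} {op} {b:g}" if op else None

print(expression_for_row(["3 pcs", "120 yen"]))   # 3 * 120
print(expression_for_row(["100 yen", "20 yen"]))  # 100 + 20
```

Whether steps of this shape are a mental process or a practical application is exactly the dispute in this §101 rejection; the sketch is only meant to make the claimed data flow concrete.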
Therefore, Claim 1 recites an abstract idea.

The claim does not include any additional elements which integrate the abstract idea into a practical application. The additional element "a memory; and a processor configured to" perform the mental process is no more than an instruction to "apply it" on a computer (see MPEP 2106.05(f)). The step of obtaining image data on which optical character recognition (OCR) processing is performed is recited at a high level of generality, amounts to linking the abstract idea to a particular technological environment (see MPEP 2106.05(h)), and is an insignificant extra-solution activity (see MPEP 2106.05(g)). Thus, the claim is directed to the abstract idea.

Further, the additional elements, alone or in combination, do not provide significantly more than the abstract idea itself, because implementation on a computer (MPEP 2106.05(f)) cannot provide significantly more, and the combination of additional elements does not provide an inventive concept. Furthermore, nothing in the claim recites more than routine optical character recognition, which is a well-understood, routine and conventional activity (see MPEP 2106.05(d)(II)(v): "Electronically scanning or extracting data from a physical document, Content Extraction and Transmission, LLC v. Wells Fargo Bank, 776 F.3d 1343, 1348, 113 USPQ2d 1354, 1358 (Fed. Cir. 2014) (optical character recognition)"). Thus, the claim is ineligible.

Claim 3 recites further embellishment of the abstract idea discussed at Claim 1 with the addition of identifying a particular type of data. This amounts to further observations, evaluations, judgments and opinions, which are abstract ideas.

Claims 4 and 6 recite further embellishment of the abstract idea discussed at Claim 1 with the addition of referencing a table to identify a corresponding operator.
This amounts to further observations, evaluations, judgments and opinions, which are abstract ideas.

Claim 10 recites further embellishment of the abstract idea discussed at Claim 1 with the addition of determining particular operators associated with an arithmetic operation. This amounts to further observations, evaluations, judgments and opinions, which are abstract ideas. Furthermore, the claim recites providing an operation result associated with the arithmetic operation, which amounts to mere data gathering (see MPEP 2106.05(g)(3)) using a generic computer component (MPEP 2106.05(b)).

Claim 19 recites a non-transitory computer readable medium storing a program causing a computer to execute precisely the instructions recited in Claim 1. As performance on a computer cannot integrate an abstract idea into a practical application nor provide significantly more than the abstract idea itself (MPEP 2106.05(f)), Claim 19 is rejected as subject-matter ineligible for the reasons set forth in the rejection of Claim 1 above.

Claim 20 recites a method for processing information, thus a process, one of the four statutory categories of patentable subject matter. The method performs precisely the instructions of Claim 1. As performance on a computer cannot integrate an abstract idea into a practical application nor provide significantly more than the abstract idea itself (see MPEP 2106.05(f)), Claim 20 is rejected as subject-matter ineligible for the reasons set forth in the rejection of Claim 1.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3, 4, 6, 10, 19 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Ichikawa et al. (hereinafter, Ichikawa) (cited in IDS dated 07/27/2023) in view of Zhang et al. (US 2016/0055374) (hereinafter, Zhang).

Regarding Claim 1, Ichikawa teaches an information processing apparatus (see Ichikawa's Abstract) comprising:

a memory (Ichikawa in par 0033 teaches a memory unit 25); and

a processor (Ichikawa in par 0035 and Fig. 2 teaches that the image processing unit 22 has a character recognizing unit 31) configured to:

obtain image data on which optical character recognition (OCR) processing is performed (Ichikawa in par 0005 teaches that optical character recognition (OCR) technology has been known for a while, in which documents are read by scanners, strings of characters are recognized from the scanned images, table layout formats are recognized, etc. Ichikawa in par 0036 and Fig. 2 further teaches that the character recognizing unit 31 takes out character images one by one from the scanned image, recognizes each character image, and converts them into character code data. The character recognizing unit 31 identifies the character images included in the text area and the table area as the objects of character recognition. The recognizable characters include all the characters, numbers and symbols to which character code data are assigned, as well as external characters the user might have registered based on the user's own patterning actions. Ichikawa in par 0043 - 0045 and Fig. 3 further teaches that table 41 in Fig. 3 is a table recognized by the OCR process);

identify two set strings [each comprising a unit and numerical information], located in adjacent columns within a same row of a table included in the image data (Ichikawa in par 0036 teaches that the recognizable characters include all the characters, numbers and symbols to which character code data are assigned. Ichikawa in par 0038 - 0040 further teaches that the extracting unit 33 extracts table areas by recognizing ruled lines of the table areas, as well as each rectangular cell that constitutes each table. The table operational unit 34 converts character strings consisting of numbers from the character strings recognized in the table areas into numerical data. The table operational unit 34 defines operational expressions based on specific character strings from the recognized character strings. The table operational unit 34 further executes specified operations based on the defined operational expressions and the converted numerical data. The verifying unit 35 compares the calculated value processed by the table operational unit 34 with the numerical data. Ichikawa in par 0043 - 0045 and Fig. 3 further teaches that table 41 in Fig. 3 is a table recognized by the OCR process. In the cell [A7], a character string "total" that specifies the kind of operation is recognized.
Let us call the character strings that specify the kinds of operations "specific character strings". Ichikawa in par 0047 - 0048 further teaches that table 43 shown in FIG. 5 is a table recognized by the OCR process and has a table structure of 3 rows x 6 columns. Character strings, "electricity" and "gas," that represent the row headings are recognized in the second and third rows of the first column. A specific character string "total" is recognized in the cell [F1]. The memory 25 stores multiple table formats 51, one of which is shown as an example in FIG. 6. The table format contains a table structure and specific character strings. Specific character strings contain such characters as "average," "total," "sum," or "subtotal.");

determine an operator corresponding to the identified two set strings based on information, which is stored in the memory, that associates a relationship between the two set strings with operators (Ichikawa in par 0038 - 0040 further teaches that a table operational unit 34 defines operational expressions based on specific character strings from the recognized character strings. The table operational unit 34 further executes specified operations based on the defined operational expressions and the converted numerical data. Ichikawa in par 0055 and Fig. 9 further teaches that the memory unit 25 stores the second operational table 54. The second operational table 54 describes specific character strings as well as operational expressions and arithmetic operators (generically called operational expressions). Specific character strings contain such characters or character strings as "average," "total," "sum," or "subtotal" as mentioned before. An operational expression "+" for addition is described for the specific character strings "total" or "subtotal." Ichikawa in par 0103 - 0105 further teaches that the character strings "total" appearing in the tables 42b and 43 shown in FIGS. 4B and 5 match with the character strings stored in the second operational table 54. The operational expression "+" for addition is defined in correspondence with the specific character string "total." Next, the positions in the table where the specific character strings exist are identified by the table operational unit 34 (S33, S35). In the case of the table 42b, the specific character string "total" is identified to exist in the lowest row of the first column (cell [A11]). In the case of the table 43, the specific character string "total" is identified to exist in column F of the first row (cell [F1]). If a specific character string exists in the first column, i.e., among the row labels (S33:Yes), the table operational unit 34 calculates in the column direction according to the defined operational expressions (S34). In other words, the calculation of 2300+200+350+780+1500+240+980+480+10000 for column B); and

provide an arithmetic expression using the operator applicable to the two set strings in the same row of the table (Ichikawa in par 0103 - 0105 further teaches that the character strings "total" appearing in the tables 42b and 43 shown in FIGS. 4B and 5 match with the character strings stored in the second operational table 54. The operational expression "+" for addition is defined in correspondence with the specific character string "total." Next, the positions in the table where the specific character strings exist are identified by the table operational unit 34 (S33, S35). In the case of the table 42b, the specific character string "total" is identified to exist in the lowest row of the first column (cell [A11]). In the case of the table 43, the specific character string "total" is identified to exist in column F of the first row (cell [F1]). If a specific character string exists in the first column, i.e., among the row labels (S33:Yes), the table operational unit 34 calculates in the column direction according to the defined operational expressions (S34). In other words, the calculation of 2300+200+350+780+1500+240+980+480+10000 for column B).

Ichikawa discloses that the OCR process identifies all characters including numbers and symbols; however, it does not specifically disclose [each comprising a unit and numerical information].

Zhang teaches that the data type may indicate one or more attributes of the arranged data such as a format, font, date, language or currency (see Zhang's Abstract). Zhang in par 0065 - 0066 and Fig. 4 further teaches that other data types may be identified and associated with the characters or groups of characters depicted in the first arrangement 202. For instance, since all of the numbers aligned in the column under the column header "POP" are integers, the numbers, such as "5,214,000," may be labeled as an "integer" data type. Other data types may be identified by using techniques similar to those described above. For instance, if a group of characters included a number with a fraction or decimal value, other data types, such as a floating point, may be identified. In other examples, if currency symbols are used, different data types may be identified for characters or groups of characters associated with such respective symbols. Zhang in par 0076 - 0077 further teaches that, in addition, given the text and symbols of the column header, the numbers positioned below the "GDP" heading may be associated with a data type that is characterized and/or labeled as a "currency" or "US currency in the millions." In such an example, the meaning of "GDP" or other keywords may indicate a data type for characters that are aligned with, or associated with, such keywords. Contextual data may be obtained or interpreted to determine one or more data types.
For example, the program module 111 may interpret a combination of the words in an arrangement, such as "COUNTRY," "GDP" and "EXPORT." The meaning of the words, whether they are used together or individually, may allow the program module 111 to determine a data type indicating, for example, a currency in units of "millions." Zhang in par 0087 and Fig. 6B further teaches that, given the pattern of numbers in Column C and the association between the data and the header "BIRTH RATE," the program module 111 may automatically generate a formula or equation to calculate a sum, average or other useful value. Row 9 of the table 601B shown in FIG. 6B illustrates one such example where averages for the various "integer" data types are calculated.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to utilize the teachings of Zhang with the teachings of Ichikawa to include symbols within the table of Ichikawa as disclosed in Zhang. The motivation for doing so would have been to properly identify contextual information associated with the characters, such as the particular currency associated with the particular numbers (see Zhang's par 0077).

Regarding Claim 3, Ichikawa in view of Zhang teaches the limitations contained in parent Claim 1. Ichikawa further teaches: wherein the unit indicates at least quantity, amount of money, weight, or length (Ichikawa in par 0046 further teaches that a table 42b shown in FIG. 4B is a table recognized by OCR processing a table 42a shown in FIG. 4A and has a table structure of 11 rows x 2 columns. Character strings, "electric train," "bus," . . . "lodging expense," that represent the row headings are recognized in the second through 10th rows of the first column. A specific character string "total" is recognized in the cell [A11]. A character string "traveling expenses" that represents the column heading is recognized in column B of the first row).
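The lookup the examiner cites from Ichikawa (par 0055, Fig. 9: the "second operational table 54" mapping specific character strings to operational expressions, then summing a column when "total" is found among the row labels) can be sketched minimally as follows. Variable names are illustrative, not from the reference, and one garbled number in the OA's quoted calculation is read here as 980.

```python
# Sketch of the lookup described in the citations to Ichikawa par 0055:
# a stored table maps "specific character strings" to operational
# expressions, which then drive the calculation over the table's cells.
SECOND_OPERATIONAL_TABLE = {
    "total": "+",
    "subtotal": "+",
    "average": "total/N",   # divide the total by the number of items N
}

def define_operation(specific_string):
    """Return the operational expression for a specific character string."""
    return SECOND_OPERATIONAL_TABLE.get(specific_string.lower())

# Column-B values of table 42b as listed in the OA (one garbled entry
# read here as 980); "total" in cell [A11] selects addition.
column_b = [2300, 200, 350, 780, 1500, 240, 980, 480, 10000]
if define_operation("Total") == "+":
    print(sum(column_b))   # 16830
```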
Regarding Claim 4, Ichikawa in view of Zhang teaches the limitations contained in parent Claim 3. Ichikawa further teaches: wherein the processor is configured to use a prediction table to determine the operator (Ichikawa in par 0039 further teaches that the table operational unit 34 converts character strings consisting of numbers from the character strings recognized in the table areas into numerical data. The table operational unit 34 defines operational expressions based on specific character strings from the recognized character strings. The table operational unit 34 further executes specified operations based on the defined operational expressions and the converted numerical data), and wherein, in the prediction table, the operation between a left-side term and a right-side term is specified in advance as a predicter operator (Ichikawa in par 0055 further teaches that the memory unit 25 stores the second operational table 54, an example of which is shown in FIG. 9. The second operational table 54 describes specific character strings as well as operational expressions and arithmetic operators (generically called operational expressions). Specific character strings contain such characters or character strings as "average," "total," "sum," or "subtotal" as mentioned before. An operational expression "+" for addition is described for the specific character strings "total" or "subtotal." An operational expression "total/N" for dividing the total by the number of items N is described for the specific character string "average.").

Regarding Claim 6, Ichikawa in view of Zhang teaches the limitations contained in parent Claim 4.
Zhang further teaches: wherein the processor is configured to identify a combination of two units of the two strings, and to determine the predicter operator by referring to the combination in the prediction table (Zhang in par 0076 - 0077 further teaches that, in addition, given the text and symbols of the column header, the numbers positioned below the "GDP" heading may be associated with a data type that is characterized and/or labeled as a "currency" or "US currency in the millions." In such an example, the meaning of "GDP" or other keywords may indicate a data type for characters that are aligned with, or associated with, such keywords. Contextual data may be obtained or interpreted to determine one or more data types. For example, the program module 111 may interpret a combination of the words in an arrangement, such as "COUNTRY," "GDP" and "EXPORT." The meaning of the words, whether they are used together or individually, may allow the program module 111 to determine a data type indicating, for example, a currency in units of "millions." Zhang in par 0087 and Fig. 6B further teaches that, given the pattern of numbers in Column C and the association between the data and the header "BIRTH RATE," the program module 111 may automatically generate a formula or equation to calculate a sum, average or other useful value. Row 9 of the table 601B shown in FIG. 6B illustrates one such example where averages for the various "integer" data types are calculated).

Regarding Claim 10, Ichikawa in view of Zhang teaches the limitations contained in parent Claim 4. Ichikawa further teaches: wherein the processor is configured to estimate, on a basis of the feature information and the arrangement information, operators between a plurality of pieces of the numerical information included in the plurality of pieces of character string information (Ichikawa in par 0055 and Fig. 9 further teaches that the memory unit 25 stores the second operational table 54. The second operational table 54 describes specific character strings as well as operational expressions and arithmetic operators (generically called operational expressions). Specific character strings contain such characters or character strings as "average," "total," "sum," or "subtotal" as mentioned before. An operational expression "+" for addition is described for the specific character strings "total" or "subtotal." An operational expression "total/N" for dividing the total by the number of items N is described for the specific character string "average"), and wherein the processor is configured to define, using the estimated operators, the arithmetic expression including an operation term in which one of the operators is used between some of the plurality of pieces of numerical information included in the plurality of pieces of character string information and, as an operation result, numerical information included in one of the plurality of pieces of character string information other than the some of the plurality of pieces of character string information corresponding to the some of the plurality of pieces of numerical information (Ichikawa in par 0062 - 0063 further teaches that the table operational unit 34 retrieves operational expressions that match with the selected table format from the first operational table 52 and defines them. The table operational unit 34 calculates the numerical data based on the defined operational expressions. Furthermore, the table operational unit 34 calls verification expressions that correspond with the selected table format from the verification table 53 and defines them. The table operational unit 34 identifies the positions where the specific character strings exist in the table. The table operational unit 34 calculates numerical data based on the defined operational expressions in the directions corresponding to the positions of the specific character strings in the table).
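The Claim 10 mapping pairs the defined operation with an operation result, and Ichikawa's verifying unit 35 (par 0040) compares the calculated value against the numerical data recognized in the table. A minimal sketch of that verification flow, with illustrative values and names (not taken from the reference):

```python
# Sketch of the verification flow in the Claim 10 mapping: compute the
# defined operation ("+") over the item cells, then compare the
# calculated value with the numerical data recognized in the "total"
# cell. Values and names are illustrative only.
def verify_total(item_values, stated_total):
    calculated = sum(item_values)   # defined operational expression: "+"
    return calculated == stated_total, calculated

print(verify_total([2300, 200, 350], 2850))  # (True, 2850)
print(verify_total([2300, 200, 350], 2800))  # (False, 2850) — OCR or data error
```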
Regarding Claim 19, this Claim merely recites a non-transitory computer readable medium storing a program causing a computer to execute a process for processing information as similarly recited in Claim 1. Accordingly, Ichikawa in view of Zhang teaches every limitation of Claim 19, as indicated in the above rejection of Claim 1.

Regarding Claim 20, this Claim merely recites a method for processing information as similarly recited in Claim 1. Accordingly, Ichikawa in view of Zhang teaches every limitation of Claim 20, as indicated in the above rejection of Claim 1.

Response to Arguments

Applicant's arguments filed 01/05/2026 with respect to 35 U.S.C. 101 have been fully considered but they are not persuasive. Applicant's arguments filed 01/05/2026 with respect to 35 U.S.C. 102 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of Zhang et al.

Regarding 35 U.S.C. 101: (1) Applicant argues that the claims have been amended and the features that allegedly recite an abstract idea are no longer present. The newly added limitations neither recite abstract ideas, nor are the independent claims directed to any such abstract ideas. None of the claimed features, which are complex features performed by a specifically programmed computer, could reasonably be done in the human mind. The Office Action alleged that a generic computer could read tables by OCR, thus implying implementation of something that could be done in the human mind, but no such generic computer could reasonably perform the complex steps claimed in Claim 1. Even if it is considered that some steps recite an abstract idea, the claims are not directed to any such abstract ideas, as the claims allow, subsequent to complex calculations, providing an arithmetic expression using the operator applicable to the two set strings in the same row of the table.
(See Applicant's remarks dated 01/05/2026, pages 1 - 2.)

The examiner respectfully disagrees. First, the claim recites the OCR at a high level of generality, without providing any specific or additional details regarding the OCR that differentiate it from well-understood OCR in the technological environment. OCR claimed at a high level of generality merely links the abstract idea to a particular technological environment; furthermore, it is an insignificant extra-solution activity. Accordingly, the additional elements do not integrate the abstract idea into a practical application. Furthermore, nothing in the claim recites more than routine optical character recognition, which is a well-understood, routine and conventional activity (see MPEP 2106.05(d)(II)(v): "Electronically scanning or extracting data from a physical document, Content Extraction and Transmission, LLC v. Wells Fargo Bank, 776 F.3d 1343, 1348, 113 USPQ2d 1354, 1358 (Fed. Cir. 2014) (optical character recognition)").

The identifying, determining and providing steps are claimed at a high level of generality; the claims do not provide any specifics regarding the complex nature indicated by the Applicant. The claims, as claimed, could be interpreted as merely looking at a piece of paper containing a table with amounts of money (numbers and currency symbols), identifying specific symbols or characters that denote an arithmetic operation, and performing an arithmetic operation. Accordingly, this is an evaluation or judgment that can be performed in the human mind or with the help of pen and paper, thus falling within the mental process grouping of abstract ideas. Therefore, the examiner maintains that the claims are ineligible.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ARIEL MERCADO VARGAS, whose telephone number is (571) 270-1701. The examiner can normally be reached M-F 8:00am - 4:00pm.
Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott Baderman, can be reached at 571-272-3644. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ARIEL MERCADO-VARGAS/
Primary Examiner, Art Unit 2118

Prosecution Timeline

Aug 19, 2022
Application Filed
Mar 28, 2023
Response after Non-Final Action
May 20, 2025
Non-Final Rejection — §101, §103
Jul 31, 2025
Applicant Interview (Telephonic)
Jul 31, 2025
Examiner Interview Summary
Aug 21, 2025
Response Filed
Oct 09, 2025
Final Rejection — §101, §103
Jan 05, 2026
Request for Continued Examination
Jan 12, 2026
Response after Non-Final Action
Jan 30, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596357
GENERATION SYSTEM, GENERATION METHOD, AND STORAGE MEDIUM
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12591211
TUNING OF CONTROL PARAMETERS FOR SIMULATION SYSTEMS AND APPLICATIONS
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12585259
PROCESS MANAGEMENT DEVICE FOR VISUALIZING PRODUCTION STATUS
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12561622
PREPROCESSING, LAYOUT AND PRODUCTION OPTIMIZATION METHODS FOR NESTING AND SHEAR CUTTING OF A DEFECTIVE PLATE
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12561653
DIGITAL SOCIAL NETWORKING FRAMEWORK WITH ANALYTICS DASHBOARD
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 71%
With Interview: 99% (+30.2%)
Median Time to Grant: 3y 6m
PTA Risk: High
Based on 454 resolved cases by this examiner. Grant probability derived from career allow rate.
