Prosecution Insights
Last updated: April 19, 2026
Application No. 17/201,646

NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM FOR STORING MACHINE-LEARNING PROGRAM, MACHINE-LEARNING METHOD, AND INFORMATION PROCESSING DEVICE

Final Rejection — §101, §112
Filed: Mar 15, 2021
Examiner: BEAN, GRIFFIN TANNER
Art Unit: 2121
Tech Center: 2100 — Computer Architecture & Software
Assignee: Fujitsu Limited
OA Round: 4 (Final)

Grant Probability: 21% (At Risk)
OA Rounds: 5-6
To Grant: 4y 4m
With Interview: 50%

Examiner Intelligence

Career Allow Rate: 21% (4 granted / 19 resolved; -33.9% vs TC avg)
Interview Lift: +28.4% among resolved cases with interview
Avg Prosecution: 4y 4m (typical timeline)
Currently Pending: 45
Total Applications: 64 across all art units
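As a quick sanity check on the figures above (arithmetic only; this assumes the interview lift is additive in percentage points, which is consistent with the 21% and ~50% figures shown):

```python
granted, resolved = 4, 19
career_allow_rate = granted / resolved       # 0.2105..., displayed as 21%
interview_lift = 0.284                       # +28.4 percentage points

# Additive-in-points reading of the lift.
with_interview = career_allow_rate + interview_lift
print(f"{with_interview:.1%}")               # ~49.5%, in line with the ~50% shown
```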

Statute-Specific Performance

§101: 37.7% (-2.3% vs TC avg)
§103: 40.4% (+0.4% vs TC avg)
§102: 11.2% (-28.8% vs TC avg)
§112: 9.7% (-30.3% vs TC avg)

Deltas are relative to the Tech Center average estimate. Based on career data from 19 resolved cases.
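Reading the deltas as signed percentage-point differences, the implied Tech Center average can be recovered from each statute row (arithmetic only; the labels mirror the figures above):

```python
examiner_rate = {"101": 37.7, "103": 40.4, "102": 11.2, "112": 9.7}
delta_vs_tc   = {"101": -2.3, "103": 0.4, "102": -28.8, "112": -30.3}

# examiner_rate = tc_avg + delta, so tc_avg = examiner_rate - delta.
tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1) for s in examiner_rate}
print(tc_avg)  # every statute implies the same 40.0% Tech Center average
```

The fact that all four rows back out to the same 40.0% suggests a single Tech Center baseline was used for the estimate.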

Office Action

§101 §112
DETAILED ACTION

This Action is responsive to Claims filed 08/12/2025.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of the Claims

Claims 1, 4, 6, 7, and 8 have been amended. Claim 2 is canceled. Claims 1 and 3-8 are pending.

Response to Amendment

The amendments to Claims 1, 6, 7, and 8 have corrected the minor informalities cited in the Office Action dated 05/02/2025. The Objections to the Claims have been withdrawn.

Response to Arguments

Applicant’s arguments, see Pages 12-20, filed 08/12/2025, regarding the 35 U.S.C. 112(b) Rejection of Claims 1-8 have been fully considered and are persuasive. The 35 U.S.C. 112(b) Rejection of Claims 1-8 has been withdrawn.

Applicant’s arguments, see Pages 20-29, filed 08/12/2025, regarding the 35 U.S.C. 101 Rejection of Claims 1-8 have been fully considered but they are not persuasive. The Examiner respectfully disagrees with the Applicant regarding the eligibility of claims 1-8. The Applicant argues, without claim amendments indicating such, that the interpretable abstract idea mental process steps are not practically performed within the human mind or with the aid of pen and paper. As presently drafted, there is no indication the alleged abstract idea mental process steps cannot be practically performed within the human mind or with the aid of pen and paper. There is no specific structure claimed for the specific improvement. The Examiner contends the specific improvement, as presently drafted, is a direct result of the data curation abstract idea mental process steps, which amount to a generic machine-learning program operating on a generic computer. Per MPEP 2106.05(a), the specific improvement cannot come from the abstract idea. See the updated 35 U.S.C. 101 Rejection below.

Applicant’s arguments, see Pages 29-34, filed 08/12/2025, with respect to the 35 U.S.C.
103 Rejection of Claims 1-8 have been fully considered and are persuasive. The 35 U.S.C. 103 Rejection of Claims 1-8 has been withdrawn.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1 and 3-8 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

There is insufficient written description for “transforming…” or “transformed datasets” or “transforming an order of records”. The word “transform” does not appear in the Specification, nor is the plain meaning of “transform” established. The terms “convert(ed)” and “rearrangement” are used throughout the Specification.
The plain meaning of the word “transform” is interpretably broader than a mere rearrangement or conversion in the context of the Specification. The newly amended limitations therefore claim an element broader in scope than the Specification supports.

There is insufficient written description for the term “replacement pattern” in the Specification. The term “replacement” does not appear in the Specification. The Examiner notes “rearrangement pattern” is used throughout the Specification, but the plain meaning of “replacement” is broader in scope than “rearrangement” (replacing may introduce new values, for example, whereas rearranging merely shuffles values). The newly amended limitations therefore claim an element broader in scope than the Specification supports.

There is insufficient written description for the “evaluating…”, “specifying…”, and “generating…” steps based on the above deficiencies involving the “transformed” data and “replacement patterns.” The Examiner notes there would be sufficient written description for the evaluation of similarity(s) and generation of conversion information when the above deficiencies are rectified.

The Dependent Claims 3-6 do not rectify the deficiencies of claim 1, and are therefore similarly rejected by inheritance. Claims 7 and 8 recite similar limitations to claim 1 and are therefore similarly rejected.

Claim Rejections - 35 USC § 101

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action. Claims 1 and 3-8 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more; and because the claims as a whole, considering all claim elements both individually and in combination, do not amount to significantly more than the abstract idea, see Alice Corporation Pty. Ltd. v. CLS Bank International, et al., 573 U.S. 208 (2014).
In determining whether the claims are subject matter eligible, the Examiner applies the 2019 USPTO Patent Eligibility Guidelines. (2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50, Jan. 7, 2019.)

Step 1: Claims 1 and 3-6 recite a non-transitory computer-readable storage medium for storing a machine-learning program which causes a processor to perform processing, which falls under the statutory category of a manufacture. Claim 7 recites a machine-learning method implemented by a computer, which falls under the statutory category of a process. Claim 8 recites an information processing device comprising a memory; and a processor circuit coupled to the memory, which falls under the statutory category of a machine.

Step 2A – Prong 1: Claim 1 recites an abstract idea, law of nature, or natural phenomenon. The limitations of “generating common conversion information to be commonly applied to the plurality of datasets, the common conversion information including a common correspondence between each item value of a common item in the plurality of datasets and each item value of the common item in the collation dataset, the common item being any one of the plurality of items present in all of the plurality of datasets;”, “generating, for each dataset of the plurality of datasets, individual conversion information to be individually applied to the each dataset, the individual conversion information including an individual correspondence between each item value of an individual item in the each dataset and each item value of the individual items in the collation dataset, the individual items being any of remaining items of the plurality of items other than the common item;”, “generating, for each dataset of the plurality of datasets, a converted dataset by converting the order of the plurality of first records in the each dataset, the converting including:”, “transforming an order of records included in each dataset of the plurality of datasets by using
a plurality of first replacement patterns applied to the common item to obtain a plurality of first transformed datasets, each of the plurality of first transformed datasets having a plurality of records whose order has been transformed using a corresponding first replacement pattern of the plurality of first replacement patterns,”, “evaluating, for each first transformed dataset of the plurality of first transformed datasets, a first similarity between the each first transformed dataset and the collation data,”, “specifying a first transformed dataset for which the evaluated first similarity is maximum, among the plurality of first transformed datasets,”, “generating the common conversion information based on, among the plurality of first replacement pattern, a first replacement pattern used for the specified first transformed dataset,”, “transforming an order of records included in the each dataset by using a plurality of second replacement patterns applied to the individual item to obtain a plurality of second transformed datasets, each of the plurality of second transformed datasets having a plurality of records whose order has been transformed using a corresponding first replacement pattern of the plurality of first replacement patterns,”, “evaluating, for each second transformed dataset of the plurality of second transformed datasets, a second similarity between the each second transformed dataset and the collation data,”, “specifying a second transformed dataset for which the evaluated second similarity is maximum, among the plurality of second transformed datasets,”, and “generating the individual conversion information for the each dataset based on, among the plurality of second replacement pattern, a second replacement pattern used for the specified second transformed dataset.” Under the broadest reasonable interpretation, these steps cover a mental process including an observation, evaluation, judgment or opinion that could be performed in the human mind 
or with the aid of pencil and paper. The generation of conversion information and the subsequent converted dataset based on said conversion information is practically performed within the human mind or with the aid of pencil and paper. Transforming the order of a dataset and evaluating similarities between datasets is practically performed within the human mind or with the aid of pen and paper. Therefore, these limitations fall within the mental process group.

Step 2A – Prong 2: The additional elements of claim 1 do not integrate the abstract idea into a practical application. The claim recites the additional elements “A non-transitory computer-readable storage medium for storing a machine-learning program which causes a processor to perform processing…”, “datasets”, “items”, “records”, “an input value”, “a computer”, “instructions”, and “item values”, which are recognized as generic computer components recited at a high level of generality. Although these components store and execute instructions to perform the abstract idea itself, this does not serve to integrate the abstract idea into a practical application, as it merely amounts to instructions to "apply it." (See MPEP 2106.04(d)(2), indicating mere instructions to apply an abstract idea do not amount to integrating the abstract idea into a practical application). The additional elements of “a collation dataset”, “common conversion information”, “input node”, “individual conversion information”, and “a neural network model” are recognized as not being generic computer components; however, they are recited at a high level of generality and found to generally link the abstract idea to a particular technological environment or field of use (See MPEP 2106.05(h)).
The additional elements recited in the limitations of “obtaining a plurality of datasets defined by a plurality of items, each of the plurality of datasets including a plurality of first records each of which includes a combination of: an input value and a plurality of item values corresponding to the plurality of items, respectively;”, “obtaining a collation dataset including a plurality of second records each of which includes a combination of: a criterion value and the plurality of item values corresponding to the plurality of items, respectively, the plurality of second records being stored in the collation dataset in an order of a correspondence between each second record of the plurality of second records and each input node of a plurality of input nodes in the neural network model;” and “…inputting, based on a correspondence between the order of the plurality of first records in the converted dataset and the plurality of input nodes of the neural network model, the input value of each first record of the plurality of first records in on the basis of the generated converted dataset into a corresponding input node in the neural network model,” are mere data-gathering or data manipulation extra-solution activity steps (See MPEP 2106.05(g)).
The additional elements recited in the limitations of “applying the generated common conversion information to the common item in the each dataset to change the order of the first records in the each dataset based on the common correspondence indicated in the common conversion information such that a position of each first record in the each dataset matches a position of a corresponding second record having an item value of the common item matching the each first record in terms of the common correspondence in the common conversion information, among the plurality of second records in the collation dataset,”, “and applying, for each dataset which has been converted by applying the generated common conversion information, the generated individual conversion information of the each dataset to the individual item in the each dataset to change the order of the first records in the each dataset based on the individual correspondence indicated in the individual conversion information such that a position of each first record in the each dataset matches a position of a corresponding second record having an item value of the individual item matching the each first record in terms of the individual correspondence in the individual conversion information, among the plurality of second records in the collation dataset;” and “training the neural network model…” amount to instructions to "apply it." (See MPEP 2106.05(f), indicating mere instructions to apply an abstract idea do not amount to integrating the abstract idea into a practical application).

Step 2B: The only limitations on the performance of the described method are those reciting “A non-transitory computer-readable storage medium for storing a machine-learning program which causes a processor to perform processing…”, “datasets”, “items”, “records”, “an input value”, “a computer”, “instructions”, and “item values”.
These elements are insufficient to transform a judicial exception into a patentable invention because the recited elements are considered insignificant extra-solution activity (generic computer system, processing resources, links the judicial exception to a particular, respective, technological environment). The claim thus recites computing components only at a high level of generality such that it amounts to no more than mere instructions to apply the exception using generic computer components; mere instructions to apply an exception using a generic computer component cannot provide an inventive concept (see MPEP 2106.05(f)). The additional elements of “a collation dataset”, “common conversion information”, “input node”, “individual conversion information”, and “a neural network model” are recognized as not being generic computer components; however, they are recited at a high level of generality and found to generally link the abstract idea to a particular technological environment or field of use (See MPEP 2106.05(h)).
The additional elements of “obtaining a plurality of datasets defined by a plurality of items, each of the plurality of datasets including a plurality of first records each of which includes a combination of: an input value and a plurality of item values corresponding to the plurality of items, respectively;”, “obtaining a collation dataset including a plurality of second records each of which includes a combination of: a criterion value and the plurality of item values corresponding to the plurality of items, respectively, the plurality of second records being stored in the collation dataset in an order of a correspondence between each second record of the plurality of second records and each input node of a plurality of input nodes in the neural network model;” and “…inputting, based on a correspondence between the order of the plurality of first records in the converted dataset and the plurality of input nodes of the neural network model, the input value of each first record of the plurality of first records in on the basis of the generated converted dataset into a corresponding input node in the neural network model,” are found to be well-understood, routine, or conventional activity (See MPEP 2106.05(d)(II)).
The additional elements of the limitations “applying the generated common conversion information to the common item in the each dataset to change the order of the first records in the each dataset based on the common correspondence indicated in the common conversion information such that a position of each first record in the each dataset matches a position of a corresponding second record having an item value of the common item matching the each first record in terms of the common correspondence in the common conversion information, among the plurality of second records in the collation dataset,”, “and applying, for each dataset which has been converted by applying the generated common conversion information, the generated individual conversion information of the each dataset to the individual item in the each dataset to change the order of the first records in the each dataset based on the individual correspondence indicated in the individual conversion information such that a position of each first record in the each dataset matches a position of a corresponding second record having an item value of the individual item matching the each first record in terms of the individual correspondence in the individual conversion information, among the plurality of second records in the collation dataset;” and “training the neural network model…” amount to instructions to "apply it." (See MPEP 2106.05(f), indicating mere instructions to apply an abstract idea do not recite significantly more than a judicial exception).

Taken alone or as an ordered combination, these additional elements do not amount to significantly more than the above-identified abstract idea. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation.

For the reasons above, claim 1 is rejected as being directed to non-patentable subject matter under §101.
This rejection applies equally to independent claims 7 and 8, which recite a machine-learning method implemented by a computer and an information processing device comprising a memory and a processor circuit coupled to the memory, respectively; it is noted that claim 8 recites generic computer components (processing device, memory, processor circuit) at a high level of generality.

Dependent Claims: The limitations of the dependent claims, except those addressed below, merely set forth further refinements of the abstract idea without changing the analysis already presented. Claim 3 merely recites aforementioned additional elements and refinements of the data-gathering steps and extra-solution activity steps (“…updating the common conversion information on the basis of the generated converted dataset.”) evaluated under Steps 2A Prong 2 and 2B, and found to be mere extra-solution activity and well-understood, routine, or conventional activity. Claim 4 merely recites aforementioned additional elements and further refines the nature of the additional elements (“…wherein first the similarity is expressed by an inner product of a first vector in which input values in the each transformed dataset are arranged and a second vector in which input values in the collation dataset are arranged.”) Claim 5 recites aforementioned additional elements.
Claim 5 also recites abstract idea mental process steps (“calculating, by error back propagation, an error vector in which errors of input values in the converted dataset are arranged, the errors of the input values being obtained by inputting the converted dataset generated from the plurality of datasets to the neural network model;” and “calculating a variation vector in which differences in input values between the converted dataset and another converted dataset are arranged, the another converted dataset being generated from the plurality of dataset in a case of varying the common conversion information or the collation dataset, wherein the updating includes updating the collation dataset and the neural network model on the basis of the error vector and the variation vector.”) Claim 6 recites aforementioned additional elements and data-gathering steps (obtaining…, inputting…) evaluated under Steps 2A Prong 2 and 2B, and found to be mere extra-solution activity and well-understood, routine, or conventional activity. Claim 6 further recites mental process steps (“generating, for each second dataset of the plurality of second datasets, a second converted dataset obtained by converting the order of the plurality of third records in the each second dataset on the basis of the generated common conversion information and the updated collation data,”).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to GRIFFIN T BEAN whose telephone number is (703)756-1473. The examiner can normally be reached M - F 7:30 - 4:30.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Li Zhen, can be reached at (571) 272-3768. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/GRIFFIN TANNER BEAN/
Examiner, Art Unit 2121

/Li B. Zhen/
Supervisory Patent Examiner, Art Unit 2121
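For illustration only, here is a minimal sketch of the record-reordering procedure the rejection characterizes as a mental process: enumerate candidate orderings ("replacement patterns"), evaluate each transformed dataset's similarity against the collation dataset using the inner product recited in claim 4, and keep the maximum. All data and names here are hypothetical toys; the claimed method additionally works per common/individual item and feeds a neural network, which this sketch omits.

```python
from itertools import permutations

# Hypothetical toy records: (input_value, item_value) pairs. The collation
# dataset pairs a criterion value with the same item values.
dataset = [(3, "b"), (1, "a"), (2, "c")]
collation = [(1, "a"), (2, "b"), (3, "c")]

def similarity(ordered_records, collation):
    # Claim 4's measure: inner product of the vector of input values in the
    # transformed dataset and the vector of values in the collation dataset.
    return sum(r[0] * c[0] for r, c in zip(ordered_records, collation))

def best_replacement_pattern(dataset, collation):
    # Enumerate every candidate ordering ("replacement pattern"), evaluate
    # each transformed dataset's similarity, and keep the maximum.
    best = max(permutations(dataset), key=lambda p: similarity(p, collation))
    return list(best)

print(best_replacement_pattern(dataset, collation))
# [(1, 'a'), (2, 'c'), (3, 'b')] with similarity 1*1 + 2*2 + 3*3 = 14
```

Brute-force enumeration is factorial in the number of records, which is exactly why a three-record example like this one is tractable by hand.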

Prosecution Timeline

Mar 15, 2021: Application Filed
Jun 27, 2024: Non-Final Rejection — §101, §112
Sep 26, 2024: Response Filed
Dec 19, 2024: Final Rejection — §101, §112
Mar 31, 2025: Request for Continued Examination
Apr 03, 2025: Response after Non-Final Action
Apr 29, 2025: Non-Final Rejection — §101, §112
Aug 01, 2025: Response Filed
Aug 01, 2025: Response after Non-Final Action
Aug 12, 2025: Response Filed
Nov 05, 2025: Final Rejection — §101, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12424302
ACCELERATED MOLECULAR DYNAMICS SIMULATION METHOD ON A QUANTUM-CLASSICAL HYBRID COMPUTING SYSTEM
Granted Sep 23, 2025 (2y 5m to grant)
Patent 12314861
SYSTEMS AND METHODS FOR SEMI-SUPERVISED LEARNING WITH CONTRASTIVE GRAPH REGULARIZATION
Granted May 27, 2025 (2y 5m to grant)
Patent 12261947
LEARNING SYSTEM, LEARNING METHOD, AND COMPUTER PROGRAM PRODUCT
Granted Mar 25, 2025 (2y 5m to grant)
Study what changed in these cases to get past this examiner (based on the 3 most recent grants).

Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 21%
With Interview: 50% (+28.4%)
Median Time to Grant: 4y 4m
PTA Risk: High

Based on 19 resolved cases by this examiner. Grant probability derived from career allow rate.
