Prosecution Insights
Last updated: April 19, 2026
Application No. 17/901,073

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM

Final Rejection — §101, §103
Filed
Sep 01, 2022
Examiner
BROCKETTI, JULIE K
Art Unit
3700
Tech Center
3700 — Mechanical Engineering & Manufacturing
Assignee
NEC Corporation
OA Round
2 (Final)
24%
Grant Probability
At Risk
3-4
OA Rounds
3y 6m
To Grant
-1%
With Interview

Examiner Intelligence

Grants only 24% of cases
24%
Career Allow Rate
4 granted / 17 resolved
-46.5% vs TC avg
Negative -24.5% lift
Without
With
-24.5%
Interview Lift
resolved cases with interview
Typical timeline
3y 6m
Avg Prosecution
9 currently pending
Career history
26
Total Applications
across all art units
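The headline numbers above are ratios over the examiner's resolved cases. A minimal sketch of how a dashboard like this might derive them — the `ResolvedCase` record shape is hypothetical, not the product's actual schema:

```python
from dataclasses import dataclass

@dataclass
class ResolvedCase:
    granted: bool
    had_interview: bool

def allow_rate(cases):
    """Career allow rate: share of resolved cases that granted."""
    return sum(c.granted for c in cases) / len(cases)

def interview_lift(cases):
    """Allow-rate delta: cases with an interview minus cases without."""
    with_iv = [c for c in cases if c.had_interview]
    without_iv = [c for c in cases if not c.had_interview]
    return allow_rate(with_iv) - allow_rate(without_iv)

# 17 resolved cases, 4 grants -> 23.5% career allow rate (shown as 24%)
cases = [ResolvedCase(granted=i < 4, had_interview=False) for i in range(17)]
print(f"{allow_rate(cases):.1%}")  # 23.5%
```

The lift being negative here (-24.5 points) means this examiner's interviewed cases granted less often than her non-interviewed ones, which is why the page flags interviews as unhelpful.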

Statute-Specific Performance

§101
10.8%
-29.2% vs TC avg
§103
44.6%
+4.6% vs TC avg
§102
15.4%
-24.6% vs TC avg
§112
23.1%
-16.9% vs TC avg
Black line = Tech Center average estimate • Based on career data from 17 resolved cases
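A quick consistency check on the panel above: subtracting each delta from the examiner's rate should recover the same Tech Center average for every statute, and it does (40.0% in each case). Values are transcribed from the panel; the sketch below only verifies that arithmetic:

```python
# (statute, examiner overcome rate %, delta vs. TC average %)
rows = [
    ("§101", 10.8, -29.2),
    ("§103", 44.6, +4.6),
    ("§102", 15.4, -24.6),
    ("§112", 23.1, -16.9),
]

# Implied TC average = examiner rate - delta; every row should agree.
implied = [round(rate - delta, 1) for _, rate, delta in rows]
print(implied)  # [40.0, 40.0, 40.0, 40.0]
```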

Office Action

§101 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

Applicant filed an amendment to the claims on November 13, 2025. Claims 2 and 3 were cancelled and claims 1, 7 and 8 were amended.

Claim Objections

Claims 1, 7 and 8 are objected to because of the following informalities: The claims recite “…in which there is operation input to the length of the non-working time during the student works on the task.” It appears that a word is missing from the bolded phrase of the claim. The examiner believes the word “which” should be inserted after the word “during”. Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1 and 4-8 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claim 1 is directed to “an information processing device,” (i.e. a machine), claim 7 is directed to “an information processing method,” (i.e. a process), and claim 8 is directed to “a non-transitory recording medium,” (i.e. a machine), hence the claims are directed to one of the four statutory categories (i.e. process, machine, manufacture, or composition of matter). 
In other words, Step 1 of the subject-matter eligibility analysis is “Yes.”

However, the claims are drawn to the abstract idea of “information processing and recording” in the form of “certain methods of organizing human activity,” in terms of managing personal behavior or relationships or interactions between people (including social activities, teaching and following rules or instructions), or reasonably in the form of “mental processes,” in terms of processes that can be performed in the human mind (including an observation, evaluation, judgement or opinion). Regardless, the claims are reasonably understood as either “certain methods of organizing human activity” or “mental processes” which require the following limitations: “…collecting keystroke data indicating an operation input amount to a user terminal; extracting, from the keystroke data, a keystroke pattern indicating a feature of an operation input to the user terminal; and classifying approach of a student to a task, based on the keystroke pattern, wherein the keystroke pattern comprises a length of non-working time in which there is no operation input to the user terminal and a length of time for working on that is required for the task by the student, and wherein the at least one processor is further configured to execute the instructions to perform; evaluating qualities or abilities of the student based on a type of the approach of the student to the task, wherein the type of approach of the student to the task is based on a ration of a length of working time in which there is operation input to the length of the non-working time during the student works on the task.”

The “collecting keystroke data”, “extracting… a keystroke pattern”, “classifying approach … to a task” and “evaluating qualities or abilities of the student based on a type of approach” limitations simply describe a process of data gathering and manipulation, i.e. 
extra-solution activity which is sufficiently analogous to “collecting information, analyzing it, and displaying certain results of the collection analysis” (i.e. Electric Power Group, LLC, v. Alstom, 830 F.3d 1350, 119 U.S.P.Q.2d 1739 (Fed. Cir. 2016)) as well as the mental process of observation and evaluation. Hence, these limitations are akin to concepts that have been identified among the non-limiting examples of abstract ideas. In other words, Step 2A, Prong 1 of the subject-matter eligibility analysis is “Yes.”

Furthermore, this judicial exception is not integrated into a practical application because:
(a) It does not improve the functioning of a computer or any other technology or technical field;
(b) Applying the judicial exception does not affect a particular treatment or prophylaxis for a disease or medical condition;
(c) It does not apply the judicial exception with, or by use of, a particular machine;
(d) It does not effect a transformation or reduction of a particular article to a different state or thing;
(e) It does not apply or use the judicial exception in some other meaningful way beyond generally linking the use of the exception to a particular technological environment such that the claims as a whole are more than a drafting effort designed to monopolize the exception.

Namely, the applicant’s claimed elements of “a memory”, “at least one processor”, and “a user terminal” are merely claimed to generally link the use of a judicial exception (e.g., pre-solution activity of data gathering and post-solution activity of presenting data) to (1) a particular technological environment or (2) field of use, per MPEP §2106.05(h); and amount to mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea, per MPEP §2106.05(f). 
In other words, the claimed “information processing device, information processing method, and recording medium” is not providing a practical application, thus Step 2A, Prong 2 of the subject-matter eligibility analysis is “No.”

Likewise, the claims do not include additional elements that either alone or in combination are sufficient to amount to significantly more than the judicial exception because, to the extent that, e.g., “a memory”, “at least one processor”, and “a user terminal” are claimed, these are generic, well-known, and conventional computer elements. As evidence that these are generic, well-known, and conventional computer elements, the Applicant’s specification discloses them in a manner that indicates that the additional elements are sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy 35 U.S.C. § 112(a), per MPEP § 2106.07(a) III (a). As such, this satisfies the Examiner’s evidentiary burden requirement per the Berkheimer memo.

Specifically, the claimed “a memory” and “at least one processor” are described on page 23, lines 18-26 as follows:

“As illustrated in Fig. 13, the information processing device 900 includes the following configuration as an example.
- CPU (Central Processing Unit) 901
- ROM (Read Only Memory) 902
- RAM (Random Access Memory) 903
- Program 904 to be loaded into the RAM 903
- Storage device 905 storing the program 904”

These elements are reasonably interpreted as a generic computer and provide no details of anything beyond their use as a ubiquitous standard computer. 
Also, the claimed “a user terminal” is described on page 2, lines 6-8 as follows: “… each student can participate in online classes through a user terminal such as a tablet terminal and a personal computer”. This element is reasonably interpreted as a generic computer terminal and provides no details of anything beyond its use as a ubiquitous standard computer interface. As such, the elements claimed are reasonably interpreted as generic computer elements that exist as ubiquitous standard equipment with computers and are thereby reasonably understood as not providing anything significantly more. Therefore, Step 2B of the subject-matter eligibility analysis is “No.”

In addition, dependent claims 4-6 do not provide a practical application and are insufficient to amount to significantly more than the judicial exception. As such, dependent claims 4-6 are also rejected under 35 U.S.C. § 101, based on their dependencies to claim 1.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1 and 4-8 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (hereinafter ‘Kim’, US 2016/0189554) in view of Kutty et al. (US 2014/0349272) in further view of Holstein et al. (US 2020/0193859). 
Regarding claim 1, and substantially similar limitations in claims 7 and 8, Kim discloses an information processing device comprising: a memory configured to store instructions; and at least one processor configured to execute the instructions (see para. [0037]: The smart device 101, which serves to determine the reaction of the user depending on the reproduced content, acquires the content, the reaction speed via an input device, the duration for which the user watches a screen, and the like to be used as input information based on which the learning situation recognition unit 300 makes a determination. For example, the smart device 101 may include an IPTV, a desktop computer, a notebook computer, a tablet PC, and a smartphone, each including a display unit, an imaging unit, and a user interface unit) to perform: collecting keystroke data indicating an operation input amount to a user terminal (see para. [0056]: the input information recording unit 200 collects the device input information and the sensing information at the time of reproducing the learning content. For example, the input information may be processed such that the learning situation recognition unit 300 may make a determination on the user state information; [0037]: the user interface unit is an input device, which is a means for the user to directly input data, such as a mouse, a keyboard, a touch screen, a pen, and the like); extracting, from the keystroke data, a keystroke pattern indicating a feature of an operation input to the user terminal (see para. 
[0042]: the learning situation recognition unit 300 determines the user state information based on the input information and sensing information input from the user device 100, such as a frequency of motion of the user while learning, a mouse click reaction speed, a keyboard input time, an extent of watching the screen, eyelid motion, head movement, and the user learning pattern management profile 502 and user learning history management profile 504 information, stored in the profile management unit 500, and transfers the determination result to the device control unit 400); and classifying approach of a student to a task, based on the keystroke pattern (see para. [0066]: checks the user learning pattern for the corresponding content, and compares the frequency of motion of the user while studying, the response time for the content, and the like with the input information recording unit 200 to update the user learning pattern management profile 502 when the user is in the optimal learning state and transmit the current state information to the device control unit 400 when the user is not in the optimal learning state, thereby enabling control appropriate for the situation; [0042]: the learning situation recognition unit 300 determines the user state information based on the input information and sensing information input from the user device 100, such as a frequency of motion of the user while learning, a mouse click reaction speed, a keyboard input time); wherein the keystroke pattern includes a length of non-working time in which there is no operation input to the user terminal (see para. 
[0016]: the user state information may include the reaction speed of the user with respect to a response/selection speed; [0044] The device control unit 400 enables appropriate device control for the user based on the device input information, the sensing information, and the profile information stored in the input information recording unit 200 and the profile management unit 500 depending on the user state information input from the learning situation recognition unit 300. For example, in the case in which a result message indicating that the user is not concentrating when studying content) and a length of time for working on that is required for the task by the student (see para. [0048]: This functions to manage the life habits of the user and content related to the analysis of the user's learning patterns while studying the content, and may manage the daily learning time, subjects, content, difficulty level, the extent of change in the learning pattern for each season and the school schedule, to be utilized as material for recommendations to enable appropriate learning); wherein the at least one processor is further configured to execute the instructions to perform: evaluating qualities or abilities of the student based on a type of the approach of the student to the task (see para. [0048]: The user learning pattern management profile 502 reflects the result of the user learning state, determined by the learning situation recognition unit 300, and contains daily and weekly personal learning pattern information about the user. This functions to manage the life habits of the user and content related to the analysis of the user's learning patterns while studying the content, and may manage the daily learning time, subjects, content, difficulty level, the extent of change in the learning pattern for each season and the school schedule, to be utilized as material for recommendations to enable appropriate learning. Also see para. 
[0066]: the learning situation recognition unit 300 acquires information on whether the content that the user is learning has characteristic information, such as the interest level, difficulty level, data format, and the like, similar to that of the content previously learned by the user, checks the user learning pattern for the corresponding content, and compares the frequency of motion of the user while studying, the response time for the content, and the like with the input information recording unit 200 to update the user learning pattern management profile 502 when the user is in the optimal learning state). Kim lacks in disclosing “wherein the type of approach of the student to the task is based on a ratio of a length of working time in which there is operation input to the length of the non-working time during which the student works on the task”. Kutty discloses an analogous invention of evaluating the performance of a user on an e-learning system. Kutty teaches of collecting data from students including the time spent on assignments, time spent on quizzes, discussion questions, time spent per week, etc. (Kutty ¶0005, ¶0033, ¶0034). Kutty tracks activity data including log data (Kutty ¶0017, ¶0031). Kutty further teaches of using the data to then determine an evaluation index to evaluate the performance of the student and either upgrade or downgrade the student performance (Kutty ¶0018). Kutty clearly tracks the length of working time in which there is input operation into the e-learning system. There are 24 hours in a day, so since Kutty is tracking the working time of a student through time stamps it is implicitly also tracking the non-working time of a student, i.e. all the time in the day in which there are no inputs from a student. Nevertheless, Holstein et al. teaches of an educational environment in which the time students spend on assignments is tracked as well as the time the student is idle, i.e. 
non-working time during which the student works on the task (Holstein Fig. 2A, ¶0042, ¶0079-¶0082). In Holstein, a teacher is notified of the non-working time in order to determine if the student requires additional assistance (¶0016). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to determine the type of approach of the student to the task based on a ratio of a length of working time in which there is operation input to the length of the non-working time during which the student works on the task in Kim, as Kutty teaches evaluating a student based on the working time and Holstein teaches analyzing student progress based on non-working time. It is well known throughout the art of education that one can determine student performance based on how long or how little a student worked on an assignment, as time devoted to an assignment has a direct correlation to success or understanding of the material.

Regarding claim 4, Kim discloses the information processing device according to claim 1 as above, wherein the at least one processor is configured to execute the instructions to perform: calculating an index indicating comprehension of the student based on the type of the approach of the student to the task and accuracy of an answer (see FIG. 2, pop quiz. A pop quiz is reasonably understood to calculate an index indicating comprehension [i.e., a score, a percent correct] of a student based on accuracy of the answers. Also see para. [0059]: the learning situation recognition unit 300 may select learning content having a higher difficulty level than that of the currently reproduced learning content as the recommended content when the reaction speed of the user is above the reference range). 
Regarding claim 5, Kim discloses the information processing device according to claim 1 as above, wherein the at least one processor is configured to execute the instructions to perform: calculating an index indicating concentration of the student based on the keystroke pattern and a behavior of the student (see para. [0043]: Further, the device control unit 400 may control the smart device 101 and the reality device 103 to execute the reality-check operation to attract the attention of the user when the extent to which the user watches the screen falls below the reference value… [0044]: … in the case in which a result message indicating that the user is not concentrating when studying. Also see para. [0041]: The learning situation recognition unit 300 may calculate the user state information based on the device input information and the sensing information and select the recommended content depending on the user state information. The user state information may include the reaction speed of the user with respect to a response/selection speed and the pace at which the content is learned. Further, the user state information may further include the extent to which the user watches the screen, determined using the imaging unit). Regarding claim 6, Kim discloses the information processing device according to claim 1 as above, wherein the at least one processor is configured to execute the instructions to perform: outputting information indicating time series of a type of the approach of the student to the task (see para. [0041]: The learning situation recognition unit 300 may calculate the user state information based on the device input information and the sensing information and select the recommended content depending on the user state information. The user state information may include the reaction speed of the user with respect to a response/selection speed and the pace at which the content is learned. 
Further, the user state information may further include the extent to which the user watches the screen, determined using the imaging unit).

Response to Arguments

Applicant's arguments filed November 13, 2025 have been fully considered but they are not persuasive. With respect to the 35 USC 101 rejection, applicant argues that the claim limitations "collecting keystroke data indicating an operation input amount to a user terminal; extracting, from the keystroke data, a keystroke pattern indicating a feature of an operation input to the user terminal, ... wherein the keystroke pattern comprises a length of non-working time in which there is no operation input to the user terminal and a length of time for working on that is required for the task by the student, ... evaluating qualities or abilities of the student based on a type of the approach of the student to the task, wherein the type of the approach of the student to the task is based on a ratio of a length of working time in which there is operation input to the length of the non-working time during the student works on the task," do not fall under the categories of methods of organizing human activity and cannot be performed in the mind.

The examiner respectfully disagrees and notes that “collecting keystroke data” is extra-solution activity of data gathering. “Extracting…a keystroke pattern” and “evaluating qualities or abilities of the student…” are mental steps of judgement and determination based on the data that was gathered. Furthermore, they are also considered a method of organizing human activity in that a teacher collects data on a student, categorizes that data and evaluates the abilities of students. This is a standard teacher/student relationship.

Applicant further argues that the claims recite a technical solution to the problems relating to helping students with tasks in the context of online classes. 
Specifically, “collecting keystroke data indicating an operation input amount to a user terminal; extracting, from the keystroke data, a keystroke pattern indicating a feature of an operation input to the user terminal, ... wherein the keystroke pattern comprises a length of non-working time in which there is no operation input to the user terminal and a length of time for working on that is required for the task by the student, ... evaluating qualities or abilities of the student based on a type of the approach of the student to the task, wherein the type of the approach of the student to the task is based on a ratio of a length of working time in which there is operation input to the length of the non-working time during the student works on the task.”

It is noted that these limitations do not provide a technological solution to a problem to overcome the rejection under 35 USC 101, but rather use an information processing device, i.e. a generic computer, as merely a tool to implement a common mental process or organization of human activity. The claims merely recite: collecting keystroke data (i.e. data gathering/extra-solution activity); extracting a keystroke pattern from the data (a mental process which can be done by a person just viewing the keystroke data and determining if there is a pattern); classifying approach of a student to a task based on the pattern (mental judgement and categorization); and evaluating qualities or abilities of the student based on the approach classified (mental judgement). Nowhere in the claims or the specification is it determined that there is a specific technological solution to the problem.

Applicant’s arguments with respect to the 35 USC 102 rejection have been considered but are moot in view of the new 35 USC 103 rejection of Kim in view of Kutty and Holstein.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). 
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JULIE K BROCKETTI whose telephone number is (571)272-0206. The examiner can normally be reached M-Th 8:00 a.m. - 5:00 p.m.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas Barrett can be reached at 571-272-4746. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. 
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JULIE K BROCKETTI/
Primary Examiner, Art Unit 3700

Prosecution Timeline

Sep 01, 2022
Application Filed
Aug 08, 2025
Non-Final Rejection — §101, §103
Nov 13, 2025
Response Filed
Feb 13, 2026
Final Rejection — §101, §103 (current)
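The reply-deadline rules recited in the final action (three-month shortened statutory period, six-month statutory maximum under the extension-of-time rules) are simple calendar arithmetic from the Feb 13, 2026 mailing date. A minimal sketch using only the standard library; `add_months` is a helper written for this illustration:

```python
import calendar
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day to the target month's length
    (e.g., Jan 31 + 1 month -> Feb 28)."""
    y, m = divmod(d.month - 1 + months, 12)
    y, m = d.year + y, m + 1
    return date(y, m, min(d.day, calendar.monthrange(y, m)[1]))

mailed = date(2026, 2, 13)           # Final Rejection mailing date
reply_due = add_months(mailed, 3)    # shortened statutory period (extendable with fees)
hard_cutoff = add_months(mailed, 6)  # absolute statutory maximum
print(reply_due, hard_cutoff)        # 2026-05-13 2026-08-13
```

Filing within two months of mailing (by Apr 13, 2026) preserves the advisory-action safe harbor described in the action.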

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 8748689
Device for the treatment of vaginal fungal infection
2y 5m to grant Granted Jun 10, 2014
Patent 8380120
(title unavailable)
2y 5m to grant Granted Feb 19, 2013
Patent 8303311
SPORT PERSONAL COACH SYSTEM
2y 5m to grant Granted Nov 06, 2012
Patent 7621213
SALAD SPINNER
2y 5m to grant Granted Nov 24, 2009
Patent (number unavailable)
Method and System for Employee Training and Reward
Granted
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
24%
Grant Probability
-1%
With Interview (-24.5%)
3y 6m
Median Time to Grant
Moderate
PTA Risk
Based on 17 resolved cases by this examiner. Grant probability derived from career allow rate.
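The projection figures follow from the career stats by simple arithmetic: 4 grants over 17 resolved cases gives a 23.5% allow rate (displayed as 24%), and applying the -24.5 point interview lift yields the displayed -1%. A sketch of that derivation, using the numbers shown on this page:

```python
career_allow = 4 / 17              # 23.5% career allow rate (displayed as 24%)
interview_lift = -24.5 / 100       # allow-rate delta when an interview was held
with_interview = career_allow + interview_lift

print(f"{career_allow:.0%}")       # 24%
print(f"{with_interview:+.0%}")    # -1%
```

A negative adjusted figure is not a real probability, of course; it just signals that, on this examiner's record, interviews have coincided with worse outcomes rather than better ones.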
