Prosecution Insights
Last updated: April 19, 2026
Application No. 18/502,309

PROTECTING CONFIDENTIAL INFORMATION IN A NEURAL-COMPUTER INTERFACE SYSTEM

Non-Final OA (§101, §102)
Filed: Nov 06, 2023
Examiner: NATNITHITHADHA, NAVIN
Art Unit: 3791
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: International Business Machines Corporation
OA Round: 1 (Non-Final)
Grant Probability: 71% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 4y 0m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 71% — above average (685 granted / 963 resolved; +1.1% vs TC avg)
Interview Lift: +30.9% — strong (grant rate with vs. without an interview, among resolved cases with an interview)
Typical timeline: 4y 0m avg prosecution; 45 applications currently pending
Career history: 1,008 total applications across all art units
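The headline figures above follow from simple arithmetic on the raw counts. A minimal sketch, assuming the allow rate is granted/resolved and the interview lift is a percentage-point difference between with-interview and without-interview grant rates (the tool's exact formulas are not published, and the 68.1% without-interview rate is back-computed from the displayed 99% and +30.9%):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a fraction of resolved cases."""
    return granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Interview lift as a percentage-point difference (assumed definition)."""
    return rate_with - rate_without

# Raw counts from the card above: 685 granted out of 963 resolved.
career = allow_rate(685, 963)       # ≈ 0.711, displayed as 71%
# A 99% with-interview rate implies ~68.1% without, if the lift is +30.9 points.
lift = interview_lift(0.99, 0.681)  # ≈ 0.309, displayed as +30.9%
print(f"{career:.1%} career allow rate, {lift:+.1%} interview lift")
```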

Statute-Specific Performance

§101: 12.6% (-27.4% vs TC avg)
§103: 30.9% (-9.1% vs TC avg)
§102: 29.2% (-10.8% vs TC avg)
§112: 17.0% (-23.0% vs TC avg)
Tech Center averages are estimates • Based on career data from 963 resolved cases
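Notably, all four per-statute deltas are consistent with a single Tech Center baseline of 40.0%, which can be back-computed from the displayed figures. A sketch (the metric's exact definition is not stated by the tool; the 40.0% baseline is inferred, not published):

```python
# Per-statute examiner rates and displayed deltas, taken from the table above.
examiner = {"§101": 12.6, "§103": 30.9, "§102": 29.2, "§112": 17.0}
deltas   = {"§101": -27.4, "§103": -9.1, "§102": -10.8, "§112": -23.0}

for statute, rate in examiner.items():
    # Implied TC average = examiner rate minus the displayed delta.
    implied_tc_avg = round(rate - deltas[statute], 1)
    print(f"{statute}: implied TC avg = {implied_tc_avg}%")  # 40.0 for all four
```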

Office Action

Rejections: §101, §102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

2. Claims 2, 9, and 16 are objected to because of the following informalities: In the claims, “receiving a response to the notification input by the user” lacks proper antecedent basis for “the notification input”, and should be amended to “receiving a response to the notification, wherein the response is input by the user”. Appropriate correction is required.

Claim Rejections - 35 USC § 101

3. 35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

4. Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception, i.e., an abstract idea, without significantly more.

Step 1 of the Patent Subject Matter Eligibility Guidance (see MPEP 2106.03): Claims 1-7 are directed to a “method”, which describes one of the four statutory categories of patentable subject matter, i.e., a process. Claims 8-14 are directed to a “device”, which describes one of the four statutory categories of patentable subject matter, i.e., a machine. Claims 15-20 are directed to a “computer program product”, which describes one of the four statutory categories of patentable subject matter, i.e., a machine.
Step 2A of the Revised Patent Subject Matter Eligibility Guidance (see MPEP 2106.04): Claims 1-20 recite the following mental process: determining, by the processor and in response to the detecting, that the inhibitory signal is aligned with an emotionally salient signal; generating, by the processor and in response to the determining, a notification; and … Based on broadest reasonable interpretation, these limitations are directed to receiving data and performing a mathematical operation, which can be done mentally or using pen and paper. This judicial exception is not integrated into a practical application because the additional limitations of “monitoring … neural signals collected from a user by a neural-computer interface (NCI) device” and “detecting … an inhibitory signal in the neural signals” in claims 1, 8, and 15 add insignificant pre-solution activity to the abstract idea that merely collects data to be used by the mental process. The additional limitation of “displaying, at an output device, the notification” in claims 1, 8, and 15 adds insignificant post-solution activity to the abstract idea, as it merely presents the result of the mental process of collecting and analyzing information, without more, and thus is an ancillary part of such collection and analysis. Furthermore, “by the processor” in claims 1, 8, and 15, “a memory” in claims 1 and 8, “a processor communicatively coupled to the memory, wherein the processor is configured to perform a method comprising:” in claim 8, and “A computer program product, the computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by a processor to cause a device to perform a method, the method comprising:” in claim 15, are merely parts of a computer to be used as a tool to perform the mental process.
Step 2B of the Patent Subject Matter Eligibility Guidance (see MPEP 2106.05): The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception, when considered separately and in combination. Analyzing the additional claim limitations individually, the additional limitations that are not directed to the mental process are “monitoring … neural signals collected from a user by a neural-computer interface (NCI) device” and “detecting … an inhibitory signal in the neural signals” in claims 1, 8, and 15. Such limitations are conventional and routine in the art (see Frank et al., U.S. Patent Application Publication No. 2016/0302711 A1, which is discussed below in the rejection under 35 U.S.C. 102), and add insignificant pre-solution activity to the abstract idea that merely collects data to be used by the abstract idea. The additional limitation of “displaying, at an output device, the notification” in claims 1, 8, and 15 adds insignificant post-solution activity to the abstract idea, as it merely presents the result of the mental process of collecting and analyzing information, without more, and thus is an ancillary part of such collection and analysis. Furthermore, “by the processor” in claims 1, 8, and 15, “a memory” in claims 1 and 8, “a processor communicatively coupled to the memory, wherein the processor is configured to perform a method comprising:” in claim 8, and “A computer program product, the computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by a processor to cause a device to perform a method, the method comprising:” in claim 15, are merely parts of a computer to be used as a tool to perform the mental process. The additional limitations of dependent claims 2-7, 9-14, and 16-20 are merely directed to and further narrow the scope of the mental process.
Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. Their collective functions merely provide computer implementation of the abstract idea using collected data without: improvement to the functioning of a computer or to any other technology or technical field; applying the mental process with, or by use of, a particular machine; effecting a transformation or reduction of a particular article to a different state or thing; applying or using the mental process in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment; or adding a specific limitation other than what is well-understood, routine, conventional activity in the field.

Claim Rejections - 35 USC § 102

5. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

6. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

7. Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Frank et al., U.S. Patent Application Publication No. 2016/0302711 A1 (“Frank”).
As to Claim 1, Frank teaches the following: A method (see “Some aspects of this disclosure include systems, methods, and/or computer programs that may be used to notify a user about a cause of emotional imbalance.” in Abstract), comprising: monitoring, by a processor (“processor”) 401 communicatively coupled to a memory (“memory”) 402 (see “Since it involves computer-executable modules (also referred to herein simply as “modules”), one more processors and memory may be required in order to realize the system. For example, a system similar to computer system 400 illustrated in FIG. 8, which includes processor 401 and memory 402, may be utilized in order to implement an embodiment of the system illustrated in FIG. 2.” in para. [0057]), neural signals (“a physiological signal of the user”) collected from a user by a neural-computer interface (NCI) device (“sensor” or “user interface”, which may be “brain-computer interface”) 102/404 (see “In one embodiment, the measurement is taken with the sensor 102, which is coupled to the user 101, and the measurement comprises at least one of the following values: a physiological signal of the user 101, and a behavioral cue of the user 101.” in para. [0059]; and see “Still continuing the example, the user interface 404 may include one or more of the following components: … a brain-computer interface.” in para. [0370]); detecting, by the processor 401 and based on the monitoring, an inhibitory signal (“affective response”) in the neural signals (see “In some embodiments, the sensor 102 is utilized to take a plurality of measurements of affective response of the user 101 during the day in which the measurement corresponding to the event was taken.” in para. [0061]; and see “In another example, a measurement of affective response may be taken during a contiguous stretch of time (e.g., brain activity measured using EEG over a period of one minute).” in para. 
[0241]); determining, by the processor 401 and in response to the detecting, that the inhibitory signal (“affective response”) is aligned with an emotionally salient signal (“state of emotional imbalance”) (see “In one example, a determination of a state of emotional imbalance is made based on an assumption that when the user 101 is in the state of emotional imbalance, the measurement of affective response corresponding to the event indicates that an extent, to which the user 101 felt a certain emotion, reaches a threshold. In this example, the certain emotion may be one or more of the following emotions: anger, contempt, disgust, distress, and fear. Thus, for example, if the user 101 is feeling at least a certain level of distress, the user 101 may be considered to be in a state of emotional imbalance. In another example, a determination of a state of emotional imbalance is made based on an assumption that when the user 101 is in the state of emotional imbalance, the measurement of affective response corresponding to the event indicates that a level of stress felt by the user 101 reaches a threshold.” in para. [0062]); generating, by the processor 401 and in response to the determining, a notification (see “In a similar fashion to it role in embodiments modelled according to FIG. 2, in embodiments modelled according to FIG. 5, the awareness module 916 is configured to notify the user 101 about the certain factor selected by the event analyzer module 914. Optionally, notifying the user 101 is done responsive to a determination generated by the emotional state analyzer module 910 that the predicted affective response of the user 101 indicates a certain emotional state. In one embodiment, the certain emotional state corresponds to a state of emotional balance. In another embodiment, the certain emotional state corresponds to a state of emotional imbalance.” in para. 
[0159]); and displaying, at an output device (“user interface”) 404 (see “Still continuing the example, the user interface 404 may include one or more of the following components: (i) an image generation device, such as a video display, an augmented reality system, a virtual reality system, and/or a mixed reality system, (ii) an audio generation device, such as one or more speakers, (iii) an input device, such as a keyboard, a mouse, a gesture based input device that may be active or passive, and/or a brain-computer interface.” in para. [0370]), the notification (see para. [0159]). As to Claims 2 and 3, Frank teaches the following: receiving a response to the notification input by the user, wherein the response comprises instructions to delete a portion of the neural signals (see “The approval to execute the computer program may be explicit, e.g., a user may initiate the execution of the program (e.g., by issuing a voice command, pushing an icon that initiates the program's execution, and/or issuing a command via a terminal and/or another form of a user interface with an operating system).” in para. [0303]). As to Claim 4, Frank teaches the following: wherein the NCI device is a wearable device (see “In another example, the sensor may be embedded in, and/or attached to, an item worn by the user, such as a glove, a shirt, a shoe, a bracelet, a ring, a head-mounted display, and/or helmet or other form of headwear.” in para. [0207]). As to Claim 5, Frank teaches the following: wherein the monitoring comprises monitoring a type of signal selected from the group consisting of brainwaves, hemodynamic response, event-related potential (ERP), skin conductance response (SRP), and cortical activity (see in para. [0210]-[0215]). 
As to Claim 6, Frank teaches the following: deleting a portion of the neural signals in response to the determining that the inhibitory signal is aligned with the emotionally salient signal (see “The approval to execute the computer program may be explicit, e.g., a user may initiate the execution of the program (e.g., by issuing a voice command, pushing an icon that initiates the program's execution, and/or issuing a command via a terminal and/or another form of a user interface with an operating system).” in para. [0303]). As to Claim 7, Frank teaches the following: wherein the deleting is carried out automatically based on stored user preferences (see para. [0303]). As to Claim 8, Frank teaches the following: A system (see “Some aspects of this disclosure include systems, methods, and/or computer programs that may be used to notify a user about a cause of emotional imbalance.” in Abstract), comprising: a memory (“memory”) 402; and a processor (“processor”) 401 communicatively coupled to the memory 402 (see “Since it involves computer-executable modules (also referred to herein simply as “modules”), one more processors and memory may be required in order to realize the system. For example, a system similar to computer system 400 illustrated in FIG. 8, which includes processor 401 and memory 402, may be utilized in order to implement an embodiment of the system illustrated in FIG. 2.” in para. [0057]), wherein the processor 401 is configured to perform a method comprising: monitoring, by the processor 401, neural signals collected from a user by a neural-computer interface (NCI) device (“sensor” or “user interface”, which may be “brain-computer interface”) 102/404 (see “In one embodiment, the measurement is taken with the sensor 102, which is coupled to the user 101, and the measurement comprises at least one of the following values: a physiological signal of the user 101, and a behavioral cue of the user 101.” in para. 
[0059]; and see “Still continuing the example, the user interface 404 may include one or more of the following components: … a brain-computer interface.” in para. [0370]); detecting, by the processor 401 and based on the monitoring, an inhibitory signal (“affective response”) in the neural signals (see “In some embodiments, the sensor 102 is utilized to take a plurality of measurements of affective response of the user 101 during the day in which the measurement corresponding to the event was taken.” in para. [0061]; and see “In another example, a measurement of affective response may be taken during a contiguous stretch of time (e.g., brain activity measured using EEG over a period of one minute).” in para. [0241]); determining, by the processor 401 and in response to the detecting, that the inhibitory signal (“affective response”) is aligned with an emotionally salient signal (“state of emotional imbalance”) (see “In one example, a determination of a state of emotional imbalance is made based on an assumption that when the user 101 is in the state of emotional imbalance, the measurement of affective response corresponding to the event indicates that an extent, to which the user 101 felt a certain emotion, reaches a threshold. In this example, the certain emotion may be one or more of the following emotions: anger, contempt, disgust, distress, and fear. Thus, for example, if the user 101 is feeling at least a certain level of distress, the user 101 may be considered to be in a state of emotional imbalance. In another example, a determination of a state of emotional imbalance is made based on an assumption that when the user 101 is in the state of emotional imbalance, the measurement of affective response corresponding to the event indicates that a level of stress felt by the user 101 reaches a threshold.” in para. 
[0062]); generating, by the processor 401 and in response to the determining, a notification (see “In a similar fashion to it role in embodiments modelled according to FIG. 2, in embodiments modelled according to FIG. 5, the awareness module 916 is configured to notify the user 101 about the certain factor selected by the event analyzer module 914. Optionally, notifying the user 101 is done responsive to a determination generated by the emotional state analyzer module 910 that the predicted affective response of the user 101 indicates a certain emotional state. In one embodiment, the certain emotional state corresponds to a state of emotional balance. In another embodiment, the certain emotional state corresponds to a state of emotional imbalance.” in para. [0159]); and displaying, at an output device (“user interface”) 404 (see “Still continuing the example, the user interface 404 may include one or more of the following components: (i) an image generation device, such as a video display, an augmented reality system, a virtual reality system, and/or a mixed reality system, (ii) an audio generation device, such as one or more speakers, (iii) an input device, such as a keyboard, a mouse, a gesture based input device that may be active or passive, and/or a brain-computer interface.” in para. [0370]), the notification (see para. [0159]). As to Claims 9 and 10, Frank teaches the following: receiving a response to the notification input by the user, wherein the response comprises instructions to delete a portion of the neural signals (see “The approval to execute the computer program may be explicit, e.g., a user may initiate the execution of the program (e.g., by issuing a voice command, pushing an icon that initiates the program's execution, and/or issuing a command via a terminal and/or another form of a user interface with an operating system).” in para. [0303]). 
As to Claim 11, Frank teaches the following: wherein the NCI device is a wearable device (see “In another example, the sensor may be embedded in, and/or attached to, an item worn by the user, such as a glove, a shirt, a shoe, a bracelet, a ring, a head-mounted display, and/or helmet or other form of headwear.” in para. [0207]). As to Claim 12, Frank teaches the following: wherein the monitoring comprises monitoring a type of signal selected from the group consisting of brainwaves, hemodynamic response, event-related potential (ERP), skin conductance response (SRP), and cortical activity (see in para. [0210]-[0215]). As to Claim 13, Frank teaches the following: deleting a portion of the neural signals in response to the determining that the inhibitory signal is aligned with the emotionally salient signal (see “The approval to execute the computer program may be explicit, e.g., a user may initiate the execution of the program (e.g., by issuing a voice command, pushing an icon that initiates the program's execution, and/or issuing a command via a terminal and/or another form of a user interface with an operating system).” in para. [0303]). As to Claim 14, Frank teaches the following: wherein the deleting is carried out automatically based on stored user preferences (see para. [0303]). As to Claim 15, Frank teaches the following: A computer program product, the computer program product comprising a computer readable storage medium (“computer-readable medium”) 403 having program instructions embodied therewith, the program instructions executable by a processor (“processor”) 401 to cause a device to perform a method (see “Some aspects of this disclosure include systems, methods, and/or computer programs that may be used to notify a user about a cause of emotional imbalance.” in Abstract; and see “Since it involves computer-executable modules (also referred to herein simply as “modules”), one more processors and memory may be required in order to realize the system. 
For example, a system similar to computer system 400 illustrated in FIG. 8, which includes processor 401 and memory 402, may be utilized in order to implement an embodiment of the system illustrated in FIG. 2.” in para. [0057]; and see “The computer 400 includes one or more of the following components: processor 401, memory 402, computer readable medium 403, user interface 404, communication interface 405, and bus 406.” in para. [0369]), the method comprising: monitoring, by the processor 401, neural signals (“a physiological signal of the user”) collected from a user by a neural-computer interface (NCI) device (“sensor” or “user interface”, which may be “brain-computer interface”) 102/404 (see “In one embodiment, the measurement is taken with the sensor 102, which is coupled to the user 101, and the measurement comprises at least one of the following values: a physiological signal of the user 101, and a behavioral cue of the user 101.” in para. [0059]; and see “Still continuing the example, the user interface 404 may include one or more of the following components: … a brain-computer interface.” in para. [0370]); detecting, by the processor 401 and based on the monitoring, an inhibitory signal (“affective response”) in the neural signals (see “In some embodiments, the sensor 102 is utilized to take a plurality of measurements of affective response of the user 101 during the day in which the measurement corresponding to the event was taken.” in para. [0061]; and see “In another example, a measurement of affective response may be taken during a contiguous stretch of time (e.g., brain activity measured using EEG over a period of one minute).” in para. 
[0241]); determining, by the processor 401 and in response to the detecting, that the inhibitory signal (“affective response”) is aligned with an emotionally salient signal (“state of emotional imbalance”) (see “In one example, a determination of a state of emotional imbalance is made based on an assumption that when the user 101 is in the state of emotional imbalance, the measurement of affective response corresponding to the event indicates that an extent, to which the user 101 felt a certain emotion, reaches a threshold. In this example, the certain emotion may be one or more of the following emotions: anger, contempt, disgust, distress, and fear. Thus, for example, if the user 101 is feeling at least a certain level of distress, the user 101 may be considered to be in a state of emotional imbalance. In another example, a determination of a state of emotional imbalance is made based on an assumption that when the user 101 is in the state of emotional imbalance, the measurement of affective response corresponding to the event indicates that a level of stress felt by the user 101 reaches a threshold.” in para. [0062]); generating, by the processor 401 and in response to the determining, a notification (see “In a similar fashion to it role in embodiments modelled according to FIG. 2, in embodiments modelled according to FIG. 5, the awareness module 916 is configured to notify the user 101 about the certain factor selected by the event analyzer module 914. Optionally, notifying the user 101 is done responsive to a determination generated by the emotional state analyzer module 910 that the predicted affective response of the user 101 indicates a certain emotional state. In one embodiment, the certain emotional state corresponds to a state of emotional balance. In another embodiment, the certain emotional state corresponds to a state of emotional imbalance.” in para. 
[0159]); and displaying, at an output device (“user interface”) 404 (see “Still continuing the example, the user interface 404 may include one or more of the following components: (i) an image generation device, such as a video display, an augmented reality system, a virtual reality system, and/or a mixed reality system, (ii) an audio generation device, such as one or more speakers, (iii) an input device, such as a keyboard, a mouse, a gesture based input device that may be active or passive, and/or a brain-computer interface.” in para. [0370]), the notification (see para. [0159]). As to Claim 16, Frank teaches the following: receiving a response to the notification input by the user, wherein the response comprises instructions to delete a portion of the neural signals (see “The approval to execute the computer program may be explicit, e.g., a user may initiate the execution of the program (e.g., by issuing a voice command, pushing an icon that initiates the program's execution, and/or issuing a command via a terminal and/or another form of a user interface with an operating system).” in para. [0303]). As to Claim 17, Frank teaches the following: wherein the NCI device is a wearable device (see “In another example, the sensor may be embedded in, and/or attached to, an item worn by the user, such as a glove, a shirt, a shoe, a bracelet, a ring, a head-mounted display, and/or helmet or other form of headwear.” in para. [0207]). As to Claim 18, Frank teaches the following: wherein the monitoring comprises monitoring a type of signal selected from the group consisting of brainwaves, hemodynamic response, event-related potential (ERP), skin conductance response (SRP), and cortical activity (see in para. [0210]-[0215]). 
As to Claim 19, Frank teaches the following: deleting a portion of the neural signals in response to the determining that the inhibitory signal is aligned with the emotionally salient signal (see “The approval to execute the computer program may be explicit, e.g., a user may initiate the execution of the program (e.g., by issuing a voice command, pushing an icon that initiates the program's execution, and/or issuing a command via a terminal and/or another form of a user interface with an operating system).” in para. [0303]). As to Claim 20, Frank teaches the following: wherein the deleting is carried out automatically based on stored user preferences (see para. [0303]).

Conclusion

8. Any inquiry concerning this communication or earlier communications from the examiner should be directed to NAVIN NATNITHITHADHA whose telephone number is (571)272-4732. The examiner can normally be reached Monday - Friday, 8:00 am - 4:00 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason M Sims, can be reached at 571-272-7540. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /NAVIN NATNITHITHADHA/Primary Examiner, Art Unit 3791 01/08/2026

Prosecution Timeline

Nov 06, 2023
Application Filed
Jan 08, 2026
Non-Final Rejection — §101, §102
Apr 03, 2026
Interview Requested
Apr 09, 2026
Applicant Interview (Telephonic)
Apr 10, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12569172
DEVICES, SYSTEMS, AND METHODS ASSOCIATED WITH ANALYTE MONITORING DEVICES AND DEVICES INCORPORATING THE SAME
2y 5m to grant • Granted Mar 10, 2026
Patent 12564329
Optical Device for Determining Pulse Rate
2y 5m to grant • Granted Mar 03, 2026
Patent 12562273
MEDICAL DEVICES AND METHODS
2y 5m to grant • Granted Feb 24, 2026
Patent 12555404
DISPLAY DEVICE HAVING BIOMETRIC FUNCTION AND OPERATION METHOD THEREOF
2y 5m to grant • Granted Feb 17, 2026
Patent 12543976
SYSTEM FOR MONITORING BODY CHEMISTRY
2y 5m to grant • Granted Feb 10, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 71%
With Interview: 99% (+30.9%)
Median Time to Grant: 4y 0m
PTA Risk: Low
Based on 963 resolved cases by this examiner. Grant probability is derived from the career allow rate.
