Prosecution Insights
Last updated: April 19, 2026
Application No. 17/855,947

MULTIMODAL USER EXPERIENCE DEGRADATION DETECTION

Final Rejection — §102, §103

Filed: Jul 01, 2022
Examiner: BUTLER, SARAI E
Art Unit: 2114
Tech Center: 2100 — Computer Architecture & Software
Assignee: Intel Corporation
OA Round: 3 (Final)
Grant Probability: 88% (Favorable)
Expected OA Rounds: 4-5
Median Time to Grant: 2y 6m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 88% — above average (1008 granted / 1145 resolved; +33.0% vs TC avg)
Interview Lift: +10.7% — moderate lift, measured on resolved cases with interview
Typical Timeline: 2y 6m average prosecution; 13 applications currently pending
Career History: 1158 total applications across all art units

Statute-Specific Performance

§101: 8.7% (-31.3% vs TC avg)
§103: 50.4% (+10.4% vs TC avg)
§102: 16.9% (-23.1% vs TC avg)
§112: 13.0% (-27.0% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 1145 resolved cases
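The four deltas above all imply the same Tech Center baseline. A minimal sketch (assuming the panel's displayed values and that each delta is simply the examiner's rate minus the TC average) recovers it:

```python
# Hypothetical consistency check of the statute-specific panel above.
# Each entry: (examiner success rate %, delta vs. Tech Center average %).
# These numbers are copied from the dashboard; the implied TC baseline
# for each statute is rate - delta.
stats = {
    "101": (8.7, -31.3),
    "103": (50.4, +10.4),
    "102": (16.9, -23.1),
    "112": (13.0, -27.0),
}

for statute, (rate, delta) in stats.items():
    baseline = round(rate - delta, 1)
    print(f"\u00a7{statute}: examiner {rate}% vs TC avg {baseline}%")
```

Every statute resolves to the same ~40.0% Tech Center average, which suggests the dashboard measures all four deltas against a single TC-wide baseline rather than per-statute baselines.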

Office Action

Rejection bases: §102, §103
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This is in response to Application 17/855,947 filed on July 1, 2022, in which Claims 1-25 are presented for examination.

Status of Claims

Claims 1-25 are pending, of which claims 1-3, 5, 7, 8, 13-15, 17, 19, 20, 22 and 25 are rejected under §102. Claims 4, 6, 9-12, 16, 18, 21, 23 and 24 are rejected under §103.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claim(s) 1-3, 5, 7, 8, 13-15, 17, 19, 20, 22 and 25 is/are rejected under 35 U.S.C. 102 as being anticipated by Sasturkar (US Patent Application 2015/0033086).
Claim 1, Sasturkar teaches one or more non-transitory computer-readable storage media storing computer-executable instructions that, when executed, cause one or more processor units of a computing device to: detect, by a computing system, a user experience degradation event based on one or more system state vectors and one or more user interaction state vectors (View Sasturkar ¶ 23, 25, 38, 39; detect anomaly in network, network monitoring), individual of the one or more system state vectors to represent a state of the computing system at a point in time (View Sasturkar ¶ 26; snapshot) and individual of the one or more user interaction state vectors to represent a state of user interaction with the computing system at a point in time (View Sasturkar ¶ 45; user behavior observations); and classify, by the computing system, a root cause of the user experience degradation event based on the user experience degradation event, the one or more system state vectors, and the one or more user interaction state vectors (View Sasturkar ¶ 26, 65; determine root cause).

Claim 15 is the method corresponding to the media of Claim 1 and is therefore rejected for the same reasons set forth in the rejection of Claim 1.

Claim 20 is the apparatus corresponding to the media of Claim 1 and is therefore rejected for the same reasons set forth in the rejection of Claim 1.

Claim 2, most of the limitations of this claim have been noted in the rejection of Claim 1. Sasturkar further teaches the computer-executable instructions further cause the one or more processor units to detect the user experience degradation event is performed by a degradation detection network, the degradation detection network being a neural network (View Sasturkar ¶ 39; machine learning).

Claim 3, most of the limitations of this claim have been noted in the rejection of Claim 1.
Sasturkar further teaches the computer-executable instructions further cause the one or more processor units to generate the one or more system state vectors based on system data (View Sasturkar ¶ 26; system state), the system data comprising telemetry information provided by one or more of one or more integrated circuit components of the computing system, an operating system executing on the computing system, and one or more applications executing on the computing system (View Sasturkar ¶ 26, 35; various resource metrics).

Claim 5, most of the limitations of this claim have been noted in the rejection of Claim 1. Sasturkar further teaches the one or more system state vectors are generated based on system data by a system state attention network, the system state attention network being a neural network (View Sasturkar ¶ 39; machine learning).

Claim 7, most of the limitations of this claim have been noted in the rejection of Claim 1. Sasturkar further teaches the one or more user interaction state vectors are generated based on user interaction data by a user interaction fusion network, the user interaction fusion network being a neural network (View Sasturkar ¶ 39; machine learning).

Claim 8, most of the limitations of this claim have been noted in the rejection of Claim 1. Sasturkar further teaches to detect the user experience degradation event and to classify the root cause of the user experience degradation event is performed by the computing system in real-time (View Sasturkar ¶ 26, 49; real-time).

Claim 17 is the method corresponding to the media of Claim 8 and is therefore rejected for the same reasons set forth in the rejection of Claim 8.

Claim 22 is the apparatus corresponding to the media of Claim 8 and is therefore rejected for the same reasons set forth in the rejection of Claim 8.

Claim 13, most of the limitations of this claim have been noted in the rejection of Claim 1.
Sasturkar further teaches the one or more processor units to annotate the one or more user interaction state vectors with user experience degradation information based on user-supplied information (View Sasturkar ¶ 32, 45; user feedback).

Claim 14, most of the limitations of this claim have been noted in the rejection of Claim 13. Sasturkar further teaches to detect the user experience degradation event is performed by a degradation detection network (View Sasturkar ¶ 25; detect anomaly in network, network monitoring), and the computer-executable instructions further cause the one or more processor units to train the degradation detection network based on the one or more system state vectors and the annotated one or more user interaction state vectors (View Sasturkar ¶ 39; machine learning).

Claim 19 is the method corresponding to the media of Claim 14 and is therefore rejected for the same reasons set forth in the rejection of Claim 14.

Claim 25 is the apparatus corresponding to the media of Claim 14 and is therefore rejected for the same reasons set forth in the rejection of Claim 14.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sasturkar (US Patent Application 2015/0033086) in view of Patil (US Patent Application 2018/0095814).

Claim 4, most of the limitations of this claim have been noted in the rejection of Claim 1. Sasturkar does not explicitly teach to generate the one or more system state vectors based on system data, the system data comprising computing system configuration information. However, Patil teaches to generate the one or more system state vectors based on system data, the system data comprising computing system configuration information (View Patil ¶ 4, 55; system configuration data). It would have been obvious to one of ordinary skill in the art before the effective filing date to modify Sasturkar to generate the one or more system state vectors based on system data, the system data comprising computing system configuration information, since it is known in the art that configuration data can be detected (View Patil ¶ 4, 55). Such modification would have allowed configuration data to be used to determine whether the system is having a degradation event.

Claim(s) 6, 16 and 21 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sasturkar (US Patent Application 2015/0033086) in view of Budnik (US Patent Application 2013/0159774).
Claim 6, most of the limitations of this claim have been noted in the rejection of Claim 1. Sasturkar does not explicitly teach to generate the one or more user interaction state vectors based on user interaction data, the user interaction data comprising information indicating user interaction with one or more of a mouse, keypad, keyboard, and touchscreen. However, Budnik teaches to generate the one or more user interaction state vectors based on user interaction data, the user interaction data comprising information indicating user interaction with one or more of a mouse, keypad, keyboard, and touchscreen (View Budnik ¶ 31; user interaction). It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the combination of teachings to generate the one or more user interaction state vectors based on user interaction data, the user interaction data comprising information indicating user interaction with one or more of a mouse, keypad, keyboard, and touchscreen, since it is known in the art that user interaction can be detected (View Budnik ¶ 31). Such modification would have allowed user interaction to be used to determine whether the system is having a degradation event.

Claim 16 is the method corresponding to the media of Claim 6 and is therefore rejected for the same reasons set forth in the rejection of Claim 6.

Claim 21 is the apparatus corresponding to the media of Claim 6 and is therefore rejected for the same reasons set forth in the rejection of Claim 6.

Claim(s) 9 and 23 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sasturkar (US Patent Application 2015/0033086) in view of Gerstl (US Patent Application 2016/0092290).

Claim 9, most of the limitations of this claim have been noted in the rejection of Claim 1. Sasturkar does not explicitly teach the classified root cause is a hardware responsiveness issue, a software responsiveness issue, or a network responsiveness issue.
However, Gerstl teaches the classified root cause is a hardware responsiveness issue, a software responsiveness issue, or a network responsiveness issue (View Gerstl ¶ 17; root cause). It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the combination of teachings such that the classified root cause is a hardware responsiveness issue, a software responsiveness issue, or a network responsiveness issue, since it is known in the art that responsiveness can be detected (View Gerstl ¶ 17). Such modification would have allowed responsiveness to be the root cause of a system error.

Claim 23 is the apparatus corresponding to the media of Claim 9 and is therefore rejected for the same reasons set forth in the rejection of Claim 9.

Claim(s) 10, 18 and 24 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sasturkar (US Patent Application 2015/0033086) in view of Ngai (US Patent Application 2016/0004584).

Claim 10, most of the limitations of this claim have been noted in the rejection of Claim 1.
Sasturkar does not explicitly teach to display, on a display, information indicating one or more of a root cause of the user experience degradation event, a severity of the user experience degradation event, a duration of the user experience degradation event, a start time of the user experience degradation event, an end time of the user experience degradation event, and system data and/or user interaction data associated with a time prior to, during, and/or after the user experience degradation event.

However, Ngai teaches to display, on a display, information indicating one or more of a root cause of the user experience degradation event, a severity of the user experience degradation event, a duration of the user experience degradation event, a start time of the user experience degradation event, an end time of the user experience degradation event, and system data and/or user interaction data associated with a time prior to, during, and/or after the user experience degradation event (View Ngai ¶ 190; display screen). It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the combination of teachings to display such information, since it is known in the art that errors can be displayed (View Ngai ¶ 190). Such modification would have allowed a system error to be displayed.

Claim 18 is the method corresponding to the media of Claim 10 and is therefore rejected for the same reasons set forth in the rejection of Claim 10.
Claim 24 is the apparatus corresponding to the media of Claim 10 and is therefore rejected for the same reasons set forth in the rejection of Claim 10.

Claim(s) 11 and 12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sasturkar (US Patent Application 2015/0033086) in view of Cook (US Patent Application 2014/0298093).

Claim 11, most of the limitations of this claim have been noted in the rejection of Claim 1. Sasturkar does not explicitly teach to annotate the one or more user interaction state vectors with user experience degradation information. However, Cook teaches to annotate the one or more user interaction state vectors with user experience degradation information (View Cook ¶ 44; store user interaction). It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the combination of teachings to annotate the one or more user interaction state vectors with user experience degradation information, since it is known in the art that user error information can be identified (View Cook ¶ 44). Such modification would have allowed user error information to be labeled.

Claim 12, most of the limitations of this claim have been noted in the rejection of Claim 11. Sasturkar further teaches to generate the one or more user interaction state vectors based on user interaction data, the user interaction data comprising information indicating user interaction with one or more of a mouse, keypad, keyboard, and touchscreen (View Sasturkar ¶ 45; user behavior observations).
Cook further teaches wherein to annotate the one or more user interaction state vectors with user experience degradation information is performed in response to the computing system determining that user interaction data indicates a jiggle of a mouse input device, a keyboard key has been pressed more than a threshold number of times within a time period, a power button has been held down longer than a threshold number of seconds, one or more restarts of the computing system, and/or a disconnection of the computing system from an external power supply (View Cook ¶ 44; store user interaction).

Response to Arguments

Applicant's arguments filed November 17, 2025 have been fully considered, but they are not persuasive.

On pages 14-16, Applicant argues that Sasturkar does not teach "detect, by a computing system, a user experience degradation event based on one or more system state vectors and one or more user interaction state vectors", in Claims 1, 15 and 20. Examiner respectfully disagrees, because Sasturkar teaches that anomalies refer to any unexpected changes in a data stream. The technology disclosed can be applied to correlating anomalies in data streams that exist in a variety of applications including information technology (IT) systems, telecommunications systems, financial systems, security trading, banking, business intelligence, marketing, mining, energy, etc. One implementation of the technology disclosed relates to IT systems operations. IT operational data refers to any data that is produced by any human, system (hardware or software), machine, application, software, or component within an IT environment. Some examples of this operational data include metrics (server, network, database, services, hypervisor), alerts, logs, errors, software pushes, or application topology (Paragraph 23). In Paragraph 24, Sasturkar teaches that unexpected changes in operational data, i.e. anomalies, are important for a number of reasons, such as understanding the health of the system, alerting for system failures, or identifying the cause and symptoms for failures.

In Paragraphs 38 and 39, Sasturkar teaches an assembly engine periodically retrieves "network events" data from application servers in a network. A baseline is then automatically constructed that represents the normal operating range for the network traffic and stored in a baseline data store. In one example, performance metrics such as packets per second and connections per second are collected every two minutes to monitor the network traffic during business hours only (e.g. 9:00 am to 5:00 pm). In this example, the assembly engine collects performance metrics that were collected during business hours over a sliding window of time, such as a week or month. These extracted performance metrics are the raw data that represent the baseline of network traffic data over the sliding window of time. The assembly engine then performs statistical analysis on the raw data to generate a representation of the normal operating range of network traffic during the sliding window of time. In one implementation, anomalous performances are detected using threshold-based techniques to flag outliers. According to such an implementation, the detection engine detects anomalies by comparing values of extracted performance metrics with previously calculated current normal thresholds for the performance metrics. If the values are outside their performance metric's normal limits, i.e. baseline, anomalies are detected and stored as anomalous instance data. In some implementations, values of extracted performance metrics are compared to service level thresholds that represent the level at which a defined service level for a performance metric is out of bounds. When the values of extracted performance metrics reach or exceed corresponding service level thresholds, service level exceptions are triggered.
According to other implementations of the technology disclosed, anomalies are detected using at least one of, or a combination of, statistical anomaly detection (unsupervised anomaly detection like multivariate auto regression analysis), data mining, or machine learning based techniques (supervised anomaly detection, semi-supervised anomaly detection). Therefore, a user experience degradation event (network event data) is detected based on the system data and user interaction data (IT operational data refers to any data that is produced by any human, system (hardware or software)).

On page 16, Applicant argues that Sasturkar does not teach "to detect the user experience degradation event is performed by a degradation detection network, the degradation detection network being a neural network", in Claim 2. Examiner respectfully disagrees, because Sasturkar teaches that anomalies are detected using at least one of, or a combination of, statistical anomaly detection (unsupervised anomaly detection like multivariate auto regression analysis), data mining, or machine learning based techniques (supervised anomaly detection, semi-supervised anomaly detection), in Paragraph 39. Therefore, a machine learning technique is used to detect anomalies.

Prior Art Made of Record

Salajegheh et al. (U.S. Patent Application No. 2016/0103996) teaches that the computing device processor may then monitor the identified factors/features to collect behavior information, generate behavior vectors that characterize the collected behavior information, generate user-specific classifier models that test or evaluate the identified factors/features, and apply the generated behavior vectors to the user-specific classifier models to intelligently determine whether a behavior, software application, or process of the device is non-benign (e.g., malicious, performance degrading, etc.).

Conclusion

THIS ACTION IS MADE FINAL.
Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SARAI E BUTLER, whose telephone number is (571) 270-3823. The examiner can normally be reached 8 am to 4 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ashish Thomas, can be reached at 571-272-0631. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SARAI E BUTLER/
Primary Examiner, Art Unit 2114

Prosecution Timeline

Jul 01, 2022: Application Filed
Aug 18, 2022: Response after Non-Final Action
Apr 08, 2023: Non-Final Rejection — §102, §103
Apr 17, 2023: Examiner Interview (Telephonic)
Apr 17, 2023: Examiner Interview Summary
Jul 12, 2025: Non-Final Rejection — §102, §103
Nov 17, 2025: Response Filed
Feb 03, 2026: Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602277: MANAGING DATA PROCESSING SYSTEM FAILURES USING HIDDEN KNOWLEDGE FROM PREDICTIVE MODELS FOR FAILURE RESPONSE GENERATION (granted Apr 14, 2026; 2y 5m to grant)
Patent 12602297: PROCESSOR AND METHOD OF DETECTING SOFT ERROR USING THE SAME (granted Apr 14, 2026; 2y 5m to grant)
Patent 12602280: ENTITY ASSIGNMENT IN AUTOMATED ISSUE RESOLUTION (granted Apr 14, 2026; 2y 5m to grant)
Patent 12602288: MEMORY SYSTEMS AND OPERATING METHODS THEREOF (granted Apr 14, 2026; 2y 5m to grant)
Patent 12596606: FILE PATH TRACING AND BEHAVIORAL REMEDIATION ON PATH REVISION (granted Apr 07, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 4-5
Grant Probability: 88%
With Interview: 99% (+10.7%)
Median Time to Grant: 2y 6m
PTA Risk: High
Based on 1145 resolved cases by this examiner. Grant probability derived from career allow rate.
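The headline figures above follow directly from the examiner's career counts. A minimal sketch (assuming the counts shown in the Examiner Intelligence panel, and assuming the interview lift is added as percentage points before rounding) reproduces them:

```python
# Reproducing the dashboard's projection figures from career counts.
# The counts are taken from the panels above; the rounding convention
# and the additive treatment of the interview lift are assumptions.
granted = 1008    # applications granted by this examiner
resolved = 1145   # resolved cases (granted + abandoned/rejected)
pending = 13      # currently pending applications

allow_rate = granted / resolved            # career allow rate, ~0.8803
grant_prob = round(allow_rate * 100)       # displayed grant probability
interview_lift = 10.7                      # reported lift, percentage points
with_interview = round(grant_prob + interview_lift)

print(f"Grant probability: {grant_prob}%")           # 88%
print(f"With interview: {with_interview}%")          # 99%
print(f"Total applications: {resolved + pending}")   # 1158
```

The derived values match the dashboard: 1008/1145 rounds to 88%, 88 + 10.7 rounds to 99%, and 1145 resolved plus 13 pending gives the 1158 total applications reported in the career history.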
