DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Notice to Applicant
The following is a Final Office action. In response to Examiner’s Non-Final Rejection of 10/20/25, Applicant, on 1/20/26, amended the claims. Claims 1-20 are pending in this application and have been rejected below.
Response to Amendment
Applicant’s amendments are acknowledged.
Reasons for Subject Matter Eligibility under 35 USC 101
Claim 1 overcomes the 101 rejections because the claim now recites: a server storing the integrated support application having the messaging extension and the support ticketing extension, the integrated support application configured to: process the message via machine intelligence processing to determine a technical problem identified in the message; responsive to determining the technical problem, perform an action determined by the machine intelligence processing trained based on previous support interactions to mitigate the technical problem, the action comprising sending a signal to a device associated with the user to cause at least one of the device to power cycle, a network connection of the device to reset, an application executing on the device to restart, or a cache of the device to clear; and responsive to the action resolving the technical problem. When viewing the claim as a whole, these limitations, combined with the earlier limitations, are viewed as a practical application under Step 2A, Prong Two, as the claim improves another technology when viewing all of the limitations listed above (see MPEP 2106.05(a)) and/or is viewed as using a judicial exception in a meaningful way under MPEP 2106.05(e). The same reasons also apply to independent claims 8 and 14, which have similar limitations. The remaining claims all depend from claims 1, 8, and 14.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 2 and 4 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 2 recites the limitation "a machine intelligence process". There is insufficient antecedent basis for this limitation in the claim, as claim 1 now recites “process the message via machine intelligence processing to determine a technical problem identified in the message.” It is unclear whether there are one or two different kinds of “machine intelligence.” It appears claim 2 is referring to the same “machine intelligence processing” as in claim 1. Claim 2 thus appears duplicative, and Examiner suggests cancelling claim 2.
Claim 4 recites the limitation "machine intelligence processing". There is insufficient antecedent basis for this limitation in the claim, as claim 1 now recites “responsive to determining the technical problem, perform an action determined by the machine intelligence processing.” It is unclear whether there are one or two different kinds of “machine intelligence.” It appears claim 4 is referring to the same “machine intelligence processing” as in claim 1. Claim 4 thus appears duplicative, and Examiner suggests cancelling claim 4.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-11, 13-17, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Das (US 2022/0398598) and Wang (US 2023/0129123) and Eberlein (US 2021/0089384).
Concerning claim 1, Das discloses:
A system for providing user technical support via an integrated support application (Das – see FIG. 3, par 42 - FIG. 3 shows a block diagram illustrating an architecture 300 in accordance with an example embodiment. While various examples provided herein are described in the context of assisting customers (e.g., end users 302) in connection with troubleshooting product/service issues through live chat via a chatbot and/or live human agents (e.g., agents 301) associated with a product support system) having a messaging extension integrating messaging functionality into the integrated support application and a support ticketing extension integrating support ticketing into the integrated support application (Das – see par 26 - Case records may include, among other fields, a title, a description, a subject, a unique identifier (e.g., a case number); see par 43 - Cloud 320a may be a public cloud through which a Customer Relationship Management (CRM) application 330 (e.g., Salesforce, Microsoft Dynamics 365, and Zoho CRM) and an associated contact center solution 335 are delivered as a service to the agents to, for example, facilitate handling and documentation of inbound communications from the customers of a vendor of a particular product or service. CRM application 330 includes a case records database 331 and an Application Programming Interface (API) 332. The case records database 331 may store historical case data, including case notes input in the form of text-based information by call center agents, relating to support issues raised by customers. Customers may communicate with the call center agents through chat and/or other communication mechanisms (e.g., voice calls or video calls) supported by the contact center solution 335. Depending upon the particular implementation, API 332 may be a REST API through which interactions between the CRM application 330 and other external systems may be handled), the system comprising:
a data store (Das – see par 45, FIG. 3 - the topic modeling engine 340 may load a subset of case records from the case records database 331 via API 332 into an input case data database 341 to facilitate performance of LDA at one or more levels of granularity. In one embodiment, LDA is performed on the extracted case data at the component level (e.g., a power supply, battery, or a fan of a server), the product line level (e.g., HPE ProLiant ML servers produced by Hewlett Packard Enterprise Company), and/or the issue level (e.g., hard drive failure, how to update the power supply firmware). The identified combination of words (or a revised category label provided by an SME) may then be used for the issue category, persisted in the supported issue categories database 341 and represented by a single vector upon completion by applying the corresponding product line specific word association model from the product line specific word association models 346.);
a server storing the integrated support application having the messaging extension and the support ticketing extension, the integrated support application configured (Applicant’s paragraph [0030] as filed states “In some examples, the messaging application 114 is organized into one or more groups, channels, threads, or other logical separations where each logical separation may correspond to a particular area (e.g., application, service, device, etc.) for technical support. In various examples, the messaging application 114 may be Slack, Microsoft Teams, Zoom, or any other commercially available, or privately developed, application having messaging functionality.” Paragraph [0031] as filed states “The messaging extension 118 may be an interface between the integrated support application 112 and the messaging application 114. For example, the integrated support application 112 may read, receive, or intercept messages from the messaging application 114 via the messaging extension 118 and may write, deliver, or otherwise provide messages to the messaging application 114 via the messaging extension 118.”
Das discloses the limitations based on broadest reasonable interpretation in light of the specification – see par 25 - as used herein “call center” is intended to encompass additional channels and forms of communication including, but not limited to, live chat. see par 26 - the phrase “case record” is intended to broadly refer to a unit of case data maintained or otherwise utilized by a call center. Case records may include, among other fields, a title, a description, a subject, a unique identifier (e.g., a case number); see par 28 - The terms “component”, “module”, “system,” and the like as used herein are intended to refer to a computer-related entity. Such a component, module, or system may be in the form of a software-executing general-purpose processor, hardware, firmware or a combination thereof; an application running on a server and the server can be a component; components may communicate via a remote processes, e.g. data from distributed system across a network; see par 43, FIG. 3- the architecture 300 includes one or more clouds 120a-b, which may represent private or public clouds. Cloud 320a may be a public cloud through which a Customer Relationship Management (CRM) application 330 (e.g., Salesforce, Microsoft Dynamics 365, and Zoho CRM) and an associated contact center solution 335 are delivered as a service to the agents to, for example, facilitate handling and documentation of inbound communications from the customers of a vendor of a particular product or service. According to one embodiment, CRM application 330 includes a case records database 331 and an Application Programming Interface (API) 332. Depending upon the particular implementation, API 332 may be a REST API through which interactions between the CRM application 330 and other external systems, including those associated with the chat environment, may be handled.)
to:
receive, from a messaging application and via an application programming interface of the messaging application, a message from a user (Das – see par 43 - In the context of the present example, the architecture 300 includes one or more clouds 120a-b, which may represent private or public clouds. Cloud 320a may be a public cloud through which a Customer Relationship Management (CRM) application 330 and an associated contact center solution 335 are delivered as a service to the agents to, for example, facilitate handling and documentation of inbound communications from the customers of a vendor of a particular product or service. For example, customers may communicate with the call center agents through chat and/or other communication mechanisms (e.g., voice calls or video calls) supported by the contact center solution 335. Depending upon the particular implementation, API 332 may be a REST API through which interactions between the CRM application 330 and other external systems, including those associated with the chat environment, may be handled), wherein the message initiates a technical support session (Das – see par 29 - In the context of various embodiments described herein, a chatbot (e.g., chatbot 120) receives live chat text 111 from a customer (e.g., customer 110). The live chat text 111 may include a description, in the customer's words, of a problem or issue being experienced by the customer with a product line of a vendor. The chatbot, in turn, may initiate guided problem solving 122 based on a troubleshooting workflow within a set of defined troubleshooting workflows 140. see par 43 - facilitate handling and documentation of inbound communications from the customers of a vendor of a particular product or service. The case records database 331 may store historical case data, including case notes input in the form of text-based information by call center agents, relating to support issues raised by customers. 
The case records database 331 may store chat transcripts of live chat sessions. See par 44 - In the context of the present example, cloud 320b includes a chatbot 345 (which may correspond to chatbots 120 and 240);
see also Wang – see par 25 - the assigned IT agent may then open the unresolved trouble ticket from trouble ticket database 145 for determining the issue type and root cause of the IT issue described in that ticket towards providing any necessary remediation for resolving the open trouble ticket. Concurrently with opening the trouble ticket and until the IT agent’s trouble ticket session has ended or until the trouble ticket is closed, trouble ticket management system 140 may monitor the actions of the IT agent with regards to the open trouble ticket in step 235. This monitoring may include generating session events identifying the actions taken by the IT agent with regards to the trouble ticket as well as identifying the various types of data accessed by the IT agent; see par 32, FIG. 3 - Databases 306 may include a trouble ticket database 310, a log file database 320, a telemetry database 325, a session event database 330, an aggregated session event database 335, and a label database 340. Application tools 307 may include a set of IT tools 350 including a log viewer 355 and a telemetry viewer 356, a session event generator 360, a session event aggregator 365, a label generator 370 utilizing predetermined criteria 342, a modeling system 380 and an implemented trouble ticket model 385.);
process the message via machine intelligence processing (Das – see par 15 - identifying an appropriate scope of product issue categories to be supported by the chatbot (e.g., those that are self-solvable by customers), training AI classification models (without undue upfront manual labeling of training data) to allow the chatbot to accurately identify customers' intent, and mapping of the identified intent to one of the supported product issue categories; see par 37 - intent identification may first involve filtering the historical case data to create a subset of the product support cases that are both remotely resolvable and non-critical. A neural network classification model may be trained with pre-labeled data for fields indicative of criticality and remote resolvability. Then, a series of AI (artificial intelligence) classification tasks may be performed to determine criticality and remote resolvability for the entirety of the historical case data, thereby allowing the desired subset of historical product support cases to be identified;
see also Eberlein par 18 - the described solution takes into account system development from data collection for use by machine learning of an issue resolution proposal system to fully automate an issue resolution system) to determine a technical problem identified in the message (Das – see par 30 - The troubleshooting workflows 140 may correspond to product issue categories 130, representing the universe of those product issue categories in the scope of the chatbot. The appropriate troubleshooting workflow may be identified by the chatbot 120 mapping the live chat text 121 to a particular product issue category within the supported product issue categories 130; see par 31 - the troubleshooting workflows 140 are guided troubleshooting decision trees designed with the assistance of product support subject matter experts (SMEs) and with reference to product manuals, and/or support knowledge articles; example of intent discovery and acquisition that may be performed by a combination of human SMEs and machine learning approaches to produce the troubleshooting workflows 140 is described further below with reference to FIG. 2. see par 36 - intent discovery and acquisition 210 may include activities performed by a combination of human SMEs and machine-learning processes. In one embodiment, intent identification processing may be performed to address issues associated with interpreting free text problem descriptions from customers);
responsive to determining the technical problem, perform an action determined by the machine intelligence processing trained based on previous support interactions (See also Eberlein – see par 31 - each of the one or more proposed actions 122 can be based on collected data associated with past issue resolutions. see par 32 - The learning system 202 can be configured to receive input on the proposed action 224 (for example, action changes proposed by operator), to process the (modified) proposed action 224 and to monitor process results 218. The issue resolution data monitored by the learning system 202 can be enriched by extracted problem features 222 and action parameters 226 for the resolving actions of issues processed by the proposal system 204.; see par 45 - At 318, an outcome (for example, success or failure) of the solution is determined.) to mitigate the technical problem (Dependent claim 3 states the action can be giving “written documentation” to a user; see also Applicant’s [0018] as filed “the bot may act based on the message. For example, the bot may identify previous communications occurring in the messaging application which may be useful in providing technical support to the user. The bot may provide a link or other reference to identify the previous communications to the user. In another example, the bot may identify literature, such as operating instructions, support documentation, technical articles, website (such as support forums or message boards) that may be useful in providing technical support to the user.”; [0033] as filed “documentation, support documents, or other written materials related to the technical problem or which may be useful in mitigating, resolving, or otherwise addressing the technical problem.”).
Das discloses the limitations based on broadest reasonable interpretation in light of the specification – See par 15 - training AI classification models (without undue upfront manual labeling of training data) to allow the chatbot to accurately identify customers' intent, and mapping of the identified intent to one of the supported product issue categories; see par 38 - Based on the limited set of product issue categories to be supported by the chatbot for each product line at issue, chatbot-specific approved workflows may be created for each issue category within the given product line with input from appropriate members of the product engineering teams. Once reviewed and approved by appropriate stakeholders, the troubleshooting workflows may be codified and made accessible for use by the chatbot at run-time, for example, via an Application Programming Interface (API) (e.g., a Representational State Transfer (REST) API). See par 41 - In the context of the present example, appropriate interfaces may be made available for use by the chatbot to check a customer's entitlement to support for the product at issue, transfer the case to a live agent, create a case within a CRM application, and/or assist the customer self-solve the product issue by returning knowledge articles resulting from searching a product support knowledge article database (transferring to a live agent, returning knowledge articles disclose “actions”) based on the customer defined issue text and information identifying the product line at issue; see also par 73 - At block 530, the customer issue description (e.g., a numerical vector output by the product line specific word association model) is mapped by a mapping engine (e.g., mapping engine 350) to a supported issue category. For example, an intermediate classification model (e.g., a product specific LSTM model trained in accordance with the supervised learning approach described with reference to FIG. 4) may attempt to match the representation of the customer issue description to a representation of a supported issue category; see par 85 - a second stage of operations in which unsupervised learning may be used to incrementally identify new product issue categories.
See also Wang – see par 27 - In step 250, upon determining the issue type and root cause, the IT agent may determine an appropriate resolution, including any needed remediation, of the IT issue. In step 255, the IT agent may implement or cause to be implemented the appropriate remediation to resolve the determined issue type and root cause. This may be implemented after performing some tests to confirm that the remediation solves the issue type and root cause and does not create new IT issues.)
Das discloses that there are product issues categories supported by the chatbot (See par 15) where there are a universe of issue categories in the scope of the chatbot (See par 30). Wang discloses identifying prior resolved trouble tickets with similar symptoms and corresponding IT assets along with remediation steps and confirming predictions based on previously resolved trouble tickets (See par 62).
Eberlein discloses:
responsive to determining the technical problem, perform an action determined by the machine intelligence processing trained based on previous support interactions to mitigate the technical problem “the action comprising sending a signal to a device associated with the user to cause at least one of the device to power cycle, a network connection of the device to reset, an application executing on the device to restart, or a cache of the device to clear; responsive to the action resolving the technical problem and without user action” (Eberlein – see FIG. 3, par 37 - At 302, an incident report is received. The incident report can include data associated with one or more identified software artifacts, software modules, and configurations of the software modules. At 308, the features and parameters are processed to retrieve, from a database, a set of solutions that were previously executed to resolve associated issues. The set of solutions can be processed to select a solution from the set of solutions to resolve the issue. see par 44 - At 316, the solution is implemented. Examples for implementations of solutions can include: an action to check compliance with constraints and their resolution, unlock of user or a user password, issuing or retrieving a new certificate, update (for example, scale-up/scale-out) instances to overcome resource shortages, increase configuration parameters (for example, number of connections, cache size, memory size, etc.), remove locks, deploy a patch (for example, modification of a system), or force restart a service; see par 45, FIG. 3 - At 318, an outcome (for example, success or failure) of the solution is determined.).
Das, Wang, and Eberlein disclose:
responsive to the action resolving the technical problem and without user action (Das - discloses that there are product issues categories supported by the chatbot (See par 15) where there are a universe of issue categories in the scope of the chatbot (See par 30); see par 41 - assist the customer self-solve the product issue by returning knowledge articles resulting from searching a product support knowledge article database (transferring to a live agent, OR returning knowledge articles disclose “actions”) based on the customer defined issue text and information identifying the product line at issue;
see also Wang par 27 - In step 250, upon determining the issue type and root cause, the IT agent may determine an appropriate resolution, including any needed remediation, of the IT issue. In step 255, the IT agent may implement or cause to be implemented the appropriate remediation to resolve the determined issue type and root cause;
see also Eberlein - see par 44 - At 316, the solution is implemented. Examples for implementations of solutions can include: an action to check compliance with constraints and their resolution, unlock of user or a user password, issuing or retrieving a new certificate, update (for example, scale-up/scale-out) instances to overcome resource shortages, increase configuration parameters (for example, number of connections, cache size, memory size, etc.), remove locks, deploy a patch (for example, modification of a system), or force restart a service).
Das discloses “When the product support inquiry involves a live chat session, the chat transcript may also be recorded” (See par 14) and “Various technological hurdles and complexities exist, however, to automating the handling of technical support issues including identification of product issue categories from historical case data, identifying an appropriate scope of product issue categories to be supported by the chatbot (e.g., those that are self-solvable by customers)” (See par 15).
Wang discloses “sessions” and tickets:
terminate the technical support session (Wang – see par 25 - Concurrently with opening the trouble ticket and until the IT agent’s trouble ticket session has ended or until the trouble ticket is closed, trouble ticket management system 140 may monitor the actions of the IT agent with regards to the open trouble ticket in step 235. This monitoring may include generating session events identifying the actions taken by the IT agent with regards to the trouble ticket as well as identifying the various types of data accessed by the IT agent; see par 30, 33 - In step 280, after a sufficient number of trouble tickets have been resolved and closed, the aggregated and labeled session events and log data lines may then be utilized for modeling the process of resolving trouble tickets in modeling system 146. see par 52 - Concurrently with and responsive to the assigned IT agent opening the trouble ticket and until the IT agent’s trouble ticket on-line session has ended or until the trouble ticket is closed, session event generator 360 may monitor the actions of the IT agent with regards to the open trouble ticket in step 525. Additional monitoring of the actions of the IT agent may be performed when the IT agent utilizes a log viewer or other IT tool.)
Das and Wang and Eberlein disclose:
automatically generate a first support ticket for tracking the technical problem and efficacy of actions taken (Das – see par 82 - A lower than desired issue resolution accuracy may be indicative of the effectiveness or accuracy of the troubleshooting workflow for particular issues/problems diminishing over time, for example, as a result of changes in the product line. In one embodiment, an issue resolution accuracy of approximately between 80% to 90% may be used as a desired issue resolution accuracy threshold;
see also Eberlein – see par 32 - The learning system 202 can be configured to receive input on the proposed action 224 (for example, action changes proposed by operator), to process the (modified) proposed action 224 and to monitor process results 218. The issue resolution data monitored by the learning system 202 can be enriched by extracted problem features 222 and action parameters 226 for the resolving actions of issues processed by the proposal system 204. see par 41 - At 310, it is determined whether an accuracy of the solution exceeds a solution implementation threshold. The solution implementation threshold can include a probability (for example, percentage) of solution success. After 310, method 300 proceeds to 312. see par 45, FIG. 3 - At 318, an outcome (for example, success or failure) of the solution is determined. For example, in response to implementing the solution, the process that generated the issue is restarted and monitored to determine whether the same error or an associated error reappears. If no associated error occurs, the outcome of the solution is considered as being successful.) by interfacing, via the support ticketing extension, with a support ticketing application (Das discloses the limitations based on broadest reasonable interpretation in light of the specification – see par 26 - the phrase “case record” is intended to broadly refer to a unit of case data maintained or otherwise utilized by a call center. Case records may include, among other fields, a title, a description, a subject, a unique identifier (e.g., a case number); see par 28 - an application running on a server and the server can be a component; components may communicate via a remote processes, e.g. data from distributed system across a network; see par 43, FIG. 3 - a Customer Relationship Management (CRM) application 330 (e.g., Salesforce, Microsoft Dynamics 365, and Zoho CRM) and an associated contact center solution 335 are delivered as a service to the agents to, for example, facilitate handling and documentation of inbound communications from the customers of a vendor of a particular product or service. According to one embodiment, CRM application 330 includes a case records database 331 and an Application Programming Interface (API) 332. Depending upon the particular implementation, API 332 may be a REST API through which interactions between the CRM application 330 and other external systems, including those associated with the chat environment, may be handled);
record information related to the technical support session (Das – see par 39 - The case history text can then be filtered by product line to create a custom corpus that may be fed as input to a word association model (e.g., Word2Vec), thereby producing a product line specific word association model for each product line. see par 50 - After a sufficient amount of product support cases have been labeled with SME validated product issue categories as a result of first stage operations, during the second stage, the labeling engine 360 may operate more independently of and reduce the burden on SMEs by performing unsupervised learning (e.g., an auto labeling approach) to incrementally identify and label new issue categories. A non-limiting example of an automated labeling approach that may be used during the second stage of operations is described below with reference to FIG. 7; see par 54 - At block 410, product issue categories (e.g., top words) may be identified based on historical case records (e.g., case records 331). According to one embodiment, a corpus of data may be created by extracting text from chat transcripts of live chats stored within the historical case records and a topic model (e.g., LDA) may be applied to find clusters of cases (representing the product issue categories) as well as the suggested top words for naming (labeling) the clusters. See par 88 - At block 720, information indicative of a total number of cases associated with each issue category is extracted from a set of historical case records maintained by a call center. For example, a topic modeling engine (e.g., topic modeling engine 340) may request statistical information for cases records associated with a particular product line via an API (e.g., API 332) of a CRM application (CRM application 330) running within the call center.
See also Wang – for “session” – see par 27 - Then in step 265, the IT agent may then document the determined issue type and root cause as well as the remediation steps in the trouble ticket followed by closing the trouble ticket in trouble ticket database 145. see par 28 - In step 270, the session events for the trouble ticket may be automatically aggregated by trouble ticket management system 140. That is, session events may be grouped together by trouble ticket. For example, multiple IT agents may work on a trouble ticket across multiple sessions utilizing multiple tools, thereby generating session events stored across multiple files and/or tables. By aggregating session events by trouble ticket, the analysis utilized for each trouble ticket may be more easily utilized such as automatically through modeling. In addition, aggregating session events may also allow for easier labeling for use in modeling the trouble ticket process.); and
Das discloses “For example, responsive to receipt of a product support inquiry, a new product support case (or simply a case), representing a customer issue or problem, may be opened in a Customer Relationship Management (CRM) application to store, among other things, information regarding the product line and the specific component to which the issue or problem relates, a description of the issue, a resolution, whether the issue is remotely resolvable by the customer” (See par 14).
Wang discloses:
automatically close the first support ticket as resolved by interfacing, via the support ticketing extension, with the support ticketing application (Wang – see par 19 - Trouble ticket management system 140 may be utilized by IT agents 125 to manage trouble tickets for IT vendor data center 120 and for customer data centers 150 on behalf of the IT vendor and its customers. In the present embodiment, trouble ticket database 145 may store trouble tickets from their submission through resolution and closing and subsequently for historical and modeling purposes. see par 30 - In step 280, after a sufficient number of trouble tickets have been resolved and closed, the aggregated and labeled session events and log data lines may then be utilized for modeling the process of resolving trouble tickets in modeling system 146. As shown with the bidirectional dashed line to step 220, modeling system 146 may then be utilized in step 290 towards predicting the issue type and root cause (i.e., case classification) of an IT issue described in subsequently submitted trouble tickets.)
Das, Wang, and Eberlein are analogous art as they are directed to handling problems/issues (see Das Abstract; Wang Abstract; Eberlein Abstract). 1) Das discloses “When the product support inquiry involves a live chat session, the chat transcript may also be recorded” (See par 14) and “Various technological hurdles and complexities exist, however, to automating the handling of technical support issues including identification of product issue categories from historical case data, identifying an appropriate scope of product issue categories to be supported by the chatbot (e.g., those that are self-solvable by customers)” (See par 15). Wang improves upon Das by disclosing tickets and sessions (see par 25, 28, 30, 33, 52). One of ordinary skill in the art would be motivated to further include session events for resolving tickets to efficiently improve upon the case number/identifier (par 26) and chat session (par 14) disclosures in Das. 2) Das discloses that there are product issue categories supported by the chatbot (See par 15), where there is a universe of issue categories in the scope of the chatbot (See par 30). Wang discloses identifying prior resolved trouble tickets with similar symptoms and corresponding IT assets along with remediation steps and confirming predictions based on previously resolved trouble tickets (See par 62). Eberlein improves upon Das and Wang by disclosing forcing a restart of a service as a solution to resolve an issue from an incident report (See par 37, 44). One of ordinary skill in the art would be motivated to further include causing software to restart to efficiently improve upon the resolution of issues by a chatbot in Das and the remediation steps for IT assets in Wang.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the use of a chatbot and a human agent for escalated cases in Das to further utilize session data and the closing of sessions relative to trouble tickets from customers as disclosed in Wang, and to automate resolution of an issue by restarting a service as disclosed in Eberlein, since the claimed invention is merely a combination of old elements, in combination each element merely would have performed the same function as it did separately, one of ordinary skill in the art would have recognized that the results of the combination were predictable, and there is a reasonable expectation of success.
Concerning independent claim 8, Das, Wang, and Eberlein disclose:
A method for performing user technical support via an integrated support application that integrates messaging, support ticketing, and automatic problem resolution, the method implemented by a computer system (Das – see par 26 - Case records may include, among other fields, a title, a description, a subject, a unique identifier (e.g., a case number); see par 28 - The terms “component”, “module”, “system,” and the like as used herein are intended to refer to a computer-related entity. Such a component, module, or system may be in the form of a software-executing general-purpose processor, hardware, firmware or a combination thereof; an application running on a server and the server can be a component; components may communicate via remote processes, e.g., data from a distributed system across a network; see FIG. 3, par 42 - FIG. 3 shows a block diagram illustrating an architecture 300 in accordance with an example embodiment. While various examples provided herein are described in the context of assisting customers (e.g., end users 302) in connection with troubleshooting product/service issues through live chat via a chatbot and/or live human agents (e.g., agents 301) associated with a product support system; see par 43 - Cloud 320a may be a public cloud through which a Customer Relationship Management (CRM) application 330 (e.g., Salesforce, Microsoft Dynamics 365, and Zoho CRM) and an associated contact center solution 335 are delivered as a service to the agents to, for example, facilitate handling and documentation of inbound communications from the customers of a vendor of a particular product or service.), and comprising:
receiving, at the integrated support application from a messaging application and via an application programming interface of the messaging application, a message from a user (Das – [same as claim 1 above] - see par 43, FIG. 3; see par 29 - In the context of various embodiments described herein, a chatbot (e.g., chatbot 120) receives live chat text 111 from a customer (e.g., customer 110)… see par 43 - facilitate handling and documentation of inbound communications from the customers of a vendor of a particular product or service. The case records database 331 may store historical case data... the case records database 331 may store chat transcripts of live chat sessions. See par 44 - In the context of the present example, cloud 320b includes a chatbot 345 (which may correspond to chatbots 120 and 240);
see also Wang – see par 25 …; see par 32, FIG. 3)
processing, by the integrated support application, the message to determine a technical problem identified in the message (Das [same as claim 1 above] – see par 30; see par 31);
responsive to determining the technical problem, performing, via the integrated support application, machine intelligence processing training based on previous support interactions and survey responses from users (Examiner notes that the label of the data source as “survey responses” is not entitled to patentable weight at this time. Nonetheless, Das, Wang, and Eberlein disclose the limitations:
Das – see par 15, see par 38 - Based on the limited set of product issue categories to be supported by the chatbot for each product line at issue, chatbot-specific approved workflows may be created for each issue category within the given product line with input from appropriate members of the product engineering teams. Once reviewed and approved by appropriate stakeholders, the troubleshooting workflows may be codified and made accessible for use by the chatbot at run-time; See also Wang – see par 27 - In step 250, upon determining the issue type and root cause, the IT agent may determine an appropriate resolution, including any needed remediation, of the IT issue. In step 255, the IT agent may implement or cause to be implemented the appropriate remediation to resolve the determined issue type and root cause; see par 54 - Then in step 544, the IT agent may confirm that the remediation resolves the IT issue and may confirm that the remediation does not create new IT issues.) based on the determined technical problem to determine at least one action useful in mitigating the determined technical problem (Das [same as claim 1 above] - See par 15 - training AI classification models (without undue upfront manual labeling of training data) to allow the chatbot to accurately identify customers' intent, and mapping of the identified intent to one of the supported product issue categories; see par 38, 41; See par 50 - After a sufficient amount of product support cases have been labeled with SME validated product issue categories as a result of first stage operations, during the second stage, the labeling engine 360 may operate more independently of and reduce the burden on SMEs by performing unsupervised learning (e.g., an auto labeling approach) to incrementally identify and label new issue categories;
see also Eberlein – see par 31-32, 45);
mitigating, via the integrated support application, the determined technical problem based on the determined action (Das – see par 15; see par 38; see par 41;
See also Wang – see par 27; see par 61 - That is, in the present embodiment, artificial intelligence such as machine learning may utilize supervised learning with the labeled database to determine the relationship between the inputs (e.g., IT asset type, symptoms, log data lines, etc.) and outputs (e.g., case classification including event type and root cause). These relationships may then be utilized towards identifying which log data lines to inspect for a given IT asset type and symptoms towards predicting the event type and root cause of subsequently submitted trouble tickets. With automated inspection of the identified log data lines of the relevant IT assets, a prediction of the case classification, including event type and root cause, may be completed without human intervention as described below) wherein the mitigating comprises sending a signal to a device associated with the user to cause at least one of the device to power cycle, a network connection of the device to reset, an application executing on the device to restart, or a cache of the device to clear (Eberlein as in claim 1 - see FIG. 3, par 37; see par 44; see par 45);
responsive to the determined action resolving the determined technical problem, terminating, via the integrated support application, the technical support session (Wang – [same as cl. 1] - see par 25; see par 30, 33; see par 52); and
automatically generating a first support ticket for tracking the technical problem and efficacy of the determined action (Das – same as claim 1 - see par 26; see par 28; see par 43, FIG. 3; par 82; see also Eberlein par 32, 41, 45);
recording information related to the technical support session (Das same as cl. 1 – see par 39; see par 50; see par 54; See par 88;
See also Wang [as in claim 1]– for “session” – see par 27-28); and
automatically closing the first support ticket as resolved (Wang [as in claim 1]– see par 19; see par 30)
It would have been obvious to combine Das, Wang, and Eberlein for the same reasons as claim 1 above.
Concerning independent claim 14, Das, Wang, and Eberlein disclose:
A method for performing user technical support via an integrated support application that integrates messaging, support ticketing, and automatic problem resolution (Das – see par 26 - Case records may include, among other fields, a title, a description, a subject, a unique identifier (e.g., a case number); see par 28 - The terms “component”, “module”, “system,” and the like as used herein are intended to refer to a computer-related entity. Such a component, module, or system may be in the form of a software-executing general-purpose processor, hardware, firmware or a combination thereof; an application running on a server and the server can be a component; components may communicate via remote processes, e.g., data from a distributed system across a network; see FIG. 3, par 42 - FIG. 3 shows a block diagram illustrating an architecture 300 in accordance with an example embodiment. While various examples provided herein are described in the context of assisting customers (e.g., end users 302) in connection with troubleshooting product/service issues through live chat via a chatbot and/or live human agents (e.g., agents 301) associated with a product support system; see par 43 - Cloud 320a may be a public cloud through which a Customer Relationship Management (CRM) application 330 (e.g., Salesforce, Microsoft Dynamics 365, and Zoho CRM) and an associated contact center solution 335 are delivered as a service to the agents to, for example, facilitate handling and documentation of inbound communications from the customers of a vendor of a particular product or service), the method implemented by a computer system (see par 28 - The terms “component”, “module”, “system,” and the like as used herein are intended to refer to a computer-related entity. 
Such a component, module, or system may be in the form of a software-executing general-purpose processor, hardware, firmware or a combination thereof; an application running on a server and the server can be a component) comprising:
receiving, at an integrated support application from a messaging application and via an application programming interface of the messaging application, a message from a user (Das [same as cl. 8] – see par 43), wherein the message initiates a technical support session (Das [same as claim 8] – see par 29; See par 44;
see also Wang – see par 25; see par 32, FIG. 3);
processing, via the integrated support application (see par 28, 43 - The terms “component”, “module”, “system,” and the like as used herein are intended to refer to a computer-related entity. Such a component, module, or system may be in the form of a software-executing general-purpose processor, hardware, firmware or a combination thereof; an application running on a server and the server can be a component), the message to determine via machine intelligence processing (Das – same as claim 1 – par 15, 37; Eberlein par 18), a technical problem identified in the message (Das [same as claim 8]– see par 30 - The troubleshooting workflows 140 may correspond to product issue categories 130, representing the universe of those product issue categories in the scope of the chatbot. The appropriate troubleshooting workflow may be identified by the chatbot 120 mapping the live chat text 121 to a particular product issue category within the supported product issue categories 130; see par 31 - the troubleshooting workflows 140 are guided troubleshooting decision trees designed with the assistance of product support subject matter experts (SMEs) and with reference to product manuals, and/or support knowledge articles);
responsive to determining the technical problem, identifying, via the integrated support application, written materials related to the determined technical problem (Das – see par 41 - At run-time, the chatbot, for example, during operationalization of the word association and classification models (e.g., product-line specific Long Short-Term Memory Network (LSTM) models), may also make use of an NLP engine library 250 and various integrations 260. In the context of the present example, appropriate interfaces may be made available for use by the chatbot to … assist the customer self-solve the product issue by returning knowledge articles resulting from searching a product support knowledge article database based on the customer defined issue text and information identifying the product line at issue (e.g., a product name and/or serial number;
see also Eberlein – See par 26 - data storage is filtered based on data types, such that only data associated to new issues and solutions is stored to optimize data storage based on relevance; see par 34 - Statistical information on usage ranking or percentages can be used in a cost function of past issues with different degrees of similarity that are relevant for resolving a particular issue);
providing, via the integrated support application, access to the written materials to the user via the messaging application (Das – see FIG. 3 – end users 302 in communication with chatbot 345; various examples provided herein are described in the context of assisting customers (e.g., end users 302) in connection with troubleshooting product/service issues through live chat via a chatbot; see par 76 - the chatbot may further attempt to help the customer to self-solve the issue by performing a search through a support knowledge article database based on the customer defined issue text and may return relevant links to relevant knowledge documents to the customer);
responsive to the written materials resolving the determined technical problem, terminating, via the integrated support application, the technical support session (Wang – [same as cl. 1, 8] - see par 25 - Concurrently with opening the trouble ticket and until the IT agent’s trouble ticket session has ended or until the trouble ticket is closed, trouble ticket management system 140 may monitor the actions of the IT agent with regards to the open trouble ticket in step 235. This monitoring may include generating session events identifying the actions taken by the IT agent with regards to the trouble ticket as well as identifying the various types of data accessed by the IT agent; see par 30, 33 - In step 280, after a sufficient number of trouble tickets have been resolved and closed, the aggregated and labeled session events and log data lines may then be utilized for modeling the process of resolving trouble tickets in modeling system 146. see par 52 - Concurrently with and responsive to the assigned IT agent opening the trouble ticket and until the IT agent’s trouble ticket on-line session has ended or until the trouble ticket is closed, session event generator 360 may monitor the actions of the IT agent with regards to the open trouble ticket in step 525. Additional monitoring of the actions of the IT agent may be performed when the IT agent utilizes a log viewer or other IT tool);
automatically generating a first support ticket for tracking the determined technical problem and efficacy of the written materials ([similar to claim 1 above] - Das – see par 82;
see also Eberlein – see par 32; see par 41; see par 45, FIG. 3);
automatically closing the first support ticket as resolved (Wang [as in claim 1]– see par 19; see par 30);
responsive to the written materials not resolving the determined technical problem (Das [same as cl. 8] – See par 33 - In one embodiment, when the chatbot is unable to resolve the customer's issue, the product support case may be escalated 123 to a live human agent. See par 41 – chatbot… transfer case to a live agent, create a case within a CRM (Customer Relationship Management) application; See par 76 - Other potential reasons for escalating a product support case may include … the chatbot is unable to identify the intent from the customer text after multiple attempts, and/or the customer provides feedback that his/her issue remains unresolved after the chatbot has executed the troubleshooting flows. In one embodiment, before escalating the product support case to the live human agent), performing an action determined by the machine intelligence processing trained based on previous support interactions to mitigate the determined technical problem, the action comprising sending a signal to a device associated with the user to cause at least one of the device to power cycle, a network connection of the device to reset, an application executing on the device to restart, or a cache of the device to clear (Eberlein as in claim 1 - see FIG. 3, par 37; see par 44; see par 45).
It would have been obvious to combine Das and Wang and Eberlein for the same reasons as claim 1 above.
Concerning claim 2, Das and Wang disclose:
The system of claim 1, wherein the integrated support application is configured to process the message according to a machine intelligence process to determine the technical problem identified in the message (Das – see par 15 - identifying an appropriate scope of product issue categories to be supported by the chatbot (e.g., those that are self-solvable by customers), training AI classification models (without undue upfront manual labeling of training data) to allow the chatbot to accurately identify customers' intent, and mapping of the identified intent to one of the supported product issue categories. Additionally, each product issue category might involve numerous complex resolution workflows (which may also be referred to herein as troubleshooting workflows and/or decision trees). see par 38 - Based on the limited set of product issue categories to be supported by the chatbot for each product line at issue, chatbot-specific approved workflows may be created for each issue category within the given product line with input from appropriate members of the product engineering teams. Once reviewed and approved by appropriate stakeholders, the troubleshooting workflows may be codified and made accessible for use by the chatbot at run-time, for example, via an Application Programming Interface (API) (e.g., a Representational State Transfer (REST) API; See par 50 - After a sufficient amount of product support cases have been labeled with SME validated product issue categories as a result of first stage operations, during the second stage, the labeling engine 360 may operate more independently of and reduce the burden on SMEs by performing unsupervised learning (e.g., an auto labeling approach) to incrementally identify and label new issue categories).
Concerning claim 3, Das and Wang disclose:
The system of claim 1, wherein to perform the action to mitigate the technical problem, the integrated support application is configured to determine and provide written documentation related to the technical problem to the user via the messaging application (Das – see FIG. 3 – end users 302 in communication with chatbot 345; various examples provided herein are described in the context of assisting customers (e.g., end users 302) in connection with troubleshooting product/service issues through live chat via a chatbot; see par 76 - the chatbot may further attempt to help the customer to self-solve the issue by performing a search through a support knowledge article database based on the customer defined issue text and may return relevant links to relevant knowledge documents to the customer).
Concerning claim 4, Das and Wang disclose:
The system of claim 1, wherein to perform the action to mitigate the technical problem, the integrated support application is configured to perform machine intelligence processing based on the technical problem to determine the action (Das – see par 15 - training AI classification models (without undue upfront manual labeling of training data) to allow the chatbot to accurately identify customers' intent, and mapping of the identified intent to one of the supported product issue categories. Additionally, each product issue category might involve numerous complex resolution workflows (which may also be referred to herein as troubleshooting workflows and/or decision trees). see par 73 - At block 530, the customer issue description (e.g., a numerical vector output by the product line specific word association model) is mapped by a mapping engine (e.g., mapping engine 350) to a supported issue category. For example, an intermediate classification model (e.g., a product specific LSTM model trained in accordance with the supervised learning approach described with reference to FIG. 4) may attempt to match the representation of the customer issue description to a representation of a supported issue category. See par 75 - At block 560, the chatbot may initiate an automated, interactive, troubleshooting conversational dialog with the user guided based on a decision tree (e.g., one of decision trees 355) for the matching product issue category within the product line as identified at decision block 540; see par 85 - a second stage of operations in which unsupervised learning may be used to incrementally identify new product issue categories.)
Concerning claim 5, Das discloses evaluating (FIG. 6) whether there are unsatisfactory issue resolutions (See par 78), and if too many issues are unrecognizable by the chatbot, it may be indicative of a need for “retraining” word association models (See par 80-82).
Eberlein discloses:
The system of claim 4, wherein the integrated support application is configured to query the user, via the messaging application, for feedback to determine effectiveness of the action in mitigating the technical problem, and wherein the integrated support application is configured to train the machine intelligence processing based on the feedback (Eberlein – see par 18 - The described solution takes into account system development from data collection for use by machine learning of an issue resolution proposal system to fully automate an issue resolution system. see par 29 - The learning system 202 can be configured to collect data from the IT landscape based on a parameter set (for example, in response to detecting an error). The learning system 202 can be configured to store parameter data with attributes to allow mapping of captured data to particular software functions. The learning system 202 can include data annotation components for annotations of features 212 and parameters 214. See par 30 - The actions can be parametrized (for example, with the instance ID) to work on the impacted service instances. The issue information 208, the additional information 210, the annotations 212, 214, the action names 216, the process result 218, and the parameter values 220 can be captured by the learning system 202. The data captured by the learning system 202 can be selectively stored as reference for future issue resolution procedures. In response to collecting sufficient issue resolution data (for example, based on issue occurrence frequency or based on collected issue types) the learning system 202 can be replaced by the proposal system 204. see par 32 - The learning system 202 can be configured to receive input on the proposed action 224 (for example, action changes proposed by operator), to process the (modified) proposed action 224 and to monitor process results 218. 
The issue resolution data monitored by the learning system 202 can be enriched by extracted problem features 222 and action parameters 226 for the resolving actions of issues processed by the proposal system 204. see par 45 - At 318, an outcome (for example, success or failure) of the solution is determined. For example, in response to implementing the solution, the process that generated the issue is restarted and monitored to determine whether the same error or an associated error reappears. If no associated error occurs, the outcome of the solution is considered as being successful.)
It would have been obvious to combine Das, Wang, and Eberlein for the same reasons as claim 1 above.
Concerning claim 6, Das and Wang disclose:
The system of claim 1, wherein the integrated support application is configured to function as a relay for messages between the messaging application and the support ticketing application (See Applicant’s FIG. 1-2, [0051] as filed “the integrated support application may operate as a relay or middleman between the support ticketing application (or another support application) and the messaging application. In this role, the integrated support application may provide messages posted by a support engineer in the support ticketing or other support application to the user in the messaging application, and vice versa.”)
Das discloses the limitations based on broadest reasonable interpretation in light of the specification – see par 28 - components may communicate via local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the Internet with other systems via the signal). see par 43 - Cloud 320a may be a public cloud through which a Customer Relationship Management (CRM) application 330 (e.g., Salesforce, Microsoft Dynamics 365, and Zoho CRM) and an associated contact center solution 335 are delivered as a service to the agents to, for example, facilitate handling and documentation of inbound communications from the customers of a vendor of a particular product or service. The case records database 331 may store historical case data, including case notes input in the form of text-based information by call center agents, relating to support issues raised by customers. Additionally, the case records database 331 may store chat transcripts of live chat sessions. For example, customers may communicate with the call center agents through chat and/or other communication mechanisms (e.g., voice calls or video calls) supported by the contact center solution 335; see par 45 - The identified combination of words (or a revised category label provided by an SME) may then be used for the issue category, persisted in the supported issue categories database 341 and represented by a single vector upon completion by applying the corresponding product line specific word association model from the product line specific word association models 346. In one embodiment, the supported issue categories may be pushed to the CRM application 330 via API 332 to allow the agents 301 to appropriately label the product support cases;
See also Wang – par 65 - Computer system/server 612 may be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices. For example, the present invention may be implemented in a cloud computing environment, distributed or otherwise).
Concerning claims 7, 9, and 16, Das and Wang disclose:
The system of claim 1, wherein to process the message to determine the technical problem identified in the message, the integrated support application is configured to perform natural language processing on the message (Das – see par 15 - , training AI classification models (without undue upfront manual labeling of training data) to allow the chatbot to accurately identify customers' intent; see par 41 - At run-time, the chatbot, for example, during operationalization of the word association and classification models (e.g., product-line specific Long Short-Term Memory Network (LSTM) models), may also make use of an NLP engine library 250 and various integrations 260. In one embodiment, the NLP engine library 250 may be used at each step of the conversational dialog with the customer to support speech recognition (e.g., parsing and splitting the text using parts of speech classifiers, identification of entities and intents, etc.). see par 48 - In this context, automation refers to the use of guided decision tree troubleshooting flows, interactive refers to the question and answer conversation between the customer and the chatbot, and conversational refers to the ability to process free text (e.g., natural language text) input by a customer during a chat session as a means of communication as opposed to, for example, interactions with a traditional user interface including buttons or menu selections.)
Concerning claim 11, Das and Wang disclose:
The method of claim 8, wherein mitigating the determined technical problem based on the determined action includes performing the determined action automatically and without action by the user to initiate the determined action after the determination of the determined action (Das - See par 41 - In the context of the present example, appropriate interfaces may be made available for use by the chatbot to check a customer's entitlement to support for the product at issue, transfer the case to a live agent, create a case within a CRM application, and/or assist the customer self-solve the product issue by returning knowledge articles resulting from searching a product support knowledge article database (transferring to a live agent, OR returning knowledge articles disclose “actions”) based on the customer defined issue text and information identifying the product line at issue; see also Wang – see par 27 - In step 250, upon determining the issue type and root cause, the IT agent may determine an appropriate resolution, including any needed remediation, of the IT issue. In step 255, the IT agent may implement or cause to be implemented the appropriate remediation to resolve the determined issue type and root cause. This may be implemented after performing some tests to confirm that the remediation solves the issue type and root cause and does not create new IT issues).
It would have been obvious to combine Das and Wang for the same reasons as claim 1 above.
Concerning claims 13 and 20 [similar to claim 5], Das discloses evaluating (FIG. 6) whether there are unsatisfactory issue resolutions (See par 78), and that, if too many issues are unrecognizable by the chatbot, it may be indicative of a need for "retraining" word association models (See par 80-82).
Eberlein discloses:
The method of claim 8, further comprising:
querying the user to determine feedback indicating whether the determined action resolved the determined technical problem (Eberlein – see par 33 - The proposal system 204 can be configured to statistically analyze the proposed actions 224 relative to operator-selected actions and outcome (for example, success or failure) of the actions; see par 45 - At 318, an outcome (for example, success or failure) of the solution is determined. For example, in response to implementing the solution, the process that generated the issue is restarted and monitored to determine whether the same error or an associated error reappears. If no associated error occurs, the outcome of the solution is considered as being successful.); and
training the machine intelligence processing based on the feedback to modify handling of a subsequent technical support session (Eberlein – see par 32 - The proposal system 204 can provide for display the proposed action 224. The learning system 202 can be configured to receive input on the proposed action 224 (for example, action changes proposed by operator), to process the (modified) proposed action 224 and to monitor process results 218. The issue resolution data monitored by the learning system 202 can be enriched by extracted problem features 222 and action parameters 226 for the resolving actions of issues processed by the proposal system 204).
It would have been obvious to combine Das, Wang, and Eberlein for the same reasons as claim 5 above.
Concerning claim 15, Das and Wang disclose:
The method of claim 14, wherein responsive to the written materials not resolving the determined technical problem and before generating the support ticket in the support ticketing application, the method further comprises:
performing machine intelligence processing based on the determined technical problem to determine at least one action useful in mitigating the determined technical problem (Das [same as claim 8] - See par 15 - training AI classification models (without undue upfront manual labeling of training data) to allow the chatbot to accurately identify customers' intent, and mapping of the identified intent to one of the supported product issue categories; see par 38 - Based on the limited set of product issue categories to be supported by the chatbot for each product line at issue, chatbot-specific approved workflows may be created for each issue category within the given product line with input from appropriate members of the product engineering teams. Once reviewed and approved by appropriate stakeholders, the troubleshooting workflows may be codified and made accessible for use by the chatbot at run-time, for example, via an Application Programming Interface (API) (e.g., a Representational State Transfer (REST) API); See par 50 - After a sufficient amount of product support cases have been labeled with SME validated product issue categories as a result of first stage operations, during the second stage, the labeling engine 360 may operate more independently of and reduce the burden on SMEs by performing unsupervised learning (e.g., an auto labeling approach) to incrementally identify and label new issue categories;
See also Wang – see par 27 - In step 250, upon determining the issue type and root cause, the IT agent may determine an appropriate resolution, including any needed remediation, of the IT issue. In step 255, the IT agent may implement or cause to be implemented the appropriate remediation to resolve the determined issue type and root cause. This may be implemented after performing some tests to confirm that the remediation solves the issue type and root cause and does not create new IT issues; see par 61 - That is, in the present embodiment, artificial intelligence such as machine learning may utilize supervised learning with the labeled database to determine the relationship between the inputs (e.g., IT asset type, symptoms, log data lines, etc.) and outputs (e.g., case classification including event type and root cause). These relationships may then be utilized towards identifying which log data lines to inspect for a given IT asset type and symptoms towards predicting the event type and root cause of subsequently submitted trouble tickets…);
mitigating the determined technical problem based on the determined action (Das See par 15 - training AI classification models (without undue upfront manual labeling of training data) to allow the chatbot to accurately identify customers' intent, and mapping of the identified intent to one of the supported product issue categories; see par 38 - …Once reviewed and approved by appropriate stakeholders, the troubleshooting workflows may be codified and made accessible for use by the chatbot at run-time, for example, via an Application Programming Interface (API) (e.g., a Representational State Transfer (REST) API). See par 41 - In the context of the present example, appropriate interfaces may be made available for use by the chatbot to check a customer's entitlement to support for the product at issue, transfer the case to a live agent, create a case within a CRM application, and/or assist the customer self-solve the product issue by returning knowledge articles resulting from searching a product support knowledge article database (transferring to a live agent, returning knowledge articles disclose “actions”) based on the customer defined issue text and information identifying the product line at issue
See also Wang – see par 27 - In step 250, upon determining the issue type and root cause, the IT agent may determine an appropriate resolution, including any needed remediation, of the IT issue. In step 255, the IT agent may implement or cause to be implemented the appropriate remediation to resolve the determined issue type and root cause. This may be implemented after performing some tests to confirm that the remediation solves the issue type and root cause and does not create new IT issues; see par 61 - That is, in the present embodiment, artificial intelligence such as machine learning may utilize supervised learning with the labeled database to determine the relationship between the inputs (e.g., IT asset type, symptoms, log data lines, etc.) and outputs (e.g., case classification including event type and root cause). These relationships may then be utilized towards identifying which log data lines to inspect for a given IT asset type and symptoms towards predicting the event type and root cause of subsequently submitted trouble tickets);
responsive to the determined action resolving the determined technical problem, terminating the technical support session (Wang – [same as cl. 1, 8] - see par 25 - Concurrently with opening the trouble ticket and until the IT agent’s trouble ticket session has ended or until the trouble ticket is closed, trouble ticket management system 140 may monitor the actions of the IT agent with regards to the open trouble ticket in step 235. This monitoring may include generating session events identifying the actions taken by the IT agent with regards to the trouble ticket… ; see par 30, 33 - In step 280, after a sufficient number of trouble tickets have been resolved and closed, the aggregated and labeled session events and log data lines may then be utilized for modeling the process of resolving trouble tickets in modeling system 146. see par 52 - Concurrently with and responsive to the assigned IT agent opening the trouble ticket and until the IT agent’s trouble ticket on-line session has ended or until the trouble ticket is closed, session event generator 360 may monitor the actions of the IT agent with regards to the open trouble ticket in step 525); and
responsive to the determined action not resolving the determined technical problem, generating the support ticket in the support ticketing application (Das – See par 33 - In one embodiment, when the chatbot is unable to resolve the customer's issue, the product support case may be escalated 123 to a live human agent. See par 41 – chatbot… transfer case to a live agent, create a case within a CRM (Customer relationships management) application; See par 76 - Other potential reasons for escalating a product support case may include … the chatbot is unable to identify the intent from the customer text after multiple attempts, and/or the customer provides feedback that his/her issue remains unresolved after the chatbot has executed the troubleshooting flows. In one embodiment, before escalating the product support case to the live human agent).
It would have been obvious to combine Das and Wang for the same reasons as claim 1 and claim 8 above.
Concerning claim 17, Das and Wang disclose:
The method of claim 14, identifying the written materials related to the determined technical problem includes searching at least some of a data store or performing an Internet search based on keywords associated with the determined technical problem to identify the written materials (Das – see FIG. 3 – end users 302 in communication with chatbot 345; various examples provided herein are described in the context of assisting customers (e.g., end users 302) in connection with troubleshooting product/service issues through live chat via a chatbot; see par 76 - the chatbot may further attempt to help the customer to self-solve the issue by performing a search through a support knowledge article database based on the customer defined issue text and may return relevant links to relevant knowledge documents to the customer).
Concerning claim 10, Das and Wang disclose:
The method of claim 8, wherein mitigating the determined technical problem based on the determined action includes providing, in the messaging application, a recommendation to the user to perform the determined action (Das – see par 45 - the topic modeling engine 340 is responsible for finding clusters of cases as well as suggested top words that may be used to identify those case clusters to facilitate product issue categorization. For example, the topic modeling engine 340 may load a subset of case records from the case records database 331 via API 332 into an input case data database 341 to facilitate performance of LDA at one or more levels of granularity. In one embodiment, LDA is performed on the extracted case data at the component level (e.g., a power supply, battery, or a fan of a server), the product line level (e.g., HPE ProLiant ML servers produced by Hewlett Packard Enterprise Company), and/or the issue level (e.g., hard drive failure, how to update the power supply firmware); see par 76- the chatbot may further attempt to help the customer to self-solve the issue by performing a search through a support knowledge article database based on the customer defined issue text and may return relevant links to relevant knowledge documents to the customer.).
To the extent Das does not disclose the limitations, Eberlein discloses:
Eberlein – see par 31 - The proposal system 204 can use collected data to generate one or more proposed actions 122 based on an automation process. In some implementations, each of the one or more proposed actions 122 can be provided with a probability of success. Each of the one or more proposed actions 122 can be based on collected data associated with past issue resolutions. In some implementations, the proposal system 204 provides information that enables a user to script the feature extraction or define rules.
It would have been obvious to combine Das, Wang, and Eberlein for the same reasons as claim 1 above. In addition, Das discloses the identified issue can be "how to update the power supply firmware" (See par 45). This is viewed as a "recommendation," in that it is still up to the user whether they actually perform those steps or perform the content in a knowledge article (See par 76). Eberlein improves upon Das and Wang by explicitly stating it is "proposed" to a user.
Claims 12 and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Das (US 2022/0398598) and Wang (US 2023/0129123) and Eberlein (US 2021/889384), as applied to claims 1-11, 13-17, and 20 above, and further in view of Morgan (US 2020/0329144).
Concerning claims 12 and 18, Das and Wang disclose:
The method of claim 8, wherein generating the support ticket includes:
prompting the user, via the messaging application, to open the support ticket (Das – see par 14 - responsive to receipt of a product support inquiry, a new product support case (or simply a case), representing a customer issue or problem, may be opened in a Customer Relationship Management (CRM) application to store, among other things, information regarding the product line and the specific component to which the issue or problem relates, a description of the issue, a resolution, whether the issue is remotely resolvable by the customer (e.g., no replacement parts are required and an onsite engineer is not required) and a level of criticality of the issue (e.g., critical vs. non-critical).)
Das discloses a customer having an issue results in opening a new case (See par 14). Wang discloses opening tickets for an issue (See par 25).
Morgan discloses:
responsive to an affirmative response from the user, interfacing with the support ticketing application to generate the support ticket without further action from the user, wherein the interfacing provides the support ticket with information related to actions taken in the technical support session (Morgan – see par 29- A third type of transition is a transition from receiving support from an automated agent to receiving support from a human agent (any of the channel transitions described above including remaining on the same channel). See par 32 - The customer may accordingly seek support using a variety of devices and channels and may seek (or be assigned to) a support session with a human agent or an automated agent. As part of the support session, the customer device may establish a connection with API (applications programming interface) server 120 of the company. Although a company may have different servers for different types of support requests, for clarity of presentation, FIG. 1 illustrates a single API server 120 for handling support requests of customers; see par 37 - During a customer support session, the customer may transition to a different channel or between support from a human agent and an automated agent. The transition may occur for a variety of reasons, such as the following: the customer wishes to transition to a different channel for his own convenience; a phone call was dropped; the issue of the customer may more easily be resolved on a different channel; see par 49 - For example, session manager component 340 may create an entry in session data store 350 that includes an identifier of the session, an identifier of the customer, an identifier of the human agent assigned to the customer support session, and any other appropriate information); and
reporting, in the messaging application, an identity of the support ticket (Morgan – see par 35 - With support by a human agent, machine learning techniques may be used to assist the human agent. For example, machine learning may be used to provide automatic completions to text entered by the agent, suggest responses for an agent to send to a customer, or automatically provide resources relevant to the subject matter of the conversation, such as the customer's billing history or a manual for the cable modem of the customer; see par 49 - For example, session manager component 340 may create an entry in session data store 350 that includes an identifier of the session, an identifier of the customer, an identifier of the human agent assigned to the customer support session, and any other appropriate information. See par 130 - techniques described above for FIGS. 4A and 4B (and also the techniques described above for adapting FIGS. 4A and 4B for channel transitions with an automated agent) may also be used when a customer transitions from receiving support from an automated agent to receiving support from a human agent. See par 132 - In some implementations, a description (e.g., a summary or a transcript) of the automated support between the customer and the automated agent may be presented to the human agent who is now assisting the customer. In some implementations, a description of the automated support session may include information about one or more intents identified during the automated support session and information items identified during the support session;
see also Eberlein – see par 22 - meta-data user interface 108 can be configured to retrieve metadata (for example, procedure names, attributes, or descriptions) from a meta-data landscape 109).
Das, Wang, Eberlein, and Morgan are analogous art as they are directed to handling customer problems/issues (see Das Abstract; Wang Abstract; Eberlein Abstract; Morgan Abstract). Das discloses a customer having an issue results in opening a new case (See par 14). Wang discloses opening tickets for an issue (See par 25). Eberlein discloses "During a proposal phase or an automatic issue resolution phase, the monitoring system 112 can be configured to automatically detect errors based on one or more rules" (See par 22). Morgan improves upon Das, Wang, and Eberlein by disclosing customers seeking manual support or choosing to transition to support from a human agent (See par 29, 37), where session data can include an identifier (See par 49) along with additional information on things that already occurred, such as during an automated support session (See par 132). One of ordinary skill in the art would be motivated to further include the ability of customers to seek a human agent and provide information related to the issue to efficiently improve upon the feedback on case topics in Das and the use of tickets and sessions in Wang (See par 19, 25).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the use of chatbot and a human agent for escalated cases in Das to further utilize session data, closing of session, relative to trouble tickets from customers as disclosed in Wang, and automatic issue resolution in Eberlein, to further have customers be able to indicate they would like a person to assist them and other channel transitions as disclosed in Morgan, since the claimed invention is merely a combination of old elements, and in combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable and there is a reasonable expectation of success.
Concerning claim 19, Das and Wang disclose:
The method of claim 14, further comprising:
receiving a first message from a support person in the support ticketing application (Das – see par 43 - customers may communicate with the call center agents through chat and/or other communication mechanisms (e.g., voice calls or video calls) supported by the contact center solution 335.)
Morgan discloses:
providing the first message to the user in the messaging application (Morgan – see par 139, FIG. 8A includes a customer list portion 810, that may include a list of customers that the agent is currently communicating with. FIG. 8A also includes conversation portion 820 that allows the agent to see messages typed by a customer, type messages to the customer, and see the conversation history. );
receiving a second message from the user in the messaging application (Morgan – see par 139, FIG. 8A includes a customer list portion 810, that may include a list of customers that the agent is currently communicating with. FIG. 8A also includes conversation portion 820 that allows the agent to see messages typed by a customer, type messages to the customer, and see the conversation history. ); and
providing the second message to the support person in the support ticketing application (Morgan – see par 40 - A log of a customer support session may include text of messages exchanged during the session, a transcript of audio of the session, and other information, such as information about user interfaces presented to a customer or actions performed. see par 139, FIG. 8A includes a customer list portion 810, that may include a list of customers that the agent is currently communicating with. FIG. 8A also includes conversation portion 820 that allows the agent to see messages typed by a customer, type messages to the customer, and see the conversation history; see par 163 - For example, as the customer is going through the automated workflow, the obtained values of the information items may be presented to the human agent, such as in information portion 830 of FIG. 8B. The human agent may then review the received values, and if needed, request clarification from the customer or make changes).
[media_image1.png – 574 × 388, greyscale]
It would have been obvious to combine Das, Wang, Eberlein, and Morgan for the same reasons as claim 12 above. Das discloses having a call center with different channels of communication (e.g., chat), messages, and human agents (See par 25, 43). Wang discloses users conversing with an IT agent describing an IT issue (See par 22). Morgan improves upon Das, Wang, and Eberlein by further disclosing how messages exchanged between customers and agents can be displayed.
Response to Arguments
Applicant's arguments filed 1/20/26 have been fully considered but are not persuasive and/or are moot in view of the new grounds of rejection.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IVAN R GOLDBERG whose telephone number is (571) 270-7949. The examiner can normally be reached 8:30 AM - 4:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anita Coupe can be reached at 571-270-3614. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/IVAN R GOLDBERG/Primary Examiner, Art Unit 3619