DETAILED ACTION
This action is in response to the amendment filed on 01/05/2026.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/05/2026 has been entered.
Response to Amendment
Applicant’s amendment filed on 01/05/2026 has been entered. Claims 2, 9 and 16 have been amended. No claims have been canceled. No claims have been added. Claims 2 – 21 remain pending in this application, with claims 2, 9 and 16 being independent.
Allowable Subject Matter
Claims 3 – 8, 10 – 15 and 17 – 21 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Aside from the nonstatutory double patenting rejection of independent claims 2, 9 and 16 set forth below, it has been determined that the prior art fails to teach or suggest, in any reasonable combination, the limitations recited in the independent claims. For example, Hall (US 2017/0013130) discloses a system and method for managing service requests (Abstract), comprising the following: adding an interest (topic, [0025]) in a user profile and a correlation (experience level) between the user profile and an agent profile associated with an agent ([0059] [0068]), wherein the correlation is based on the interest topic ([0058] [0068]). Yet, Hall fails to teach or suggest, in any reasonable combination, the other limitations recited by the independent claims, including: sending, via a communication interface from a dynamic training response output generation control platform, one or more commands directing a user device to generate an initial dynamic training interface using initial dynamic training interface information, the initial dynamic training interface prompting selection of a guided dynamic training experience or an unguided dynamic training experience, the guided dynamic training experience including one or more methods for selection, and the unguided dynamic training experience including a conversational experience; receiving, via the communication interface by a natural language understanding (NLU) engine, a training request input captured using the initial dynamic training interface; determining, via the NLU engine, a natural language result output by performing natural language understanding and processing on the training request input; and receiving, via the communication interface by a profile correlation computing platform, one or more commands directing the profile correlation computing platform to update based on the natural language result output.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA, as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 2, 9 and 16 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 2 of U.S. Patent No. 10,897,434. Although the claims at issue are not identical, they are not patentably distinct from each other.
Current Application
2. (Currently Amended) A system comprising: a dynamic training response output generation control platform configured to send one or more commands, via a communication interface, directing a user device to generate an initial dynamic training interface using initial dynamic training interface information, the initial dynamic training interface prompting selection of a guided dynamic training experience or an unguided dynamic training experience, the guided dynamic training experience including one or more methods for selection, and the unguided dynamic training experience including a conversational experience; a natural language understanding (NLU) engine configured to receive, via the communication interface, a training request input captured using the initial dynamic training interface and perform natural language understanding and processing on the training request input to determine a natural language result output; and a profile correlation computing platform configured to receive, via the communication interface, one or more commands directing the profile correlation computing platform to update based on the natural language result output by adding a training topic interest indicated by the natural language result output to a user profile associated with a user of the user device and adding, to the user profile, a correlation between the user profile and an agent profile associated with an agent wherein the correlation is determined based on the training topic interest.
9. (Currently Amended) A method comprising: sending, via a communication interface from a dynamic training response output generation control platform, one or more commands directing a user device to generate an initial dynamic training interface using initial dynamic training interface information, the initial dynamic training interface prompting selection of a guided dynamic training experience or an unguided dynamic training experience, the guided dynamic training experience including one or more methods for selection, and the unguided dynamic training experience including a conversational experience; receiving, via the communication interface by a natural language understanding (NLU) engine, a training request input captured using the initial dynamic training interface; determining, via the NLU engine, a natural language result output by performing natural language understanding and processing on the training request input; and receiving, via the communication interface by a profile correlation computing platform, one or more commands directing the profile correlation computing platform to update based on the natural language result output by adding a training topic interest indicated by the natural language result output to a user profile associated with a user of the user device and adding, to the user profile, a correlation between the user profile and an agent profile associated with an agent wherein the correlation is determined based on the training topic interest.
16. (Currently Amended) One or more non-transitory computer-readable media storing instructions that, when executed by a computing platform comprising at least one processor, cause the computing platform to: send, via a communication interface, one or more commands directing a user device to generate an initial dynamic training interface using initial dynamic training interface information, the initial dynamic training interface prompting selection of a guided dynamic training experience or an unguided dynamic training experience, the guided dynamic training experience including one or more methods for selection, and the unguided dynamic training experience including a conversational experience; receive, via the communication interface, a training request input captured using the initial dynamic training interface; determine a natural language result output by performing natural language understanding and processing on the training request input; and receive, via the communication interface, one or more commands directing the profile correlation computing platform to update based on the natural language result output by adding a training topic interest indicated by the natural language result output to a user profile associated with a user of the user device and adding, to the user profile, a correlation between the user profile and an agent profile associated with an agent wherein the correlation is determined based on the training topic interest.
US 10,897,434
1. A computing platform, comprising: at least one processor; a communication interface communicatively coupled to the at least one processor; and memory storing computer-readable instructions that, when executed by the at least one processor, cause the computing platform to: receive, from a user device, a request for a dynamic training interface; generate, in response to receiving the request for the dynamic training interface, initial dynamic training interface information; generate one or more commands directing the user device to generate an initial dynamic training interface using the initial dynamic training interface information, wherein the initial dynamic training interface prompts a user to select either a guided dynamic training experience or an unguided dynamic training experience, and wherein the guided dynamic training experience prompts the user to make selections and the unguided dynamic training experience prompts the user for questions; send, via the communication interface, the initial dynamic training interface information and the one or more commands directing the user device to generate the initial dynamic training interface using the initial dynamic training interface information; receive, from the user device and in response to the initial dynamic training interface, a training request input; generate one or more commands directing a natural language understanding (NLU) engine to perform natural language understanding and processing on the training request input to determine a natural language result output; send, to the NLU engine, the training request input and the one or more commands directing the NLU engine to perform natural language understanding and processing on the training request input to determine the natural language result output; receive, from the NLU engine, the natural language result output; determine one or more third party data sources that correspond to the natural language result output; generate one or more commands
directing the one or more third party data sources to send source data corresponding to the natural language result output; send, to the one or more third party data sources, the one or more commands directing the one or more third party data sources to send source data corresponding to the natural language result output; receive, from the one or more third party data sources, the source data corresponding to the natural language result output; generate a dynamic training response output based on the natural language result output and the source data; generate one or more commands directing the user device to cause display of the dynamic training response output; and send, to the user device, the dynamic training response output and the one or more commands directing the user device to cause display of the dynamic training response output.
2. The computing platform of claim 1, wherein the memory stores additional computer-readable instructions that, when executed by the at least one processor, further cause the computing platform to: establish a wireless data connection with a profile correlation computing platform; generate one or more commands directing the profile correlation computing platform to update based on the natural language result output; and send, using the wireless data connection with the profile correlation computing platform, the one or more commands directing the profile correlation computing platform to update based on the natural language result output.
As shown above, claim 2 of US 10,897,434 recites the limitations of claim 2 of the current application, except for the following: adding, as the update, a training topic interest indicated by the natural language result output to a user profile associated with a user of the user device and adding, to the user profile, a correlation between the user profile and an agent profile associated with an agent wherein the correlation is determined based on the training topic interest.
However, specifying information used to update a profile is considered an obvious variant of updating a profile. Therefore, claim 2 of the current application and claim 2 of US 10,897,434 are obvious variants.
As shown above, claim 2 of US 10,897,434 recites the limitations of claim 9 of the current application, except for the following: a method embodiment; and adding, as the update, a training topic interest indicated by the natural language result output to a user profile associated with a user of the user device and adding, to the user profile, a correlation between the user profile and an agent profile associated with an agent wherein the correlation is determined based on the training topic interest.
However, specifying information used to update a profile is considered an obvious variant of updating a profile. Furthermore, a method embodiment of an invention is considered an obvious variant of a computer system embodiment of the invention. Therefore, claim 9 of the current application and claim 2 of US 10,897,434 are obvious variants.
As shown above, claim 2 of US 10,897,434 recites the limitations of claim 16 of the current application, except for the following: a non-transitory medium embodiment; and adding, as the update, a training topic interest indicated by the natural language result output to a user profile associated with a user of the user device and adding, to the user profile, a correlation between the user profile and an agent profile associated with an agent wherein the correlation is determined based on the training topic interest.
However, specifying information used to update a profile is considered an obvious variant of updating a profile. Furthermore, a non-transitory medium embodiment of an invention is considered an obvious variant of a computer system embodiment of the invention. Therefore, claim 16 of the current application and claim 2 of US 10,897,434 are obvious variants.
Response to Arguments
Applicant's arguments filed on 01/05/2026 have been fully considered but are not persuasive regarding the double patenting rejection. As shown above, the nonstatutory double patenting rejection has not been overcome by amendment or argument.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SONIA L GAY whose telephone number is (571)270-1951. The examiner can normally be reached Monday-Friday 9-5 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Daniel Washburn, can be reached at 571-272-5551. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SONIA L GAY/Primary Examiner, Art Unit 2657