Prosecution Insights
Last updated: April 19, 2026
Application No. 19/007,416

SYSTEMS AND METHODS FOR HYBRID MULTI MACHINE LEARNING AGENT ORCHESTRATION

Status: Non-Final OA (§103)
Filed: Dec 31, 2024
Examiner: WU, YICUN
Art Unit: 2153
Tech Center: 2100 — Computer Architecture & Software
Assignee: HSBC Group Management Services Limited
OA Round: 1 (Non-Final)
Grant Probability: 81% (Favorable)
OA Rounds: 1-2
To Grant: 3y 3m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 81% (above average) — 486 granted / 598 resolved, +26.3% vs TC avg
Interview Lift: +17.3% in resolved cases with interview
Typical Timeline: 3y 3m avg prosecution; 16 currently pending
Career History: 614 total applications across all art units

Statute-Specific Performance

§101: 11.5% (-28.5% vs TC avg)
§103: 47.5% (+7.5% vs TC avg)
§102: 26.3% (-13.7% vs TC avg)
§112: 3.7% (-36.3% vs TC avg)
Tech Center averages are estimates • Based on career data from 598 resolved cases

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Objections

"Additional data payloads" has not been clearly defined in the claims. "Different activation or input layers tracking different derivatives of data" has not been clearly defined in the claims.

Drawings

The drawings filed on 12/31/2024 are unclear and hard to read.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-4, 7-14, and 16-20 are rejected under 35 U.S.C. 103(a) as being unpatentable over Tong et al. (U.S. 20240419950) in view of AHMADIDANESHAS et al. (US 20210249002).

As to claim 1, Tong discloses a computer system adapted for providing a multi-domain primary computational conversation agent (fig. 1), the computer system comprising: a computer processor operating in conjunction with computer memory and a non-transitory computer data storage (fig. 15), the computer processor configured to: maintain the multi-domain primary computational conversation agent configured to receive a main query string ([0056]-[0057]), the multi-domain primary computational conversation agent coupled to a plurality of secondary domain specialized computational conversation agents (an "orchestrator" that identifies an appropriate ML agent for communicating with the particular domain, [0056]-[0057]); broadcast a subquery message to the plurality of secondary domain specialized computational conversation agents (broadcast, [0058], fig. 4, item 418); receive one or more response data messages from each of the secondary domain specialized computational conversation agents (response, [0094], fig. 2), each data message including a combination of a proposed response string (fig. 4) and a corresponding confidence score value (determine the consistency and accuracy of the output, [0073]); process each of the proposed response strings and the corresponding confidence score values to generate a candidate score for each of the proposed response strings based on the multi-domain primary computational conversation agent scoring (fig. 4); combine at least two of the proposed response strings to generate a combined response string using a coupled trained large language model (fig. 4); and output the combined response string as a rendered data object on a coupled user interface instance for rendering on a device associated with the user (fig. 4).

Tong does not explicitly teach "from a user having a user type and user profile fields" and "based on the user type and user profile fields from the querying party." AHMADIDANESHAS teaches these limitations (user profile, [0013], [0127]). It would have been obvious to a person having ordinary skill in the art at the time the invention was made to have modified Tong by the teaching of AHMADIDANESHAS to include these limitations, with the motivation to provide better automated conversation technology as taught by AHMADIDANESHAS ([0008]).

As to claim 2, Tong as modified teaches the system of claim 1, wherein the plurality of secondary domain specialized computational conversation agents are trained machine learning models each having different feed sources and feed parameter characteristics coupled to one or more data source feed sources (Tong [0004], [0024]).

As to claim 3, Tong as modified teaches the system of claim 1, wherein the plurality of secondary domain specialized computational conversation agents are trained machine learning models each having different neural network architectures or each having different domain specializations (Tong [0097]).

As to claim 4, Tong as modified teaches the system of claim 1, wherein prior to rendering the rendered data object, the combined search string or the rendered data object is provided to an intermediate reviewer user interface provided to a human reviewer for awaiting provisioning of an approval signal (human involvement required, AHMADIDANESHAS [0090]), the approval signal triggering the rendering on the device associated with the user (AHMADIDANESHAS [0089]).

As to claim 7, Tong as modified teaches the system of claim 3, wherein the different neural network architectures include agents that are trained with a same set of data sources but the different neural network architectures are configured with different activation or input layers tracking different derivatives of data from the same set of data sources (Tong [0081]).

As to claim 8, Tong as modified teaches the system of claim 7, wherein the different derivatives of data from the same set of data sources include a real-time data feed, and one or more derivations based on higher order rates of changes of data elements from the real-time data feed (data is sanitized in real time, AHMADIDANESHAS [0322]).

As to claim 9, Tong as modified teaches the system of claim 7, wherein the different derivatives of data from the same set of data sources include a real-time data feed, and one or more derivations based on aggregations of data in the real-time data feed (data is sanitized in real time, AHMADIDANESHAS [0322]).

As to claim 10, Tong as modified teaches the system of claim 1, wherein the computer system is a special purpose computing appliance residing in a data center and coupled to a message bus receiving data feeds from one or more computing endpoints indicative of computing asset inventory and operational metrics, and the special purpose computing appliance is configured to receive the main query string from the message bus and to output the combined response string on the message bus (Tong fig. 15).

Claims 5-6 and 15 are rejected under 35 U.S.C. 103(a) as being unpatentable over Tong et al. (U.S. 20240419950) in view of AHMADIDANESHAS et al. (US 20210249002), further in view of Shah et al. (US 20240420161).

As to claim 5, Tong as modified teaches the system of claim 1, wherein the user type and the user profile fields (AHMADIDANESHAS, user profile, [0013], [0127]), and the generation of the combined response string using the coupled trained large language model includes additionally providing the user type and the user profile fields as additional data payloads to the coupled trained large language model (Tong [0078]). Tong as modified does not teach that user profile fields are obtained from an active directory server. Shah teaches that user profile fields are obtained from an active directory server ([0251]). It would have been obvious to a person having ordinary skill in the art at the time the invention was made to have modified Tong by the teaching of Shah to include obtaining user profile fields from an active directory server, with the motivation to allow organizations to make informed decisions, improve operational efficiency, and manage effectively, as taught by Shah ([0003]).

As to claim 6, Tong as modified teaches the system of claim 5, wherein the broadcast of the subquery message (Tong [0078]) includes adding the additional data payloads to the subquery message in addition to the main query string (Tong [0078]).

As to claims 11-20, the limitations of these claims have been noted in the rejection above. They are therefore rejected as set forth above.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Yicun Wu, whose telephone number is 571-272-4087. The examiner can normally be reached 8:00 am to 4:30 pm, Monday through Friday. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kavita Stanley, can be reached at 571-272-8352.

The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Any inquiry of a general nature or relating to the status of this application or proceeding should be directed to the receptionist, whose telephone number is 571-272-2100. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://portal.uspto.gov/external/portal/pair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (in USA or Canada) or 571-272-1000.

Yicun Wu
Patent Examiner
Technology Center 2100
/YICUN WU/
Primary Examiner, Art Unit 2153
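The broadcast/score/combine loop the examiner maps onto Tong in the claim-1 rejection is a recognizable orchestration pattern. A minimal sketch of that pattern follows; every name, scoring heuristic, and agent stub here is hypothetical and drawn neither from the application nor from the cited references, and the string concatenation merely stands in for the claimed LLM-based combination:

```python
# Hypothetical sketch of the claimed orchestration: a primary agent broadcasts
# a subquery to domain-specialized secondary agents, collects (response string,
# confidence score) pairs, generates a candidate score for each, and combines
# the top proposals into one response string.
from dataclasses import dataclass
from typing import Callable

@dataclass
class AgentResponse:
    agent: str
    text: str
    confidence: float  # confidence score in [0.0, 1.0]

def broadcast(subquery: str,
              agents: dict[str, Callable[[str], AgentResponse]]) -> list[AgentResponse]:
    """Send the subquery to every secondary agent and gather their responses."""
    return [handler(subquery) for handler in agents.values()]

def candidate_score(resp: AgentResponse, query: str) -> float:
    """Toy candidate scoring: confidence weighted by naive term overlap."""
    overlap = len(set(resp.text.lower().split()) & set(query.lower().split()))
    return resp.confidence * (1 + overlap)

def combine_top(responses: list[AgentResponse], query: str, k: int = 2) -> str:
    """Rank proposals by candidate score and merge the top k; a real system
    would hand the top proposals to an LLM for synthesis instead."""
    ranked = sorted(responses, key=lambda r: candidate_score(r, query), reverse=True)
    return " ".join(r.text for r in ranked[:k])

# Two stand-in secondary agents with fixed, canned outputs.
agents = {
    "network": lambda q: AgentResponse("network", "Router CPU load is normal.", 0.9),
    "storage": lambda q: AgentResponse("storage", "Disk pool nearing capacity.", 0.7),
}
responses = broadcast("status of data center assets", agents)
print(combine_top(responses, "status of data center assets"))
```

In this toy version the candidate score and the concatenation take the place of the claimed "candidate score" generation and the coupled trained large language model, respectively; the structural point is only that scoring and combination happen in the primary agent, not in the secondaries.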

Prosecution Timeline

Dec 31, 2024
Application Filed
Dec 22, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602351
Methods and Systems for Archiving File System Data Stored by a Networked Storage System
2y 5m to grant • Granted Apr 14, 2026
Patent 12547643
UNIFIED CONTEXT-AWARE CONTENT ARCHIVE SYSTEM
2y 5m to grant • Granted Feb 10, 2026
Patent 12541693
GENERATING AND UPGRADING KNOWLEDGE GRAPH DATA STRUCTURES
2y 5m to grant • Granted Feb 03, 2026
Patent 12536239
METHODS AND SYSTEMS FOR REFRESHING CURRENT PAGE INFORMATION
2y 5m to grant • Granted Jan 27, 2026
Patent 12511491
SYSTEM AND METHOD FOR MANAGING AND OPTIMIZING LOOKUP SOURCE TEMPLATES IN A NATURAL LANGUAGE UNDERSTANDING (NLU) FRAMEWORK
2y 5m to grant • Granted Dec 30, 2025
Study what changed in these cases to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 81%
With Interview: 99% (+17.3%)
Median Time to Grant: 3y 3m
PTA Risk: Low
Based on 598 resolved cases by this examiner. Grant probability derived from career allow rate.
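As a sanity check, the headline projections follow from simple arithmetic on the stated career data. The sketch below assumes (this is an assumption, not a documented formula) that the with-interview figure is simply the baseline allow rate plus the interview lift, capped at 100% and rounded to the nearest percent:

```python
# Reproducing the dashboard's headline numbers from the stated career data:
# 486 granted of 598 resolved, with a stated +17.3% interview lift.
granted, resolved = 486, 598
allow_rate = granted / resolved                    # baseline grant probability
interview_lift = 0.173                             # stated interview lift
with_interview = min(allow_rate + interview_lift, 1.0)

print(f"Grant probability:  {allow_rate:.0%}")
print(f"With interview:     {with_interview:.0%}")
```

Under that assumption, 486/598 ≈ 81.3% rounds to the displayed 81%, and 81.3% + 17.3% ≈ 98.6% rounds to the displayed 99%.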
