Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
III. DETAILED ACTION
Claims 1-20 are presented for examination.
Claim Objections
The following terms have been used without clear definition in the claims. Further clarification is required.
“function information”, “function determination model”, “target”, “index number”, “ground truth item”, “loss function/loss terms”, “freezing model weights”, “injecting trainable rank decomposition matrices into each layer of the function determination model”, “any other weight”, “token term”, “irrelevant query”, “parallel function”
“not received”, “any other weight”, and “irrelevant query” are considered negative limitations by the Examiner, the scope of which has not been clearly defined.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-9 and 11-20 are rejected under 35 U.S.C. 103 as being unpatentable over Hong et al. (US 2021/0136433) in view of Klein et al. (US 2022/0382979).
As to claims 1, 19 and 20, Hong discloses a method for implementing functions automatically, comprising: at a computer system including one or more processors and memory:
receiving a natural language query (natural language, Hong [0005]); and
in response to the natural language query, automatically (Hong [0005]) applying (applying the matching model, Hong [0130]) a function determination model (function determination model, Hong [0091]-[0100]) to generate function information (generate, Hong [0113], [0138]) of a target function based on the natural language query (Hong [0091]-[0100]),
the function information further including identification information (identification information, Hong [0141]) and one or more parameters of the target function (parameters, Hong [0167]); and
implementing the target function (performing an operation according to the user's intention based on the received voice input and a voice assistant model that provides pieces of information required to perform a service, Hong [0006]) based on the function information (Hong [0167]);
wherein one or more user applications are configured to implement a plurality of predefined functions including the target function (application, Hong [0070]).
Hong does not explicitly teach natural language query.
Klein teaches natural language query (natural language query [0020][0021]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Hong by the teaching of Klein to include a natural language query, with the motivation to provide better commonsense reasoning, as taught by Klein [0001].
As to claim 2, Hong as modified teaches a method of claim 1, the computer system including
a client device that receives the natural language query (natural language [0005]), the method further comprising:
locally applying, by the client device, the function determination model to generate the function information associated with the target function (Hong [0070]).
As to claim 3, Hong as modified teaches a method of claim 1, wherein
the computer system includes a client device that is communicatively coupled to a function server, and the natural language query is provided to the function server (natural language [0005]), further comprising:
applying, by the function server, the function determination model to generate the function information associated with the target function (Hong [0070]).
As to claim 4, Hong as modified teaches a method of claim 1, wherein
the identification information of the target function includes an index number (index, Klein [0025]) identifying one of a plurality of syntax elements (parse, Hong [0127]; determining a device related to the intent recognized from the text as the operation-performing device, based on a matching model for determining a relation between the intent and the device, Hong [0128]) corresponding to a plurality of function names of the plurality of predefined functions (identify, Hong [0088]).
As to claim 5, Hong as modified teaches a method of claim 1, wherein
the identification information of the target function includes a syntax element corresponding to a function name of the target function (Hong [0128]).
As to claim 6, Hong as modified teaches a method of claim 1, further comprising:
obtaining a base language model configured to process natural language queries (trained Hong [0262]); and
training the base language model (“[a] base sentence with perturbations may be ingested to train a model 206. In this case, a conditional language model (LM)” may be used, Klein [0044]) using a corpus of training data to generate the function determination model (Hong [0262]).
As to claim 7, Hong as modified teaches a method of claim 1, further comprising
training the function determination model using a corpus of training data (Hong [0262]);
wherein the corpus of training data include a plurality of training natural language queries (Klein [0020][0021]) and a plurality of ground truth items (ground truth Klein [0062]-[0065]); and
wherein each training natural language query corresponds to a respective ground truth item (ground truth Klein [0062]-[0065]), and each ground truth item is associated with a respective one of the plurality of predefined functions associated with the one or more user applications (ground truth Klein [0062]-[0065]) (Hong [0262]).
As to claim 8, Hong as modified teaches a method of claim 7, wherein: training the function determination model further comprises
generating a loss function (loss terms, Klein [0062]-[0065]) based on a weighted (weight, Hong [0381]; weight, Klein [0029])
combination (aggregating the scores, Klein [0069]) of a plurality of loss terms (loss terms, Klein [0062]-[0065]);
the plurality of loss terms including a functional token term and one or more alternative terms distinct from the functional token term (loss terms, Klein [0062]-[0065], [0029]);
the functional token term indicates an accuracy level (accuracy, Klein [0045], [0072]-[0073]) of the identification information of respective function information generated for each training natural language query (accuracy, Klein [0045], [0072]-[0073]); and
a weight (weight, Hong [0381]; weight, Klein [0029]) of the functional token term is greater than any other weight of a remainder of the plurality of loss terms (loss terms, Klein [0062]-[0066]).
As to claim 9, Hong as modified teaches a method of claim 7, further comprising, after training the function determination model using the corpus of training data:
freezing (the contrastive learning model is derived over a limited set of commonsense concepts associated with consistent perturbations, Klein [0014]) model weights of the function determination model (weight, Klein [0029]); and
injecting trainable rank decomposition matrices (dimensionality, Klein [0052]) into each layer of the function determination model (Klein [0052]).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Hong et al. (US 2021/0136433) in view of Klein et al. (US 2022/0382979), further in view of Ni et al. (US 2019/0278857).
As to claim 10, the teachings of Hong as modified have been discussed,
Hong does not teach context information associated with the natural language query is not received.
Ni teaches that context information associated with the natural language query is not received (context is omitted in spoken follow-on queries; context is not provided in follow-on queries, Ni [0029]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Hong by the teaching of Ni so that context information associated with the natural language query is not received, with the motivation to provide better query processing by improved context utilization, as taught by Ni [0005], [0007].
As to claim 11, Hong as modified teaches a method of claim 1, wherein
the function information associated with the target function is generated from the natural language query, independently of any other query distinct from the natural language query, and wherein the function determination model includes a large language model (LLM) configured to process the natural language query (Hong figs. 1 and 5).
As to claim 12, Hong as modified teaches a method of claim 1, wherein
the natural language query includes the one or more parameters, and the natural language query is received via a software program configured to communicate with each of the one or more user applications via an Application Programming Interface (API) (Hong fig. 5).
As to claim 13, Hong as modified teaches a method of claim 1, wherein
the plurality of predefined functions includes an irrelevant query alert function and a remainder of the plurality of predefined functions that is associated with the one or more user applications, and
implementing the target function further comprises: in accordance with a determination that the identification information corresponds to the irrelevant query alert function (relevant synonyms, Klein [0026]),
generating an alert message on a user interface, indicating (Klein fig. 5) that the natural language query is not associated with the remainder of the plurality of predefined functions (Note: since Klein [0026] discloses relevant synonyms, it also suggests irrelevant synonyms).
As to claim 14, Hong as modified teaches a method of claim 1, further comprising:
executing a program distinct from the one or more user applications (Hong [0091]-[0100]); and
displaying a graphical user interface of the program, wherein the natural language query is received via the graphical user interface (Hong fig. 5).
As to claim 15, Hong as modified teaches a method of claim 1, wherein
the target function includes a plurality of parallel functions, and implementing the target function further comprises:
implementing each of the plurality of parallel functions by a respective distinct user application identified by respective identification information and based on a subset of respective one or more parameters of the respective parallel function (second function determination model, Hong [0167]).
As to claim 16, Hong as modified teaches a method of claim 1, wherein
the target function includes a first function and a second function nested in the first function, and implementing the target function further comprises:
implementing the second function to generate an intermediate parameter (parameters, Hong [0167]); and
implementing the first function using the intermediate parameter (parameters, Hong [0167]).
As to claim 17, Hong as modified teaches a method of claim 1, wherein
the one or more user applications include a first application initiated and executed to implement the target function in response to the natural language query (Hong [0167]), and
the function information further includes application information identifying the first application (Hong [0167]).
As to claim 18, Hong as modified teaches a method of claim 1, wherein:
each of the one or more user applications is configured to implement a set of respective functions ([0091]-[0100]);
the plurality of predefined functions include the set of respective functions ([0091]-[0100]); and
the function determination model is trained to generate function information of each of the plurality of predefined functions (Hong [0091]-[0100]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Yicun Wu, whose telephone number is 571-272-4087. The examiner can normally be reached 8:00 am to 4:30 pm, Monday through Friday.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kavita Stanley, can be reached at (571) 272-8352. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Any inquiry of a general nature or relating to the status of this application or proceeding should be directed to the receptionist whose telephone number is 571-272-2100.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR.
Status information for unpublished applications is available through Private PAIR only.
For more information about the PAIR system:
"http://portal.uspto.gov/external/portal/pair"
Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
Yicun Wu
Patent Examiner
Technology Center 2100
/YICUN WU/
Primary Examiner, Art Unit 2153