Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on September 18, 2025, has been entered.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-17 and 19-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding Claim 1,
Claim 1 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 1 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“select, based on the acquired information and the acquired usage data, a computer vision task”
“determine a set of constraints associated with an implementation of the selected computer vision task on the electronic device”
“select a first neural network as a seed model for the selected computer vision task”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply (See MPEP 2106.05(f)) and insignificant extra-solution activity (See MPEP 2106.05(g)).
The limitations:
“A system, comprising: circuitry configured to…”
“execute, based on the seed model and the determined set of constraints, one or more operations”
“obtain a second neural network based on the execution of the one or more operations, wherein the obtained second neural network is trained on the selected computer vision task and the one or more operations include a neural architecture search”
“implement an Application Programming Interface (API) call functionality on the electronic device, wherein the API call functionality includes an API call code to remotely call the deployed second neural network”
“update, based on the implementation of the API call functionality, a software application on the electronic device to include an end-user feature, wherein the end-user feature implements, on the electronic device, the deployed second neural network for the selected computer vision task”
As drafted, are additional elements that amount to no more than mere instructions to apply. See MPEP 2106.05(f).
The limitations:
“acquire information associated with one or more functional components of an electronic device, wherein the acquired information comprises hardware specification information that indicates hardware resources available at the electronic device”
“acquire usage data associated with the electronic device”
“deploy the obtained second neural network on a cloud server”
As drafted, are additional elements that amount to no more than insignificant extra-solution activity. See MPEP 2106.05(g).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply” or “insignificant extra-solution activity.” Specifically, the acquiring and deploying limitations recite the well-understood, routine, and conventional activity of receiving and transmitting data over a network. MPEP 2106.05(d)(II); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network). Mere instructions to apply and insignificant extra-solution activity cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 2,
Claim 2 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 2 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: See corresponding analysis of claim 1.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that do not apply the exception in a meaningful way (See MPEP 2106.05(e)).
The limitations:
“wherein the electronic device is an image-capture device and the software application is an imaging software installed on the electronic device”
As drafted, are additional elements that do not apply the exception in a meaningful way. See MPEP 2106.05(e).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements do not apply the exception in a meaningful way. The claim is not patent eligible.
Regarding Claim 3,
Claim 3 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 3 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“determine the set of constraints based on the hardware specification information, and the determined set of constraints includes one or more hardware-specific constraints”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply (See MPEP 2106.05(f)).
The limitations:
“the circuitry is further configured to…”
As drafted, are additional elements that amount to no more than mere instructions to apply. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are mere instructions to apply an exception for the abstract ideas. The claim is not patent eligible.
Regarding Claim 4,
Claim 4 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 4 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“determine the set of constraints based on the cost information, and the determined set of constraints includes one or more cost constraints”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that do not apply the exception in a meaningful way (See MPEP 2106.05(e)) and are mere instructions to apply (See MPEP 2106.05(f)).
The limitations:
“wherein the acquired information includes cost information associated with the one or more functional components”
As drafted, are additional elements that do not apply the exception in a meaningful way. See MPEP 2106.05(e).
The limitations:
“the circuitry is further configured to…”
As drafted, are additional elements that amount to no more than mere instructions to apply. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements do not apply the exception in a meaningful way or are mere instructions to apply an exception for the abstract ideas. The claim is not patent eligible.
Regarding Claim 5,
Claim 5 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 5 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: See corresponding analysis of claim 1.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that do not apply the exception in a meaningful way (See MPEP 2106.05(e)) and are mere instructions to apply (See MPEP 2106.05(f)).
The limitations:
“wherein the acquired usage data comprises: a digital footprint on the software application; a set of category tags associated with image-based content created through the software application; a user preference for the image-based content on the electronic device; and a usage pattern of a plurality of existing functionalities on the electronic device”
As drafted, are additional elements that do not apply the exception in a meaningful way. See MPEP 2106.05(e).
The limitations:
“wherein the plurality of existing functionalities implements, on the electronic device, a type of neural network for one or more computer vision tasks and the one or more computer vision tasks include the selected computer vision task”
As drafted, are additional elements that amount to no more than mere instructions to apply. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements do not apply the exception in a meaningful way or are mere instructions to apply an exception for the abstract ideas. The claim is not patent eligible.
Regarding Claim 6,
Claim 6 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 6 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“a determination of a search space that includes a collection of different types of layer”
“generate a candidate neural network based on a modification of an architecture of the seed model”
“configure hyperparameters of the generated candidate neural network based on the determined set of constraints”
“select a training dataset for the selected computer vision task”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“train, based on the selected training dataset, the generated candidate neural network on the selected computer vision task”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply.” Mere instructions to apply cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 7,
Claim 7 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 7 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: See corresponding analysis of claim 6.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“execute a quantization-aware training process to train the generated candidate neural network, and the quantization-aware training process includes quantization of weight parameters of the generated candidate neural network from a current bit-depth representation to a first bit-depth representation”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply.” Mere instructions to apply cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 8,
Claim 8 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 8 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“execution of a pruning operation on weight parameters of the trained candidate neural network”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“obtain the second neural network, from the trained candidate neural network further based on the execution of the pruning operation”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply.” Mere instructions to apply cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 9,
Claim 9 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 9 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: See corresponding analysis of claim 6.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“execution of a post-training quantization operation on weight parameters of the trained candidate neural network”
“obtain the second neural network, from the trained candidate neural network further based on the execution of the post-training quantization operation”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply.” Mere instructions to apply cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 10,
Claim 10 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 10 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“select a teacher neural network pre-trained on the selected computer vision task”
“select the generated candidate neural network as a student network”
“produce, based on the selected training dataset, a plurality of inferences”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that do not apply the exception in a meaningful way (See MPEP 2106.05(e)) and are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“wherein the one or more operations further include a knowledge distillation operation”
As drafted, are additional elements that do not apply the exception in a meaningful way. See MPEP 2106.05(e).
The limitations:
“by the selected teacher neural network”
“train the generated candidate neural network based on the produced plurality of inferences”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements do not apply the exception in a meaningful way or are mere instructions to apply an exception for the abstract ideas. The claim is not patent eligible.
Regarding Claim 11,
Claim 11 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 11 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“evaluate, based on the determined set of constraints, one or more performance indicators of the trained candidate neural network”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“train the generated candidate neural network further based on the determined set of constraints”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply.” Mere instructions to apply cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 12,
Claim 12 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 12 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: See corresponding analysis of claim 11.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“re-execute the neural architecture search based on the evaluated one or more performance indicators being below a threshold”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply.” Mere instructions to apply cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 13,
Claim 13 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 13 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: See corresponding analysis of claim 11.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“obtain the second neural network further based on the trained candidate neural network and the evaluated one or more performance indicators being above a threshold.”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply.” Mere instructions to apply cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 14,
Claim 14 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 14 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: See corresponding analysis of claim 1.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply (See MPEP 2106.05(f)) and insignificant extra-solution activity (See MPEP 2106.05(g)).
The limitations:
“control the electronic device to display a User Interface (UI) that includes one or more of a first option to purchase the end-user feature, a second option to subscribe to the end-user feature, a description that includes an accuracy of the obtained second neural network and device resource information associated with the end-user feature, or a price associated with each of the first option and the second option”
“update the software application further based on the received selection”
As drafted, are additional elements that amount to no more than mere instructions to apply. See MPEP 2106.05(f).
The limitations:
“receive, from the electronic device, a selection of one of the first option or the second option”
As drafted, are additional elements that amount to no more than insignificant extra-solution activity. See MPEP 2106.05(g).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply” or “insignificant extra-solution activity.” Specifically, the receiving limitations recite the well-understood, routine, and conventional activity of receiving and transmitting data over a network. MPEP 2106.05(d)(II); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network). Mere instructions to apply and insignificant extra-solution activity cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 15,
Claim 15 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 15 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: See corresponding analysis of claim 14.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“control the electronic device to display the UI based on the acquired usage data”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply.” Mere instructions to apply cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 16,
Claim 16 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 16 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“determine the price based on one or more of a cost of one of the electronic device or a functional component of the one or more functional components of the electronic device, a total time, that includes a training time to obtain the second neural network from the seed model, a complexity of the end-user feature, a cost of dataset associated with the obtained second neural network, competitive or business intelligence data associated with a plurality of users of the electronic device, or an estimate-demand for the end-user feature”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: See corresponding analysis of claim 14.
Step 2B Analysis: See corresponding analysis of claim 14.
Regarding Claim 17,
Claim 17 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 17 is directed to a system, comprising: circuitry, which is directed to a machine, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“an update of weight parameters of the existing neural network model based on weight parameters of the deployed second neural network”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply an exception (See MPEP 2106.05(f)).
The limitations:
“a replacement of an existing neural network model on the electronic device with the deployed second neural network”
“an installation of the deployed second neural network as a component of the software application on the electronic device”
As drafted, are additional elements that amount to no more than mere instructions to apply an exception for the abstract ideas. See MPEP 2106.05(f).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply”. Mere instructions to apply cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 19,
Claim 19 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 19 is directed to a method, which is directed to a process, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“selecting… based on the acquired information and the acquired usage data, a computer vision task”
“determining… a set of constraints associated with an implementation of the selected computer vision task on the electronic device”
“selecting… a first neural network as a seed model for the selected computer vision task”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply (See MPEP 2106.05(f)) and insignificant extra-solution activity (See MPEP 2106.05(g)).
The limitations:
“comprising: in a system that includes circuitry”
“by the circuitry”
“executing, by the circuitry, based on the seed model and the determined set of constraints, one or more operations”
“obtaining, by the circuitry, a second neural network based on the execution of the one or more operations, wherein the second neural network is trained on the selected computer vision task and the one or more operations include a neural architecture search”
“implementing, by the circuitry, an Application Programming Interface (API) call functionality on the electronic device, wherein the API call functionality includes an API call code to remotely call the deployed second neural network”
“updating, by the circuitry, based on the implementation of the API call functionality, a software application on the electronic device to include an end-user feature, wherein the end-user feature implements, on the electronic device, the deployed second neural network for the selected computer vision task”
As drafted, are additional elements that amount to no more than mere instructions to apply. See MPEP 2106.05(f).
The limitations:
“acquiring, by the circuitry, information associated with one or more functional components of an electronic device, wherein the acquired information comprises hardware specification information that indicates hardware resources available at the electronic device”
“acquiring, by the circuitry, usage data associated with the electronic device”
“deploying, by the circuitry, the obtained second neural network on a cloud server”
As drafted, are additional elements that amount to no more than insignificant extra-solution activity. See MPEP 2106.05(g).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply” or “insignificant extra-solution activity”. Specifically, the acquiring and deploying limitations recite the well-understood, routine, and conventional activity of receiving and transmitting data over a network. MPEP 2106.05(d)(II); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network). Mere instructions to apply and insignificant extra-solution activity cannot provide an inventive concept. The claim is not patent eligible.
Regarding Claim 20,
Claim 20 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 Analysis: Claim 20 is directed to a non-transitory computer-readable medium, which is directed to an article of manufacture, one of the statutory categories.
Step 2A Prong One Analysis: The limitations:
“selecting, based on the acquired information and the acquired usage data, a computer vision task”
“determining a set of constraints associated with an implementation of the selected computer vision task on the electronic device”
“selecting a first neural network as a seed model for the selected computer vision task”
As drafted, under their broadest reasonable interpretations, cover mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). The above limitations in the context of this claim correspond to mental processes, e.g., evaluation and judgment with the assistance of pen and paper.
Step 2A Prong Two Analysis: The judicial exceptions are not integrated into a practical application. In particular, the claim recites additional elements that are mere instructions to apply (See MPEP 2106.05(f)) and insignificant extra-solution activity (See MPEP 2106.05(g)).
The limitations:
“A non-transitory computer-readable medium having stored thereon, computer-executable instructions that when executed by a computer in a system, causes the system to execute operations”
“executing, based on the seed model and the determined set of constraints, one or more operations”
“obtaining a second neural network based on the execution of the one or more operations, wherein the second neural network is trained on the selected computer vision task and the one or more operations include a neural architecture search”
“implementing an Application Programming Interface (API) call functionality on the electronic device, wherein the API call functionality includes an API call code to remotely call the deployed second neural network”
“updating, based on the implementation of the API call functionality, a software application on the electronic device to include an end-user feature, wherein the end-user feature implements, on the electronic device, the deployed second neural network for the selected computer vision task”
As drafted, are additional elements that amount to no more than mere instructions to apply. See MPEP 2106.05(f).
The limitations:
“acquiring information associated with one or more functional components of an electronic device, wherein the acquired information comprises hardware specification information that indicates hardware resources available at the electronic device”
“acquiring usage data associated with the electronic device”
“deploying the obtained second neural network on a cloud server”
As drafted, are additional elements that amount to no more than insignificant extra-solution activity. See MPEP 2106.05(g).
Therefore, the additional elements do not integrate the abstract ideas into a practical application.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract ideas into a practical application, all of the additional elements are “mere instructions to apply” or “insignificant extra-solution activity”. Specifically, the acquiring and deploying limitations recite the well-understood, routine, and conventional activity of receiving and transmitting data over a network. MPEP 2106.05(d)(II); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network). Mere instructions to apply and insignificant extra-solution activity cannot provide an inventive concept. The claim is not patent eligible.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-4, 6-7, 9, 11-13, 17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Nunes Coelho, Jr. et al. (U.S. Patent Publication No. 2023/0229895) (“Nunes Coelho, Jr.”) in view of Tanizawa et al. (U.S. Patent Publication No. 2021/0073641) (“Tanizawa”).
Regarding claim 1, Nunes Coelho, Jr. teaches a system, comprising: circuitry configured to: acquire information associated with one or more functional components of an electronic device (Nunes Coelho, Jr. [0051] “In calculation of a score that accounts for both performance and energy cost (e.g., of a model or of a layer within the model), the energy cost can be measured, predicted, or estimated, as needed. For example, in some examples, the reference energy cost, candidate energy cost, or both are measured when executing and/or training the respective models and/or layers on a target device” Nunes Coelho, Jr. provides measuring performance and energy cost in a target device, corresponding to acquire information associated with one or more functional components of an electronic device.) …acquire usage data associated with the electronic device (Nunes Coelho, Jr. [0076] “The example system 600 can include a server computing system 602, a network search computing system 620, and a performance evaluation computing system 640 that are communicatively coupled over a network 660. In some examples, the system 600 may include a user computing device 670.”; [0087] “In some implementations, if a user has provided consent, the training examples can be provided by a user computing device 670 (e.g., based on communications previously provided by the user of the user computing device 670). Thus, in such implementations, model trainer 650 can train using user-specific communication data received from the user computing device 670. In some instances, this process can be referred to as personalizing the model being trained.” Nunes Coelho, Jr. provides acquiring user activity data associated with a user computing device 670 corresponding to acquiring usage data associated with the electronic device); select, based on the acquired information and the acquired usage data, a computer vision task (Nunes Coelho, Jr. 
[0064] “In some embodiments, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using the actual task (e.g., the “real task”) for which the reference neural network model 102 is being optimized or designed... For instance, evaluating the performance characteristics using the proxy task may include using a smaller training and/or verification data set than the real task (e.g., down-sampled versions of images and/or other data) and/or evaluating the real task for fewer epochs than would generally be used to train the model using the real task.” Nunes Coelho, Jr. provides evaluating performance characteristics based on a selected/actual task and using image data as training data based on acquired/usage information, corresponding to select a computer vision task, based on the acquired information and the acquired usage data.); determine a set of constraints associated with an implementation of the selected computer vision task on the electronic device (Nunes Coelho, Jr. [0065] “In some implementations, real-world energy costs can be directly measured by executing the model on a particular platform (e.g., a mobile device such as the Google Pixel device). In further implementations, various other performance characteristics can be included in a multi-objective function that guides the search process, including, as examples, power consumption, user interface responsiveness, peak compute requirements, and/or other characteristics of the generated network models.” Nunes Coelho, Jr. 
provides determining various performance characteristics relating to generated models to implement a computer vision task on an electronic device including power consumption and peak compute requirements corresponding to determine a set of constraints associated with an implementation of the selected computer vision task on the electronic device.); select a first neural network as a seed model for the selected computer vision task (Nunes Coelho, Jr. [0020] “In some embodiments, systems and methods of the present disclosure may produce an optimized neural network model by optimizing the existing architecture of a provided reference neural network model.”; [0059] “In some implementations, the performance evaluation subsystem 108 may evaluate the performance of candidate models 106 using pre-trained model values inherited from the reference neural network model 102, subject to the modifications that may have been applied by the controller model 104 (e.g., quantization). In this manner, the performance evaluation subsystem 108 may quickly evaluate the candidate models 106 for comparison to the reference model 102.”; [0060] “The trained models 204 may be optionally trained using inherited trained values from the reference model 102 as seed values or using inherited trained values directly, or both.”; [0069] “At 502, a computing system can receive a reference neural network model. The reference neural network model may be received in any suitable manner, such as via transmission to or within the computing system, such as from local or remote storage or via networked communications channels.” Nunes Coelho, Jr. provides receiving a reference neural network, which is used to evaluate performance characteristics and seed values, corresponding to select a first neural network as a seed model for the selected computer vision task.); execute, based on the seed model and the determined set of constraints, one or more operations (Nunes Coelho, Jr. 
[0060] “The trained models 204 may be optionally trained using inherited trained values from the reference model 102 as seed values or using inherited trained values directly, or both.”; [0061] “For example, one or more performance characteristics 210 of the trained candidate model(s) 204 may include a validation accuracy and/or an energy cost associated with the training and/or the execution of the one or more trained candidate model(s) 204 on the real-world device(s) 208.” Nunes Coelho, Jr. provides executing operations on a real-world device based on a seed model and energy cost corresponding to execute, based on the seed model and the determined set of constraints, one or more operations); obtain a second neural network based on the execution of the one or more operations (Nunes Coelho, Jr. [0020] “Generally, the present disclosure is directed to systems and methods for performing a neural architecture search to produce a neural network model architecture that provides an improved tradeoff between performance and energy consumption.”; [0057] “For example, the controller model 104 may search a search space comprising a first searchable subspace corresponding to a quantization scheme for quantizing one or more values within a layer of the reference neural network model 102 and a second searchable subspace corresponding to a size of the layer (e.g., the number of filters within the layer and/or number of output units). Based on values selected from the searchable subspaces, the controller model 104 may generate one or more candidate models 106 for evaluation by a performance evaluation subsystem 108.”; [0064] “In some embodiments, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using the actual task (e.g., the “real task”) for which the reference neural network model 102 is being optimized or designed. 
For instance, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using a set of training data that will be used to train the resulting model that includes the optimized neural network model” Nunes Coelho, Jr. provides executing a neural architecture search using a reference neural network and corresponding seed values to obtain an optimized neural network for implementing computer vision tasks on a real-world user device, corresponding to obtain a second neural network based on the execution of the one or more operations.), wherein the obtained second neural network is trained on the selected computer vision task, and the one or more operations include a neural architecture search (Nunes Coelho, Jr. [0020] “Generally, the present disclosure is directed to systems and methods for performing a neural architecture search to produce a neural network model architecture that provides an improved tradeoff between performance and energy consumption.”; [0064] “In some embodiments, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using the actual task (e.g., the “real task”) for which the reference neural network model 102 is being optimized or designed. For instance, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using a set of training data that will be used to train the resulting model that includes the optimized neural network model.” Nunes Coelho, Jr. provides training an optimized model using tasks, where the operations include a neural architecture search, corresponding to the obtained second neural network is trained on the selected computer vision task and the one or more operations include a neural architecture search.); deploy the obtained second neural network on a cloud server (Nunes Coelho, Jr.
[0079] “For example, the one or more neural network models 612 can include a reference neural network model to be optimized according to the present disclosure. The neural network models 612 can be uploaded to the server computing system 602 for storage thereon, and in some embodiments, the server computing system 602 hosts or otherwise operates the one or more neural network models 612 in an application. In some implementations, the systems and methods can be provided as a cloud-based service (e.g., by the server computing system 602).” Nunes Coelho, Jr. provides deploying an optimized neural network to a cloud server corresponding to deploying an obtained second neural network on a cloud server.); implement an Application Programming Interface (API) call functionality on the electronic device, wherein the API call functionality includes an API call code to remotely call the deployed second neural network (Nunes Coelho, Jr. [0099] “Example applications include a text messaging application, an email application, a dictation application, a virtual keyboard application, a browser application, etc. In some implementations, each application can communicate with the central intelligence layer (and model(s) stored therein) using an API (e.g., a common API across all applications).” Nunes Coelho, Jr. provides an API for models corresponding to the API call functionality includes an API call code to remotely call the deployed second neural network.); and update, based on the implementation of the API call functionality, a software application on the electronic device to include an end-user feature, wherein the end-user feature implements, on the electronic device, the deployed second neural network for the selected computer vision task (Nunes Coelho, Jr. [0099] “The computing device 800 includes a number of applications (e.g., applications 1 through N). Each application is in communication with a central intelligence layer.
Example applications include a text messaging application, an email application, a dictation application, a virtual keyboard application, a browser application, etc. In some implementations, each application can communicate with the central intelligence layer (and model(s) stored therein) using an API (e.g., a common API across all applications).”; [0102] “As one example, the systems and methods of the present disclosure can be included or otherwise employed within the context of an application, a browser plug-in, or in other contexts. Thus, in some implementations, the models of the present disclosure can be included in or otherwise stored and implemented by a user computing device such as a laptop, tablet, or smartphone. As yet another example, the models can be included in or otherwise stored and implemented by a server computing device that communicates with the user computing device according to a client-server relationship. For example, the models can be implemented by the server computing device as a portion of a web service (e.g., a web email service)” Nunes Coelho, Jr. provides updating software applications on a user device with the optimized neural network including end-user features such as web services including use of an API to call deployed models, corresponding to update a software application on the electronic device to include an end-user feature, wherein the end-user feature implements, on the electronic device, the obtained second neural network for the selected computer vision task.).
Nunes Coelho, Jr. fails to explicitly teach wherein the acquired information comprises hardware specification information that indicates hardware resources available at the electronic device.
However, Tanizawa teaches acquire information associated with one or more functional components of an electronic device, wherein the acquired information comprises hardware specification information that indicates hardware resources available at the electronic device (Tanizawa [0048] “The target constraint condition 30 is information indicating a constraint condition configured to operate the neural network model 32 with target hardware.”; [0050] “In other words, the target constraint condition 30 is an index determined by specifications of the target hardware.”; [0051] “FIG. 4 is a diagram illustrating an example of the target constraint condition 30. The target constraint condition 30 is stored in association with a hardware ID that is identification information of the target hardware. FIG. 4 illustrates an example in which the target constraint condition 30 includes the hardware ID. The hardware ID is the identification information of the target hardware. The target constraint condition 30 includes, for example, at least one item of a model size, a model calculation amount, latency, power consumption, an inference speed, memory usage, a memory size of a neural network model, and a memory bandwidth, and a target value corresponding to each item.” Tanizawa provides acquiring hardware specification information of a target device which is to run a neural network, including for example, at least one item of a model size, a model calculation amount, latency, power consumption, an inference speed, memory usage, a memory size of a neural network model, and a memory bandwidth, corresponding to acquiring hardware specification information that indicates hardware resources available at the electronic device.).
Nunes Coelho, Jr. and Tanizawa are both considered to be analogous to the claimed invention because they are in the same field of artificial intelligence and specifically applied to running a neural network on target hardware/devices. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. with the above teachings of Tanizawa. Doing so would prevent a neural network deployed on target hardware from bottlenecking that hardware (Tanizawa [0157] “In addition, when the third learned model structure 27C obtained by contracting the second learned model structure 27B is used, the morphing unit 18 can contract the neural network model 33 of the third learned model structure 27C to the minimum of the specifications of the target hardware while maintaining the generality such that the specific convolution processing block 26B does not become a bottleneck.”).
Regarding claim 2, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 1, as discussed above in the rejection of claim 1, wherein the electronic device is an image-capture device and the software application is an imaging software installed on the electronic device (Nunes Coelho, Jr. [0088] “The user computing device 670 can be any type of computing device, such as, for example, a personal computing device (e.g., laptop or desktop), a mobile computing device (e.g., smartphone or tablet), a gaming console or controller, a wearable computing device, an embedded computing device, or any other type of computing device.” [0064] “For instance, evaluating the performance characteristics using the proxy task may include using a smaller training and/or verification data set than the real task (e.g., down-sampled versions of images and/or other data) and/or evaluating the real task for fewer epochs than would generally be used to train the model using the real task.” [0102] “For example, the models can be implemented by the server computing device as a portion of a web service (e.g., a web email service).” Nunes Coelho, Jr. provides a smartphone corresponding to an image-capture device and training an optimized model with images and implementing the model in user web services, corresponding to an imaging software installed on the electronic device.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 1.
Regarding claim 3, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 1, as discussed above in the rejection of claim 1, wherein the circuitry is further configured to determine the set of constraints based on the hardware specification information, and the determined set of constraints includes one or more hardware-specific constraints (Tanizawa [0048] “The target constraint condition 30 is information indicating a constraint condition configured to operate the neural network model 32 with target hardware.”; [0050] “In other words, the target constraint condition 30 is an index determined by specifications of the target hardware.”; [0051] “FIG. 4 is a diagram illustrating an example of the target constraint condition 30. The target constraint condition 30 is stored in association with a hardware ID that is identification information of the target hardware. FIG. 4 illustrates an example in which the target constraint condition 30 includes the hardware ID. The hardware ID is the identification information of the target hardware. The target constraint condition 30 includes, for example, at least one item of a model size, a model calculation amount, latency, power consumption, an inference speed, memory usage, a memory size of a neural network model, and a memory bandwidth, and a target value corresponding to each item.” Tanizawa provides target constraint condition 30, which determines the set of constraints based on the hardware specification information, and the determined set of constraints includes one or more hardware-specific constraints.).
Nunes Coelho, Jr. and Tanizawa are both considered to be analogous to the claimed invention because they are in the same field of artificial intelligence and specifically applied to running a neural network on target hardware/devices. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. with the above teachings of Tanizawa. Doing so would prevent a neural network deployed on target hardware from bottlenecking that hardware (Tanizawa [0157] “In addition, when the third learned model structure 27C obtained by contracting the second learned model structure 27B is used, the morphing unit 18 can contract the neural network model 33 of the third learned model structure 27C to the minimum of the specifications of the target hardware while maintaining the generality such that the specific convolution processing block 26B does not become a bottleneck.”).
Regarding claim 4, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 1, as discussed above in the rejection of claim 1, wherein the acquired information includes cost information associated with the one or more functional components, the circuitry is further configured to determine the set of constraints based on the cost information, and the determined set of constraints includes one or more cost constraints (Nunes Coelho, Jr. [0061] “The trainer 202 may directly evaluate one or more performance characteristics of the trained candidate model(s) 204 directly. For example, one or more performance characteristics 206 of the trained candidate model(s) 204 may include a validation accuracy and/or an energy cost associated with the training and/or the execution of the one or more trained candidate model(s) 204. For example, the energy cost can be directly computed using one or more look up tables or formulas which directly translate from model characteristics (e.g., number/types of operations and quantization scheme) to an energy cost value.” Nunes Coelho, Jr. provides determining cost information associated with an electronic device and generating models from that information, corresponding to the acquired information includes cost information associated with the one or more functional components, the circuitry is further configured to determine the set of constraints based on the cost information, and the determined set of constraints includes one or more cost constraints.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 1.
Regarding claim 6, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 1, as discussed above in the rejection of claim 1, wherein the execution of the one or more operations comprises: a determination of a search space that includes a collection of different types of layers (Nunes Coelho, Jr. [0025] “In some embodiments, a network search space is constructed to correspond to the architecture (e.g., the arrangement, configuration, and/or number of layers) of a given neural network model, permitting efficient search for optimizing the neural network structure without limitation to the complexity of the neural network model.” Nunes Coelho, Jr. provides a network search space constructed to correspond to a configuration and/or number of layers, corresponding to a determination of a search space that includes a collection of different types of layers.); and the execution of the neural architecture search within the search space to: generate a candidate neural network based on a modification of an architecture of the seed model (Nunes Coelho, Jr. [0052] “For instance, a controller model may be employed to generate a candidate neural network model by modifying a reference neural network model according to one or more values selected from the network search space, such as from a first searchable subspace corresponding to a quantization scheme for quantizing one or more values within the reference model and from a second searchable subspace corresponding to a number of filters contained in a layer within the reference model.”; [0060] “In some implementations, the example system 100 may be configured as shown in FIG. 2. The performance evaluation subsystem 108 may comprise a trainer 202 which trains the one or more candidate models 106 to produce one or more trained candidate models 204. 
The trained models 204 may be optionally trained using inherited trained values from the reference model 102 as seed values or using inherited trained values directly, or both. The trained models 204 may also be trained from scratch.” Nunes Coelho, Jr. provides generating a candidate neural network based on a reference neural network and training them using seed values from the reference neural network, corresponding to generate a candidate neural network based on a modification of an architecture of the seed model.); configure hyperparameters of the generated candidate neural network based on the determined set of constraints (Nunes Coelho, Jr. [0067] “For instance, a performance evaluation subsystem 108 may comprise a system which is desired to be optimized for energy cost and/or performance on execution, but is already optimized in other aspects, including hyperparameters governing aspects of the network architecture. By preserving the configuration of the architecture of the reference neural network model 102, subject to the modifications by the controller model 104, the systems and methods according to the present disclosure can retain any advantages of prior investment in optimizing the hyperparameters governing the network's architecture.” Nunes Coelho, Jr. provides configuring hyperparameters for an optimized neural network in accordance with optimization parameters corresponding to configure hyperparameters of the candidate neural network based on the determined set of constraints.); select a training dataset for the selected computer vision task (Nunes Coelho, Jr. [0060] “The performance evaluation subsystem 108 may comprise a trainer 202 which trains the one or more candidate models 106 to produce one or more trained candidate models 204. The trained models 204 may be optionally trained using inherited trained values from the reference model 102 as seed values or using inherited trained values directly, or both. 
The trained models 204 may also be trained from scratch.”; [0064] “For instance, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using a set of training data that will be used to train the resulting model that includes the optimized neural network model.”; [0067] “In some embodiments, the performance evaluation subsystem 108 may comprise training data for training the candidate models 106, advantageously avoiding the transmission of training data between the controller model 104 and the trainer 202.” Nunes Coelho, Jr. provides selecting a set of training data to optimize the neural network, corresponding to selecting a training dataset for the selected computer vision task.); and train, based on the selected training dataset, the generated candidate neural network on the selected computer vision task (Nunes Coelho, Jr. [0060] “The performance evaluation subsystem 108 may comprise a trainer 202 which trains the one or more candidate models 106 to produce one or more trained candidate models 204.”; [0064] “For instance, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using a set of training data that will be used to train the resulting model that includes the optimized neural network model.” Nunes Coelho, Jr. provides training a candidate neural network with a selected set of training data, corresponding to training, based on the selected training dataset, the generated candidate neural network on the selected computer vision task.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 1.
Regarding claim 7, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 6, as discussed above in the rejection of claim 6, wherein the circuitry is further configured to execute a quantization-aware training process to train the generated candidate neural network (Nunes Coelho, Jr. [0023] “In some cases, reducing the precision (e.g., bitwidth) of values within a given neural network may be accomplished by quantization, which includes methods of mapping higher precision numbers into bins corresponding to lower precision numbers.”; [0061] “For example, one or more performance characteristics 206 of the trained candidate model(s) 204 may include a validation accuracy and/or an energy cost associated with the training and/or the execution of the one or more trained candidate model(s) 204. For example, the energy cost can be directly computed using one or more look up tables or formulas which directly translate from model characteristics (e.g., number/types of operations and quantization scheme) to an energy cost value.”; Nunes Coelho, Jr. provides implementing quantization schemes to train a candidate neural network corresponding to executing a quantization-aware training process to train the candidate neural network.), and the quantization-aware training process includes quantization of weight parameters of the generated candidate neural network from a current bit-depth representation to a first bit-depth representation (Nunes Coelho, Jr. 
[0022] “High precision numbers require more bits for representation within the computing system, and this increased bitwidth (which can also be referred to in some instances as bit depth) is associated with several energy costs, including increased storage costs, retrieval costs, and calculation costs.”; [0023] “In some cases, reducing the precision (e.g., bitwidth) of values within a given neural network may be accomplished by quantization, which includes methods of mapping higher precision numbers into bins corresponding to lower precision numbers.”; [0031] “In some examples, systems and methods according to the present disclosure reduce the energy consumption by a model by quantizing one or more values or sets of values (e.g., the inputs, weights, filters, and/or biases for a layer) in view of both the quantity of bits for the values as well as the cost of the necessary types of operations to be applied to the values.” Nunes Coelho, Jr. provides quantizing weights and reducing bit width (also known as bit depth), corresponding to quantize weight parameters of the candidate neural network from a current bit-depth representation to a first bit-depth representation.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 6.
Regarding claim 9, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 6, as discussed above in the rejection of claim 6, wherein the execution of the one or more operations further comprises execution of a post-training quantization operation on weight parameters of the trained candidate neural network, and the circuitry is further configured to obtain the second neural network from the trained candidate neural network, further based on the execution of the post-training quantization operation (Nunes Coelho, Jr. [0070] “The first searchable subspace corresponds to a quantization scheme for quantizing one or more values of the candidate neural network model, and the second searchable subspace corresponds to a size of a layer (e.g., the quantity of filters and/or output units contained in the layer) of the candidate neural network model.”; [0083] “The model trainer 650 can perform a number of generalization techniques (e.g., weight decays, dropouts, etc.) to improve the generalization capability of the models being trained.”; [0109] “For example, the above example chooses a quantizer for trainable parameters within a layer (e.g., weights, filters, and/or biases), and the quantizer may be the same or different for one or more of the layers and/or one or more of the parameters within the layer.” Nunes Coelho, Jr. provides quantizing weight parameters of a candidate neural network after training, corresponding to execution of a post-training quantization operation on weight parameters of the trained candidate neural network, and the circuitry is further configured to obtain the second neural network from the trained candidate neural network, further based on the execution of the post-training quantization operation.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 6.
Regarding claim 11, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 6, as discussed above in the rejection of claim 6, wherein the circuitry is further configured to: train the generated candidate neural network further based on the determined set of constraints (Nunes Coelho, Jr. [0060] “The performance evaluation subsystem 108 may comprise a trainer 202 which trains the one or more candidate models 106 to produce one or more trained candidate models 204.”; [0064] “For instance, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using a set of training data that will be used to train the resulting model that includes the optimized neural network model.”; [0064] “In some embodiments, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using the actual task (e.g., the “real task”) for which the reference neural network model 102 is being optimized or designed. For instance, the one or more performance characteristic(s) 206 and/or the one or more performance characteristics 210 may be evaluated using a set of training data that will be used to train the resulting model that includes the optimized neural network model.”; [0065] “In some implementations, real-world energy costs can be directly measured by executing the model on a particular platform (e.g., a mobile device such as the Google Pixel device). In further implementations, various other performance characteristics can be included in a multi-objective function that guides the search process, including, as examples, power consumption, user interface responsiveness, peak compute requirements, and/or other characteristics of the generated network models.” Nunes Coelho, Jr. 
provides training a generated candidate neural network further based on a determined set of constraints); and evaluate, based on the determined set of constraints, one or more performance indicators of the trained candidate neural network (Nunes Coelho, Jr. [0061] “The trainer 202 may directly evaluate one or more performance characteristics of the trained candidate model(s) 204 directly.”; [0066] “In some embodiments, the system 100 may evaluate candidate models 106 in a constraint evaluation module 402, as shown in FIG. 4. A constraint evaluation module 402 may be included in the controller model 104 in some examples, and additionally, or alternatively, may be included in the performance evaluation subsystem 108 in some examples.” Nunes Coelho, Jr. provides evaluating performance of trained candidate neural networks, corresponding to evaluating, based on the determined set of constraints, one or more performance indicators of the trained candidate neural network.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 6.
Regarding claim 12, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 11, as discussed above in the rejection of claim 11, wherein the circuitry is further configured to re-execute the neural architecture search based on the evaluated one or more performance indicators being below a threshold (Nunes Coelho, Jr. [0066] “The constraint evaluation module 402 may evaluate threshold determinations regarding the candidate models 106 (e.g., dimensionality and/or other compatibility concerns, etc.) and return constraint feedback 404 to the controller model 104 prior to engaging in a computationally expensive training in the trainer 202. In this manner, threshold determinations regarding performance may be performed and prior to passing the candidate models 106 to the next stage.”; [0057] “The controller model 104 may then search a network search space corresponding to the neural network architecture of the reference neural network model 102.” Nunes Coelho, Jr. provides a threshold for performance and providing feedback to controller model 104, which executes the neural architecture search, corresponding to the neural architecture search being re-executed based on a determination that the evaluated one or more performance indicators are below a threshold.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 11.
Regarding claim 13, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 11, as discussed above in the rejection of claim 11, wherein the circuitry is further configured to obtain the second neural network further based on the trained candidate neural network and the evaluated one or more performance indicators being above a threshold (Nunes Coelho, Jr. [0066] “The constraint evaluation module 402 may evaluate threshold determinations regarding the candidate models 106 (e.g., dimensionality and/or other compatibility concerns, etc.) and return constraint feedback 404 to the controller model 104 prior to engaging in a computationally expensive training in the trainer 202. In this manner, threshold determinations regarding performance may be performed and prior to passing the candidate models 106 to the next stage.” Nunes Coelho, Jr. provides a performance threshold, wherein candidate models are passed to the next stage when above a performance threshold, corresponding to obtaining the second neural network further based on the trained candidate neural network and the evaluated one or more performance indicators being above a threshold.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 11.
Regarding claim 17, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 1, as discussed above in the rejection of claim 1, wherein the update of the software application comprises: a replacement of an existing neural network model on the electronic device with the deployed second neural network (Nunes Coelho, Jr. [0020] “Generally, the present disclosure is directed to systems and methods for performing a neural architecture search to produce a neural network model architecture that provides an improved tradeoff between performance and energy consumption. In some embodiments, systems and methods of the present disclosure may produce an optimized neural network model by optimizing the existing architecture of a provided reference neural network model” Nunes Coelho, Jr. provides replacing a reference neural network from an electronic device with an optimized neural network, corresponding to a replacement of an existing neural network model on the electronic device with the second neural network); an installation of the deployed second neural network as a component of the software application on the electronic device (Nunes Coelho, Jr. [0092] “In some examples, the neural network model(s) 680 are trained and/or pre-trained by the performance evaluation computing system 640 prior to loading onto the user computing device 670. The user computing device 670 may then execute and/or apply the neural networks 680 to evaluate one or more performance metrics, such as accuracy and/or an energy cost metric. For example, the user computing device may measure a real-world energy cost associated with applying the trained neural network model(s) 680 received from the performance evaluation computing system 640.” Nunes Coelho, Jr.
provides replacing a reference neural network from an electronic device with an optimized neural network, corresponding to an installation of the second neural network as a component of the software application on the electronic device.), and an update of weight parameters of the existing neural network model based on weight parameters of the deployed second neural network (Nunes Coelho, Jr. [0109] “For example, the above example chooses a quantizer for trainable parameters within a layer (e.g., weights, filters, and/or biases), and the quantizer may be the same or different for one or more of the layers and/or one or more of the parameters within the layer.” Nunes Coelho, Jr. provides quantizing weight parameters for implementing an optimized neural network on an electronic device, corresponding to an update of parameters, including weight parameters of an existing neural network on the electronic device with those of the second neural network.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 1.
Regarding claim 19, it is the method embodiment of claim 1 with similar limitations to claim 1 and is rejected using the same reasoning disclosed above in the rejection of claim 1.
Regarding claim 20, it is the non-transitory computer-readable medium having stored thereon, computer-executable instructions that when executed by a computer in a system, causes the system to execute operations embodiment of claim 1 with similar limitations to claim 1 and is rejected using the same reasoning disclosed above in the rejection of claim 1. Further, Nunes Coelho, Jr. teaches a non-transitory computer-readable medium having stored thereon, computer-executable instructions that when executed by a computer in a system, causes the system to execute operations (Nunes Coelho, Jr. [0077] “The memory 606 can include one or more non-transitory computer-readable storage mediums, such as RAM, SRAM, DRAM, ROM, EEPROM, EPROM, flash memory devices, magnetic disks, etc., and combinations thereof. The memory 606 can store data 608 and instructions 610 which are executed by the processor 604 to cause the server computing system 602 to perform operations.” Nunes Coelho, Jr. provides a non-transitory computer-readable medium having stored thereon, computer-executable instructions that when executed by a computer in a system, causes the system to execute operations.).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa for the same reasons disclosed above in the rejection of claim 1.
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Nunes Coelho, Jr. et al. (U.S. Patent Publication No. 2023/0229895) (“Nunes Coelho, Jr.”) in view of Tanizawa et al. (U.S. Patent Publication No. 2021/0073641) (“Tanizawa”) in further view of ELDEEB et al. (U.S. Patent Publication No. 2020/0380389) (“Eldeeb”).
Regarding claim 5, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 1 as discussed above in the rejection of claim 1, but fails to teach wherein the acquired usage data comprises: a digital footprint on the software application; a set of category tags associated with image-based content created through the software application; a user preference for the image-based content on the electronic device; and a usage pattern of a plurality of existing functionalities on the electronic device, wherein the plurality of existing functionalities implements, on the electronic device, a type of neural network for one or more computer vision tasks and the one or more computer vision tasks include the selected computer vision task.
However, Eldeeb teaches wherein the acquired usage data comprises: a digital footprint on the software application (Eldeeb [0085] “User data and models 231 include various data associated with the user (e.g., user-specific vocabulary data, user preference data, user-specified name pronunciations, data from the user's electronic address book, to-do lists, shopping lists, etc.) to provide the client-side functionalities of the digital assistant.” Eldeeb provides user online activity on a web browser, corresponding to the acquired usage data comprising a digital footprint on the software application.); a set of category tags associated with image-based content created through the software application (Eldeeb [0114] “In conjunction with image management module 244, e-mail client module 240 makes it very easy to create and send e-mails with still or video images taken with camera module 243.”; [0118] “In conjunction with touch screen 212, display controller 256, contact/motion module 230, graphics module 232, text input module 234, and camera module 243, image management module 244 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.” Eldeeb provides images uploaded to an email service taken from a camera, which are categorized by image management module 244 including labeling, corresponding to a set of category tags associated with image-based content created through the software application.); a user preference for the image-based content on the electronic device (Eldeeb [0085] “User data and models 231 include various data associated with the user (e.g., user-specific vocabulary data, user preference data, user-specified name pronunciations, data from the user's electronic address book, to-do lists, shopping lists, etc.)
to provide the client-side functionalities of the digital assistant.” Eldeeb provides user preference data corresponding to a user preference for the image-based content on the electronic device.); and a usage pattern of a plurality of existing functionalities on the electronic device (Eldeeb [0085] “Further, user data and models 231 include various models (e.g., speech recognition models, statistical language models, natural language processing models, ontology, task flow models, service models, etc.) for processing user input and determining user intent.” Eldeeb provides model task flows included in user and model data corresponding to a usage pattern of existing functionalities that implement a type of neural network for one or more computer vision tasks.), wherein the plurality of existing functionalities implements, on the electronic device, a type of neural network for one or more computer vision tasks, and the one or more computer vision tasks include the selected computer vision task (Eldeeb [0276] “A machine learning model includes one or more algorithms, mathematical models, statistical models, and/or neural network models. A machine learning model can perform a specific task without using explicit instructions. To perform a specific task (e.g., make a prediction or decision) without explicit instructions, a machine learning model can be pre-trained using training data. After training is performed, first machine learning model 1033 receives vectors representing the tokens generated from the data items. First machine learning model 1033 processes the vectors to predict sentiment of the data items represented by the tokens.” Eldeeb provides implementing a type of neural network for a computer vision task.).
Nunes Coelho, Jr., Tanizawa, and Eldeeb are all considered to be analogous to the claimed invention because they are in the same field of artificial intelligence and, more specifically, user-specific neural networks. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa with the above teachings of Eldeeb. Doing so would allow for customized suggestions based on a user’s digital footprint (Eldeeb [0034] “To provide customized suggestions, impressions are collected from a plurality of data sources. The impressions include data that reflect user activities.”).
Claims 8 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Nunes Coelho, Jr. et al. (U.S. Patent Publication No. 2023/0229895) (“Nunes Coelho, Jr.”) in view of Tanizawa et al. (U.S. Patent Publication No. 2021/0073641) (“Tanizawa”) in further view of Ravi et al. (U.S. Patent Publication No. 2020/0125956) (“Ravi”).
Regarding claim 8, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 6 as discussed above in the rejection of claim 6, but fails to teach wherein the execution of the one or more operations further comprises execution of a pruning operation on weight parameters of the trained candidate neural network, and circuitry is further configured to obtain the second neural network from the trained candidate neural network, further based on the execution of the pruning operation.
However, Ravi teaches wherein the execution of the one or more operations further comprises execution of a pruning operation on weight parameters of the trained candidate neural network (Ravi [0089] “The machine learning manager 122 can provide a number of machine learning services such as, for example, a model training service and/or a training data management service. The machine learning manager 122 can include and use a machine learning library to train models.”; [0099] “The model compression service and/or model conversion service can enable the developer to compress and/or convert the models to optimize the models for use by a mobile device or in the mobile environment. For example, compressing the model can include performing quantization (e.g., scalar quantization, vector quantization weight sharing, product quantization, etc.), pruning (e.g., pruning by values, L1 regularization, etc.), low rank representation (e.g., circulatent matrix, Kronecker structures, SVD decompositions, etc.), distillation, and/or other compression techniques.”; [0100] “Pruning reduces model size by removing weights or operations from the model that are least useful for predictions, including, for example, low-scoring weights.” Ravi provides pruning weights of trained machine learning models corresponding to a pruning operation on weight parameters of the trained candidate neural network.), and circuitry is further configured to obtain the second neural network from the trained candidate neural network, further based on the execution of the pruning operation (Ravi [0099] “For example, compressing the model can include performing quantization (e.g., scalar quantization, vector quantization weight sharing, product quantization, etc.), pruning (e.g., pruning by values, L1 regularization, etc.), low rank representation (e.g., circulatent matrix, Kronecker structures, SVD decompositions, etc.), distillation, and/or other compression techniques.”; [0136] “The framework permits efficient 
distributed training but can be optimized to produce a neural network model with low memory footprint that can run on devices at low computation cost.” Ravi provides pruning weights of a neural network for optimization corresponding to obtain the second neural network from the trained candidate neural network, further based on the execution of the pruning operation.).
Nunes Coelho, Jr., Tanizawa, and Ravi are all considered to be analogous to the claimed invention because they are in the same field of artificial intelligence and, more specifically, user-specific neural networks. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa with the above teachings of Ravi. Doing so would reduce model complexity (Ravi [0151] “For example, LSTM RNN models typically apply pruning and use smaller, fixed-size vocabularies in the input encoding step to reduce model complexity.”).
Regarding claim 10, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 6 as discussed above in the rejection of claim 6, but fails to teach wherein the one or more operations further include a knowledge distillation operation, and the circuitry is further configured to: execute the knowledge distillation operation to: select a teacher neural network pre-trained on the selected computer vision task; and select the generated candidate neural network as a student network; produce, based on the selected training dataset, a plurality of inferences by the selected teacher neural network; and train the generated candidate neural network based on the produced plurality of inferences.
However, Ravi teaches wherein the one or more operations further include a knowledge distillation operation, and the circuitry is further configured to: execute the knowledge distillation operation to: select a teacher neural network pre-trained on the selected computer vision task (Ravi [0049] “The joint training enables the compact machine-learned model to learn from (and/or with) the trainer model, thereby improving the prediction accuracy of the compact machine-learned model. Thus, the joint training can follow a teacher-student joint training architecture.”; [0050] “Thus, in some implementations, the application development platform can include and implement a training pipeline to train a compact machine-learned model. The training pipeline can train the compact machine-learned model individually and/or jointly train the compact machine-learned model with a trainer model (e.g., pre-trained model). Thus, the trainer or teacher model can be fixed or can be jointly optimized with the student model.”; [0068] “These first party models can be general-use machine-learned models that provide high quality performance at commonly required tasks such as speech analysis (e.g., natural language processing, voice recognition, and/or the like), text analysis, image analysis (e.g., object detection, barcode/QR code reading, optical character recognition, and/or other tasks which may be categorized as “mobile vision”), and/or the like.” Ravi provides a pre-trained teacher model for implementing object detection corresponding to a knowledge distillation operation including selecting a teacher neural network which is pre-trained on the selected computer vision task.); and select the generated candidate neural network as a student network (Ravi [0134] “In some implementations, the two models can be trained jointly using backpropagation, where the student network learns from the teacher network similar to apprenticeship learning.
Once trained, the smaller network can be used directly for inference at low memory and computation cost.” Ravi provides a student neural network trained based on a teacher network to produce inferences corresponding to select the generated candidate neural network as a student network.); produce, based on the selected training dataset, a plurality of inferences by the selected teacher neural network (Ravi [0111] “Some of the joint training and distillation approaches provided by the present disclosure follow a teacher-student setup where the knowledge of the trainer model is utilized to learn an equivalent compact student model with minimal loss in accuracy. During training, the teacher or trainer model parameters can be held fixed (e.g., as in distillation) or jointly optimized to improve both models simultaneously.”; [0112] “So instead of providing a single compressed model, the machine learning manager 122 can generate multiple on-device models at different sizes and inference speeds and the developer can select the model that is best suited for their application needs (e.g., provides the most appropriate tradeoff between size and performance). Additionally, jointly training multiple compact models with shared parameters typically takes only slightly more time than training a single large model, but yields multiple compressed/compact models in a single shot that are smaller in size, faster, and have lower cost relative to the more complex model, while still providing good prediction accuracy.”; [0134] “In some implementations, the two models can be trained jointly using backpropagation, where the student network learns from the teacher network similar to apprenticeship learning. 
Once trained, the smaller network can be used directly for inference at low memory and computation cost.”; [0174] “Alternatively or additionally, the training data can be derived from large, public training datasets.” Ravi provides a student teacher network, wherein the teacher neural network produces a plurality of inferences based on public training datasets corresponding to produce, based on the selected training dataset, a plurality of inferences by the selected teacher neural network); and train the generated candidate neural network based on the produced plurality of inferences (Ravi [0134] “In some implementations, the two models can be trained jointly using backpropagation, where the student network learns from the teacher network similar to apprenticeship learning. Once trained, the smaller network can be used directly for inference at low memory and computation cost.” Ravi provides training the student neural network based on the produced plurality of inferences corresponding to train the generated candidate neural network based on the produced plurality of inferences).
Nunes Coelho, Jr., Tanizawa, and Ravi are all considered to be analogous to the claimed invention because they are in the same field of artificial intelligence and more specifically user-specific neural networks. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa with the above teachings of Ravi. Doing so would reduce model complexity (Ravi [0151] “For example, LSTM RNN models typically apply pruning and use smaller, fixed-size vocabularies in the input encoding step to reduce model complexity.”).
Claims 14-15 are rejected under 35 U.S.C. 103 as being unpatentable over Nunes Coelho, Jr. et al. (U.S. Patent Publication No. 2023/0229895) (“Nunes Coelho, Jr.”) in view of Tanizawa et al. (U.S. Patent Publication No. 2021/0073641) (“Tanizawa”) in further view of Klein (U.S. Patent Publication No. 2018/0211115) (“Klein”).
Regarding claim 14, Nunes Coelho, Jr. in view of Tanizawa teaches the system according to claim 1 as discussed above in the rejection of claim 1, but fails to teach wherein the circuitry is further configured to: control the electronic device to display a User Interface (UI) that includes one or more of a first option to purchase the end-user feature, a second option to subscribe to the end-user feature, a description that includes an accuracy of the obtained second neural network and device resource information associated with the end-user feature, or a price associated with each of the first option and the second option; receive, from the electronic device, a selection of one of the first option or the second option; and update the software application further based on the received selection.
However, Klein teaches wherein the circuitry is further configured to: control the electronic device to display a User Interface (UI) that includes one or more of a first option to purchase the end-user feature, a second option to subscribe to the end-user feature, a description that includes an accuracy of the obtained second neural network and device resource information associated with the end-user feature, or a price associated with each of the first option and the second option (Klein [0010] “In some embodiments, the MSP and the end user can access an application (i.e., via a user device such as a computer) that is configured to provide a front-end user interface (UI), which allows for viewing video clips or stream video, receive alerts and notifications, search for and purchase monitoring or security services from MSPs, conduct MSP onboarding and credentialing, manage MSP subscription, set preferences and settings, and customize user-specific security needs.”; [0021] “In various embodiments, the UGVs/UAVs 128, 110 are configured to analyze motion detection and object recognition/detection.”; [0028] “Similarly, the end user can also access a web and/or a mobile application 136, 138 to view video clips, stream videos, receive alerts and notifications, search for and purchase security and surveillance services from one or more monitoring service providers 130, conduct onboarding and credentialing for monitoring service providers, manage subscription for monitoring service providers, set user preferences and settings, and customize user-specific security needs.” Klein provides a user interface including options for purchasing and subscribing to computer vision tasks such as object recognition, corresponding to one or more of a first option to purchase the end-user feature and a second option to subscribe to the end-user feature.); and receive, from the electronic device, a selection of one of the first option or the second option; and update the software 
application further based on the received selection (Klein [0028] “Particularly, the monitoring service provider 130 can view clips or stream video and receive alerts and notifications via a web and/or a mobile application 136, 138 that is configured to provide a front-end UI. Similarly, the end user can also access a web and/or a mobile application 136, 138 to view video clips, stream videos, receive alerts and notifications, search for and purchase security and surveillance services from one or more monitoring service providers 130, conduct onboarding and credentialing for monitoring service providers, manage subscription for monitoring service providers, set user preferences and settings, and customize user-specific security needs.” Klein provides a user portal for managing account and subscription services, wherein the web and/or a mobile application corresponding to the software application is updated based on a received user selection of one of the first or second options to purchase or subscribe.).
Nunes Coelho, Jr., Tanizawa, and Klein are all considered to be analogous to the claimed invention because they are in the same field of artificial intelligence and more specifically user-specific neural networks. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa with the above teachings of Klein. Doing so would allow for user-specific computer vision and/or security needs (Klein [0010] “In some embodiments, the MSP and the end user can access an application (i.e., via a user device such as a computer) that is configured to provide a front-end user interface (UI), which allows for viewing video clips or stream video, receive alerts and notifications, search for and purchase monitoring or security services from MSPs, conduct MSP onboarding and credentialing, manage MSP subscription, set preferences and settings, and customize user-specific security needs”).
Regarding claim 15, Nunes Coelho, Jr. in view of Tanizawa in further view of Klein teaches the system according to claim 14 as discussed above in the rejection of claim 14, wherein the circuitry is further configured to control the electronic device to display the UI based on the acquired usage data (Klein [0028] “Similarly, the end user can also access a web and/or a mobile application 136, 138 to view video clips, stream videos, receive alerts and notifications, search for and purchase security and surveillance services from one or more monitoring service providers 130, conduct onboarding and credentialing for monitoring service providers, manage subscription for monitoring service providers, set user preferences and settings, and customize user-specific security needs.” Klein provides a user interface including options for purchasing and subscribing to user specific computer vision tasks corresponding to the first option and the second option are included in the UI, based on the acquired usage data.).
Nunes Coelho, Jr., Tanizawa, and Klein are all considered to be analogous to the claimed invention because they are in the same field of artificial intelligence and more specifically user-specific neural networks. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa with the above teachings of Klein. Doing so would allow for user-specific computer vision and/or security needs (Klein [0010] “In some embodiments, the MSP and the end user can access an application (i.e., via a user device such as a computer) that is configured to provide a front-end user interface (UI), which allows for viewing video clips or stream video, receive alerts and notifications, search for and purchase monitoring or security services from MSPs, conduct MSP onboarding and credentialing, manage MSP subscription, set preferences and settings, and customize user-specific security needs”).
Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Nunes Coelho, Jr. et al. (U.S. Patent Publication No. 2023/0229895) (“Nunes Coelho, Jr.”) in view of Tanizawa et al. (U.S. Patent Publication No. 2021/0073641) (“Tanizawa”) in further view of Klein (U.S. Patent Publication No. 2018/0211115) (“Klein”) in further view of Silva et al. (U.S. Patent Publication 2020/0265487) (“Silva”).
Regarding claim 16, Nunes Coelho, Jr. in view of Tanizawa in further view of Klein teaches the system according to claim 14 as discussed above in the rejection of claim 14, but fails to teach wherein the circuitry is further configured to determine the price based on one or more of a cost of one of the electronic device or a functional component of the one or more functional components of the electronic device, a total time, that includes a training time to obtain the second neural network from the seed model, a complexity of the end-user feature, a cost of dataset associated with the obtained second neural network, competitive or business intelligence data associated with a plurality of users of the electronic device, or an estimate-demand for the end-user feature.
However, Silva teaches wherein the circuitry is further configured to determine the price based on one or more of a cost of one of the electronic device or a functional component of the one or more functional components of the electronic device, a total time, that includes a training time to obtain the second neural network from the seed model, a complexity of the end-user feature, a cost of dataset associated with the obtained second neural network, competitive or business intelligence data associated with a plurality of users of the electronic device, or an estimate-demand for the end-user feature (Silva [0082] “The final offer price can be determined further based on at least (1) a predicted resale value of the electronic device, (2) a predicted incoming volume of a model of the electronic device, or (3) a predicted processing cost of the electronic device.” Silva provides determining a price based on a cost of an electronic device.).
Nunes Coelho, Jr., Tanizawa, Klein, and Silva are all considered to be analogous to the claimed invention because they are in the same field of artificial intelligence and more specifically user-specific neural networks. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Nunes Coelho, Jr. in view of Tanizawa and Klein with the above teachings of Silva. Doing so would allow for an estimated price based on the cost of an electronic device (Silva [0042] “The display screen 104 can also provide an estimated price, or an estimated range of prices, that the kiosk 100 can offer the user for the mobile phone 150 based on the visual analysis, and/or based on user input (e.g., input regarding the type, condition, etc., of the phone 150).”).
Response to Arguments
Regarding the objection to the claims in the previous office action, Applicant’s amendments to the claims overcome the objection.
Regarding the rejection applied under 35 U.S.C. 101, Applicant first asserts that the claimed subject matter inextricably requires physical objects such as “circuitry” to deploy a neural network on a cloud server, to implement an API on an electronic device, and to update a software application on an electronic device (“Remarks”, Page 14).
However, simply reciting the use of “circuitry” does not integrate a judicial exception into a practical application, as discussed in MPEP 2106.05(f). For example, the claim recites a plurality of abstract ideas, such as “determine a set of constraints associated with an implementation of the selected computer vision task on the electronic device”, which can be performed as an evaluation and judgement, and is therefore a mental process. Further, implementing an API on an electronic device and updating a software application on an electronic device correspond to mere instructions to apply, as discussed in the 35 U.S.C. 101 rejection of claim 1 above. The limitation of deploying a neural network on a cloud server recites the well-understood, routine, and conventional activity of receiving and transmitting data over a network.
Applicant further asserts that any alleged abstract idea is integrated into a practical implementation. Specifically, Applicant asserts that the claimed embodiments may customize software applications with neural network features based on a determined set of constraints, thereby eliminating human intervention and additional hardware requirements to implement such features (“Remarks”, Page 15).
However, the MPEP states that “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer does not integrate a judicial exception into a practical application or provide an inventive concept.” MPEP 2106.05(f). As discussed above with respect to the integration of the abstract ideas into a practical application, the “determine a set of constraints associated with an implementation of the selected computer vision task on the electronic device” is a mental process (i.e., evaluation and judgement with assistance of pen and paper), and therefore, the efficiency gain produced therefrom, as written in the claims, does not integrate a judicial exception into a practical application or provide an inventive concept.
Applicant further asserts that a second optimum neural network may be obtained using a neural architecture search. Based on the obtained second neural network, the system may update a software application of the electronic device to include an end-user feature that implements the second neural network, and the update of the software application may enrich existing functionalities offered by the electronic device or may provide new functionalities, without a need to upgrade the electronic device (“Remarks”, Page 15). Applicant further asserts that the claimed features provide technological improvements in the field of computer vision, and that the specification describes a practical implementation and how the existing functionality of an electronic device that implements “computer vision software” is improved by updating software of the electronic device based on an API call to a neural network model deployed on a cloud server, without a need to upgrade the electronic device. Applicant therefore asserts the implementation of an API to call the neural network deployed on the cloud server is not “mere instructions to apply” (“Remarks”, Page 16).
However, implementing an API and updating a software application on an electronic device are both mere instructions to apply, as discussed above in the 35 U.S.C. 101 rejection of claim 1. First, the claim language fails to recite how a solution is accomplished and provides no mechanism for accomplishing a solution. The claim simply recites implementing an API and updating a software application therefrom to include an end-user feature. Next, the claim recites using the computer as a tool to perform existing processes in its ordinary capacity of updating software applications. Lastly, the claim has broad applicability across many fields and simply recites using a generic API and updating generic software therefrom. Therefore, in accordance with MPEP 2106.05(f)(1), MPEP 2106.05(f)(2), and MPEP 2106.05(f)(3), respectively, the interpretation of the limitations as mere instructions to apply an exception is proper and the additional elements do not integrate the abstract ideas into a practical application. Further, even if the claims did recite an improvement, it would be in the abstract idea of determining a set of constraints associated with an implementation of the selected computer vision task on the electronic device. As recited in the MPEP, an improvement in the abstract idea itself is not an improvement in technology. MPEP 2106.05(a).
As discussed above, the claims do not provide any improvement which amounts to integrating the abstract ideas into a practical application. For similar reasons, the claims do not recite an improvement which recites significantly more than the abstract idea. Therefore, the claims remain rejected under 35 U.S.C. 101.
Regarding the rejections applied under 35 U.S.C. 102 and 35 U.S.C. 103, Applicant’s arguments with respect to claims have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KURT NICHOLAS PRESSLY whose telephone number is (703)756-4639. The examiner can normally be reached M-F 8-4.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kamran Afshar can be reached at (571) 272-7796. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KURT NICHOLAS PRESSLY/Examiner, Art Unit 2125
/KAMRAN AFSHAR/Supervisory Patent Examiner, Art Unit 2125