Prosecution Insights
Last updated: April 19, 2026
Application No. 17/846,007

Network Space Search for Pareto-Efficient Spaces

Final Rejection: §101, §103, §112
Filed: Jun 22, 2022
Examiner: JONES, CHARLES JEFFREY
Art Unit: 2122
Tech Center: 2100 — Computer Architecture & Software
Assignee: MediaTek Inc.
OA Round: 2 (Final)
Grant Probability: 27% (At Risk)
OA Rounds: 3-4
To Grant: 4y 2m
With Interview: 93%

Examiner Intelligence

Career Allow Rate: 27% (4 granted / 15 resolved; -28.3% vs TC avg). Grants only 27% of cases.
Interview Lift: +65.9% across resolved cases with interview.
Avg Prosecution: 4y 2m (typical timeline).
Career History: 42 total applications across all art units; 27 currently pending.

Statute-Specific Performance

§101: 34.5% (-5.5% vs TC avg)
§103: 29.1% (-10.9% vs TC avg)
§102: 17.7% (-22.3% vs TC avg)
§112: 17.7% (-22.3% vs TC avg)
Tech Center averages are estimates • Based on career data from 15 resolved cases
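The headline figures above are simple ratios over the examiner's resolved cases. A small sketch reproduces how the career allow rate and the per-statute deltas are derived; the 40% per-statute Tech Center baseline is back-solved from the deltas quoted above (e.g. 34.5% + 5.5%), not an independently sourced number:

```python
# Reproduce the dashboard's headline examiner statistics from the raw
# counts quoted above (4 granted out of 15 resolved cases).

granted, resolved = 4, 15
career_allow_rate = granted / resolved  # 0.2667, displayed as 27%

# Per-statute overcome rates vs. the Tech Center average estimate.
# The 0.40 baseline is implied by the quoted deltas (assumption).
tc_avg = {"101": 0.40, "103": 0.40, "102": 0.40, "112": 0.40}
examiner = {"101": 0.345, "103": 0.291, "102": 0.177, "112": 0.177}
deltas = {s: round((examiner[s] - tc_avg[s]) * 100, 1) for s in examiner}
```

The same subtraction applied to the 27% career rate and its -28.3% delta implies an overall Tech Center average near 55%, a different baseline than the per-statute one, which is consistent with the two metrics measuring different things.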

Office Action

§101 §103 §112
DETAILED ACTION

This is the first action regarding application number 17/846,007 filed 11/11/2025. Claims 1-2, 7, 11-12 and 17 have been amended. Claims 1-20 have been examined and are pending.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 01/20/2026 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Benefit

Domestic benefit of 08/20/201 is acknowledged.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 7 and 17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claims 7 and 17 recite the limitation "wherein each network space is characterized by the first range of d values and the second range of w values." There is insufficient antecedent basis for this limitation in the claim, as there is no previous reference to a first range of d values and/or a second range of w values. As it is ambiguous what terms "the first range of d values" and "the second range of w values" are referencing, the Examiner will interpret this limitation as "wherein each network space is characterized by a first range of d values and a second range of w values" for the purposes of examination.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claims are directed towards an abstract idea.

Regarding Claim 1:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "partitioning an expanded search space into a plurality of…spaces with each…space including a plurality of…architectures, wherein the expanded search space is characterized by a first range of…depths and a second range of…widths…" and "partitioning the expanded search space further includes assigning each…space a sub-range of the first range and a sub-range of the second range," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass defining subsets of a set using judgment to select ranges of numbers (MPEP 2106.04(a)(2)(III)(C)).
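The partitioning limitation the rejection characterizes as a mental process divides a range of depths d and a range of widths w into sub-ranges, one pair per network space. A minimal sketch of that idea (the function name, grid granularity, and example ranges are illustrative assumptions, not taken from the application):

```python
# Partition an expanded search space, characterized by a first range of
# network depths d and a second range of widths w, into network spaces
# that each get a sub-range of depths and a sub-range of widths.

def partition_search_space(d_range, w_range, splits):
    """Split (d_range, w_range) into a splits x splits grid of sub-spaces."""
    d_lo, d_hi = d_range
    w_lo, w_hi = w_range
    d_step = (d_hi - d_lo) // splits
    w_step = (w_hi - w_lo) // splits
    spaces = []
    for i in range(splits):
        for j in range(splits):
            d_sub = (d_lo + i * d_step, d_lo + (i + 1) * d_step)
            w_sub = (w_lo + j * w_step, w_lo + (j + 1) * w_step)
            spaces.append({"d": d_sub, "w": w_sub})  # one network space
    return spaces

# Hypothetical ranges: depths 1-17, widths 16-144, 4x4 grid of spaces.
spaces = partition_search_space(d_range=(1, 17), w_range=(16, 144), splits=4)
```

Each resulting dict plays the role of one "network space" in the claim language: a sub-range of the first range and a sub-range of the second range.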
The claim recites "sampling respective…architectures in the…spaces," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing or selecting a set from a list of sets (MPEP 2106.04(a)(2)(III)(C)).

The claim recites "evaluating performance of the…spaces by evaluating the respective…architectures with respect to a multi-objective loss function, wherein the evaluated performance is indicated as a probability associated with each…space," which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).

The claim recites "identifying a subset of the…spaces that has highest probabilities," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making an evaluation of multiple sets (MPEP 2106.04(a)(2)(III)(C)). Alternatively, the BRI of the claims can be categorized as an abstract idea using math (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A))).

The claim recites "selecting a target network space from the subset as output of the…space search wherein the target…space has an operation count that is closest to a predetermined target operation count," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making a choice and selecting a particular subset of a set based on a count of the number of operations performed (MPEP 2106.04(a)(2)(III)(C)).

Subject Matter Eligibility Analysis Step 2A Prong 2: "Network/networks" (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h)))

Subject Matter Eligibility Analysis Step 2B: Additional element (a) does not integrate the abstract idea into a practical application because the limitation merely specifies a field of use in which the abstract idea is to take place, i.e.
a field of use (see MPEP 2106.05(h)). The additional element (a) in the claim, when considered separately and in combination, does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Regarding Claim 2:

The rejection of claim 1 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim does not contain elements that would warrant a Step 2A Prong 1 analysis.

Subject Matter Eligibility Analysis Step 2A Prong 2: "wherein each network architecture in the expanded search space includes a stem network to receive an input, a prediction network to generate an output, and a network body that includes the predetermined number of stages" (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h)))

Subject Matter Eligibility Analysis Step 2B: Additional element (a) does not integrate the abstract idea into a practical application because the limitation merely specifies a field of use in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h)). The additional element (a) in the claim, when considered separately and in combination, does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Regarding Claim 3:

The rejection of claim 1 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites "wherein the multi-objective loss function includes a task-specific loss function and a model complexity function," which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).

Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.

Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding Claim 4:

The rejection of claim 3 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "wherein the model complexity function calculates complexity of a network architecture in terms of the number of floating-point operations (FLOPs)," which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).

Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.

Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding Claim 5:

The rejection of claim 3 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "wherein the model complexity function calculates a ratio of a network architecture's floating-point operations (FLOPs) to a predetermined FLOPs constraint," which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).
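The loss recited in claims 3-5 combines a task-specific loss with a model complexity term expressed as a ratio of an architecture's FLOPs to a predetermined FLOPs constraint. A minimal sketch of one such formulation follows; the multiplicative form, the exponent, and all parameter names are illustrative assumptions, not taken from the application or the cited art:

```python
# Multi-objective loss: task-specific loss scaled by a model-complexity
# penalty based on the ratio of actual FLOPs to a FLOPs constraint.
# The multiplicative form and exponent beta are illustrative assumptions.

def multi_objective_loss(task_loss, flops, flops_constraint, beta=0.6):
    """Penalize architectures whose FLOPs exceed the constraint."""
    complexity = (flops / flops_constraint) ** beta  # model complexity term
    return task_loss * max(1.0, complexity)  # no penalty under the budget

# An architecture at 400M FLOPs under a 600M constraint pays no penalty;
# one at 900M FLOPs is penalized.
under = multi_objective_loss(task_loss=2.0, flops=400e6, flops_constraint=600e6)
over = multi_objective_loss(task_loss=2.0, flops=900e6, flops_constraint=600e6)
```

The one-sided `max(1.0, ...)` clamp is one design choice among several; a symmetric penalty would also discourage architectures far under the budget.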
Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.

Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding Claim 6:

The rejection of claim 1 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "wherein selecting the target network space further comprises: choosing the target network space that has a floating-point operations (FLOPs) count closest to a predetermined FLOPs constraint," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making an evaluation and choosing (MPEP 2106.04(a)(2)(III)(C)) and Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)).

Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.

Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding Claim 7:

The rejection of claim 1 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim does not contain elements that would warrant a Step 2A Prong 1 analysis.
Subject Matter Eligibility Analysis Step 2A Prong 2: "wherein each network architecture includes a predetermined number of stages, each stage including d blocks and each block including w channels, wherein each network space is characterized by a first range of d values and a second range of w values" (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h)))

Subject Matter Eligibility Analysis Step 2B: Additional element (a) does not integrate the abstract idea into a practical application because the limitation merely specifies a field of use in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h)). The additional element (a) in the claim, when considered separately and in combination, does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Regarding Claim 8:

The rejection of claim 1 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim does not contain elements that would warrant a Step 2A Prong 1 analysis.

Subject Matter Eligibility Analysis Step 2A Prong 2: "wherein each block is a residual block including two convolution sub-blocks" (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h)))

Subject Matter Eligibility Analysis Step 2B: Additional element (a) does not integrate the abstract idea into a practical application because the limitation merely specifies a field of use in which the abstract idea is to take place, i.e.
a field of use (see MPEP 2106.05(h)). The additional element (a) in the claim, when considered separately and in combination, does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Regarding Claim 9:

The rejection of claim 1 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "sampling the network architectures in each network space using at least a portion of the weights of the super network," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user selecting particular network architectures based on a weight preference. Please see MPEP 2106.04(a)(2)(III)(C).

Subject Matter Eligibility Analysis Step 2A Prong 2: "training a super network with a maximum network depth and a maximum network width to obtain weights" (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f)))

Subject Matter Eligibility Analysis Step 2B: Additional element (a) does not integrate the abstract idea into a practical application, nor does the additional limitation provide significantly more than the abstract idea, because the limitation amounts to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP § 2106.05(f). The additional element (a) in the claim, when considered separately and in combination, does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
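Claim 9, as characterized above, trains one super network at maximum depth and width and then samples sub-architectures that reuse a portion of its weights. A minimal weight-sharing sketch follows; the tensor shapes, the prefix-slicing convention, and all names are illustrative assumptions, not the application's actual method:

```python
# Weight sharing: a super network is "trained" at maximum depth and width,
# and sampled sub-architectures reuse a portion of its weights by slicing.
# Shapes and the prefix-slicing convention are illustrative assumptions.
import random

MAX_DEPTH, MAX_WIDTH = 8, 64

# Stand-in for trained super-network weights: one w x w matrix per layer.
super_weights = [[[0.01 * (l + i + j) for j in range(MAX_WIDTH)]
                  for i in range(MAX_WIDTH)] for l in range(MAX_DEPTH)]

def sample_subnet(d_range, w_range, rng):
    """Sample a (depth, width) pair from a network space's sub-ranges and
    return the corresponding slice of the super-network weights."""
    d = rng.randint(*d_range)  # depth within the space's d sub-range
    w = rng.randint(*w_range)  # width within the space's w sub-range
    weights = [[row[:w] for row in layer[:w]] for layer in super_weights[:d]]
    return d, w, weights

rng = random.Random(0)
d, w, weights = sample_subnet(d_range=(2, 4), w_range=(16, 32), rng=rng)
```

Every sampled sub-architecture shares the leading d x w x w block of the super network's parameters, which is one common way "at least a portion of the weights" is reused in weight-sharing NAS.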
Regarding Claim 10:

The rejection of claim 1 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "wherein evaluating the performance further comprises optimizing a probability distribution over the network spaces," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making changes and evaluating how the changes reflect on the rest of the environment (MPEP 2106.04(a)(2)(III)(C)) and Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)).

Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.

Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding Claim 11:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "partition an expanded search space into a plurality of…spaces with each…space including a plurality of…architectures, wherein the expanded search space is characterized by a first range of…depths and a second range of…widths…" and "partition the expanded search space further includes assigning each…space a sub-range of the first range and a sub-range of the second range," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass defining subsets of a set using judgment to select ranges of numbers (MPEP 2106.04(a)(2)(III)(C)).

The claim recites "sample respective…architectures in the…spaces," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind.
The limitations encompass a user choosing or selecting a set from a list of sets (MPEP 2106.04(a)(2)(III)(C)).

The claim recites "evaluate performance of the…spaces by evaluating the respective…architectures with respect to a multi-objective loss function, wherein the evaluated performance is indicated as a probability associated with each…space," which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).

The claim recites "identify a subset of the…spaces that has highest probabilities," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making an evaluation of multiple sets (MPEP 2106.04(a)(2)(III)(C)). Alternatively, the BRI of the claims can be categorized as an abstract idea using math (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A))).

The claim recites "select a target network space from the subset as output of the…space search wherein the target…space has an operation count that is closest to a predetermined target operation count," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making a choice and selecting a particular subset of a set based on a count of the number of operations performed (MPEP 2106.04(a)(2)(III)(C)).

Subject Matter Eligibility Analysis Step 2A Prong 2: "one or more processors; and memory to store instructions, when executed by the one or more processors, cause the system" (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))); "Network/networks" (merely specifies a particular technological environment in which the abstract idea is to take place, i.e.
a field of use (see MPEP 2106.05(h)))

Subject Matter Eligibility Analysis Step 2B: Additional element (a) does not integrate the abstract idea into a practical application, nor does the additional limitation provide significantly more than the abstract idea, because the limitation amounts to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP § 2106.05(f). Additional element (b) does not integrate the abstract idea into a practical application because the limitation merely specifies a field of use in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h)). The additional elements (a) and (b) in the claim, when considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Regarding claim 12: The rejection of claim 11 is incorporated in claim 12. Claim 12 is rejected under the same rationale as set forth in the rejection of claim 2.

Regarding claim 13: The rejection of claim 11 is incorporated in claim 13. Claim 13 is rejected under the same rationale as set forth in the rejection of claim 3.

Regarding claim 14: The rejection of claim 13 is incorporated in claim 14. Claim 14 is rejected under the same rationale as set forth in the rejection of claim 4.

Regarding claim 15: The rejection of claim 13 is incorporated in claim 15. Claim 15 is rejected under the same rationale as set forth in the rejection of claim 5.

Regarding claim 16: The rejection of claim 11 is incorporated in claim 16. Claim 16 is rejected under the same rationale as set forth in the rejection of claim 6.

Regarding claim 17: The rejection of claim 11 is incorporated in claim 17. Claim 17 is rejected under the same rationale as set forth in the rejection of claim 7.
Regarding claim 18: The rejection of claim 11 is incorporated in claim 18. Claim 18 is rejected under the same rationale as set forth in the rejection of claim 8.

Regarding claim 19: The rejection of claim 11 is incorporated in claim 19. Claim 19 is rejected under the same rationale as set forth in the rejection of claim 9.

Regarding claim 20: The rejection of claim 11 is incorporated in claim 20. Claim 20 is rejected under the same rationale as set forth in the rejection of claim 10.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-4, 7-14 and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Fang et al. ("Densely Connected Search Space for More Flexible Neural Architecture Search", henceforth known as Fang) and Luo et al. ("Semi-Supervised Neural Architecture Search", henceforth known as Luo).

Regarding claim 1:

Fang discloses partitioning an expanded search space into a plurality of network spaces ("We partition the entire network into several stages…This design principle of the super network allows more possibilities of block counts and block widths") with each network space including a plurality of network architectures, wherein the expanded search space is characterized by a first range of network depths and a second range of network widths (Fang, Page 1, Figure 1 and Figure 2, where the figures show that within a stage (denoted by the green lines in Figure 1) a sequence of blocks share the same spatial resolution, the space within stages is separated into blocks with various widths in each stage, and the spatial resolution (H x W (244 x 224 from Figure 2)) is considered a first and second range), and partitioning the expanded search space further includes assigning each network space a sub-range of the first range and a sub-range of the second range (Fang, Page 4, Col. 1, Paragraph 5, "each stage contains routing blocks with various widths and the same spatial resolution" and Fang, Page 2, Col. 2, Paragraph 3, "Not only the number of layers within one block but also the number of blocks within one stage can be searched", where the routing blocks each contain basic layers of the stage, and multiple routing blocks, with each block within the range within each stage, correspond to partitioning the search space further by assigning each network space a sub-range of the first range and a sub-range of the second range).

Fang discloses sampling respective network architectures in the network spaces ("…we sample one path of the candidate operations according to the architecture weight distribution ….
in every basic layer", where sampling a path in every basic layer corresponds to sampling respective network architectures in the network spaces).

Fang discloses evaluating performance of the network spaces by evaluating respective network architectures with respect to a multi-objective loss function, wherein the evaluated performance is indicated as a probability associated with each network space (Page 10632, Col. 2, Equation 8 and Paragraph 2, "We design a loss function with the cost-based regularization to achieve the multi-objective optimization", where α and β both relate to the softmax probabilities of network spaces (see also: Page 10629, Col. 1, Paragraph 4, "We integrate our search space into the differentiable NAS framework by relaxing the search space. We assign a probability parameter to each output path of the routing block")).

Fang discloses identifying a subset of the network spaces that has highest probabilities (Page 10629, Col. 1, Paragraph 4, "The final block connection paths in the super network are derived based on the probability distribution" and Page 10632, Col. 2, Paragraph 6, "At the network level, we use the Viterbi algorithm to derive the paths connecting the blocks with the highest total transition probability based on the output path probabilities").

Fang discloses selecting (Fang, Page 4, Col. 1, Paragraph 4, "In the super network, routing blocks are densely connected and we search for the best path between them to derive the final architecture") a target network space from the subset as output of the network space search (Fang, Page 2, Col. 2, Paragraph 6, "Our proposed method designs a densely connected search space beyond conventional search constrains to generate the architecture with a better trade-off between accuracy and model cost", where selecting the best path based on the trade-off of accuracy and model cost is considered selecting a target network space of the optimized network (See also Fang, Page 2, Col.
1, Paragraph 4, "To optimize the cost (FLOPs/latency) of the network, we design a chained estimation algorithm targeted at approximating the cost of the model during the search")).

Fang does not explicitly disclose, however Luo does disclose, wherein the target network space has an operation count that is closest to a predetermined target operation count (Luo, Page 7, Col. 1, Paragraph 1, "SemiNAS achieves 23.5% top-1 test error rate on ImageNet under the mobile setting (FLOPS ≤ 600 Million)", wherein choosing the network architecture based on ≤ 600M FLOPS corresponds to selecting a target network space that has an operation count that is closest to a predetermined target operation count).

References Fang and Luo are analogous art because they are from the same field of endeavor of using Neural Architecture Search to optimize networks. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Fang and Luo before him or her, to modify the target selection of Fang to include the target operation count of Luo to target specific categories of hardware such as a practical mobile system. The suggestion/motivation for doing so would have been Luo, Page 2, Paragraph 4, "For image classification…we achieve 23.5% top-1 error rate on ImageNet under the mobile setting", where the mobile setting refers to having a target of ≤ 600M FLOPS aimed at testing on mobile categories of hardware.

Regarding claim 2:

The rejection of claim 1 with prior art Fang-Luo is incorporated, and further:

Fang discloses wherein each network architecture in the expanded search space includes a stem network to receive an input (Figure 2 and Page 11, Col.
1, Algorithm 1, "input Block B0", where the explicit input block B0 is considered an input stem network that receives an input, as a routing block receives an input), a prediction network to generate an output ("We assume that the routing block Bi outputs the tensor bi and connects to m subsequent blocks", where basic layers make a routing block, and a routing block output is considered a prediction network generating an output (see also Fang, Page 11, Col. 1, Algorithm 1, "routing blocks {B1…,BN}")), and a network body that includes the predetermined number of stages ("We partition the entire network into several stages… each stage contains routing blocks with various widths", where the partitioning of the network into a super network of stages is considered a network body that includes the predetermined number of stages, as the number of stages is determined by the search space design).

Regarding claim 3:

The rejection of claim 1 with prior art Fang-Luo is incorporated, and further:

Fang discloses wherein the multi-objective loss function includes a task-specific loss function and a model complexity function (Equation 8, where Equation 8 balancing cost (latency/FLOPs) and accuracy is considered a multi-objective function).

Regarding claim 4:

The rejection of claim 3 with prior art Fang-Luo is incorporated, and further:

Fang discloses wherein the model complexity function calculates complexity of a network architecture in terms of the number of floating-point operations (FLOPs) (Fang, Page 5, Col.
1, Equations 5-6 and Paragraph 3, "We propose to optimize both the accuracy and the cost (latency/FLOPs) of the model", where optimizing both the accuracy and cost in terms of FLOPs is considered calculating complexity of a network architecture in terms of the number of floating-point operations (FLOPs)).

Regarding claim 7:

The rejection of claim 1 with prior art Fang-Luo is incorporated, and further:

Fang discloses wherein each network architecture includes a predetermined number of stages ("We partition the entire network into several stages… each stage contains routing blocks with various widths", where the partitioning of the network into a super network of stages is considered including a predetermined number of stages, as the number of stages is determined by the search space design), each stage including d blocks and each block including w channels ("Block denotes a set of layers/operations in the network which output feature maps with the same spatial resolution and the same width (number of channels)", where blocks containing channels are considered blocks, with each block including channels), wherein each network space is characterized by a first range of d values (Fang, Page 2, Col. 2, Paragraph 3, "Not only the number of layers within one block but also the number of blocks within one stage can be searched", where block counts are considered a range of d values that make a d block) and a second range of w values (Fang, Page 4, Col.
1, Paragraph 5,“each stage contains routing blocks with various widths” where widths are considered range of width values) Regarding claim 8: The rejection of claim 1 with prior art Fang-Luo is incorporated and further: Fang discloses wherein each block is a residual block including two convolution sub-blocks(Fang, Page 11, Table 6, where the application of DenseNAS in the ResNet search space shows stage 2-5 having two convolution sub-blocks) Regarding claim 9: The rejection of claim 1 with prior art Fang-Luo is incorporated and further: Fang discloses further comprising: training a super network with a maximum network depth and a maximum network width to obtain weights(“The super network includes all the possible architectures defined in the search space”) and sampling the network architectures in each network space using at least a portion of the weights of the super network(Fang, Page 11, Col. 2, Paragraph 1, “When training the weights of operations, we sample one path of the candidate operations … in every basic layer” where sampling one path of each basic layer during training is considered sampling each network space using at least a portion of the weights of the super network as the sampling reuses a portion of the weights in the path (See Fang, Page 11, Col. 2, Paragraph 1, “The dropping-path strategy not only accelerates the search but also weakens the coupling effect between operation weights shared by different sub-architectures in the search space.”)) Regarding claim 10: The rejection of claim 1 with prior art Fang-Luo is incorporated and further: Fang discloses wherein evaluating the performance further comprises: optimizing a probability distribution over the network spaces(Equation 8, where Equation 8 showing the multi-objective optimization that balances accuracy and efficiency with these parameters balances the accuracy and efficiency is considered optimizing a probability distribution over the network as α(Eq. 
1) is layer-level architecture that weighs candidate operations inside each basic lay via softmax and β(Eq. 3) is the transition probabilities that weight the paths between routing blacks via softmax are both probability distributions) Regarding claim 11: Fang discloses one or more processors; and memory to store instructions, when executed by the one or more processors(Fang, Page 1, Abstract, “DenseNAS achieves 75.3% top-1 accuracy on ImageNet with only 361MB FLOPs and 17.9mslatency on a single TITAN-XP” where a TITAN-XP GPU is considered having one or more processor, memory and executing memory using a processor) Fang discloses partition an expanded search space into a plurality of network spaces(“We partition the entire network into several stages…This design principle of the super network allows more possibilities of block counts and block widths) with each network space including a plurality of network architectures wherein the expanded search space is characterized by a first range of network depths and a second range of network widths (Fang, Page 1, Figure 1 and Figure 2, where the figures shows that within a stage(denoted by the green lines in Figure 1) a sequence of blocks that share the same spatial resolution and the space within stages are separated into blocks with various widths in each stage and the spatial resolution (H x W(244 x 224 from Figure 2)) is considered a first and second range) and partition the expanded search space further includes assigning each network space a sub-range of the first range and a sub-range of the second range(Fang, Page 4, Col. 1, Paragraph 5 “each stage contains routing blocks with various widths and the same spatial resolution” where block widths are considered range of network widths) (Fang, Page 2, Col. 
2, Paragraph 3, “Not only the number of layers within one block but also the number of blocks within one stage can be searched” where the block counts are considered a range of depth) Fang discloses Sampling respective network architectures in the network spaces(“…we sample one path of the candidate operations according to the architecture weight distribution …. in every basic layer” where sampling a path in every basic layer corresponds to sampling respective network architectures in the network spaces ) Fang discloses evaluating performance of the network spaces by evaluating respective network architectures with respect to a multi-objective loss function wherein the evaluated performance is indicated as a probability associated with each network space(Page 10632, Col. 2, Equation 8 and Paragraph 2, “We design a loss function with the cost-based regularization to achieve the multi-objective optimization” where α and β both relate to the softmax probabilities of network spaces(See also: Page 10629, Col. 1, Paragraph 4, “We integrate our search space into the differentiable NAS framework by relaxing the search space. We assign a probability parameter to each output path of the routing block”)) Fang discloses identifying a subset of the network spaces that has highest probabilities(Page 10629, Col. 1, Paragraph 4, “The final block connection paths in the super network are derived based on the probability distribution” and Page 10632, Col. 2, Paragraph 6, “At the network level, we use the Viterbi algorithm to derive the paths connecting the blocks with the highest total transition probability based on the output path probabilities”) Fang discloses selecting(Fang, Page 4, Col. 1, Paragraph 4, “In the super network, routing blocks are densely connected and we search for the best path between them to derive the final architecture”) a target network space from the subset as output of the network space search(Fang, Page 2, Col. 
2, Paragraph 6, “Our proposed method designs a densely connected search space beyond conventional search constrains to generate the architecture with a better trade-off between accuracy and model cost” where the selecting of best path is based on trade-off of accuracy and model cost is considered selecting a target network space of the optimized network (See also Fang, Page 2, Col. 1, Paragraph 4, “To optimize the cost (FLOPs/latency) of the network, we design a chained estimation algorithm targeted at approximating the cost of the model during the search”)) Fang does not explicitly disclose, however Luo does disclose wherein the target network space has an operation count that is closest to a predetermining target operation count (Luo, Page 7, Col. 1, Paragraph 1, “SemiNAS achieves 23.5% top-1 test error rate on ImageNet under the mobile setting (FLOPS ≤ 600 Million)” wherein choosing the network architecture based on <= 600M FLOPS corresponds to selecting a target network space that has an operation count that is closest to a predetermining target operation count) References Fang and Luo are analogous art because they are from the same field of endeavor of using Neural Architecture Search to optimize networks. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Fang and Luo before him or her, to modify the target selection of Fang to include the target operation count of Luo to target specific categories of hardware such as practical mobile system. The suggestion/motivation for doing so would have been Luo, Page 2, Paragraph 4, “For image classification…we achieve 23.5% top-1 error rate on ImageNet under the mobile setting” where the mobile setting refers to having a target of <= 600M FLOPS aimed at testing on mobile categories of hardware. Regarding claim 12: The rejection of claim 11 incorporated in claim 12. 
Claim 12 is rejected under the same rationale as set forth in the rejection of claim 2.

Regarding claim 13: The rejection of claim 11 is incorporated in claim 13. Claim 13 is rejected under the same rationale as set forth in the rejection of claim 3.

Regarding claim 14: The rejection of claim 13 is incorporated in claim 14. Claim 14 is rejected under the same rationale as set forth in the rejection of claim 4.

Regarding claim 17: The rejection of claim 11 is incorporated in claim 17. Claim 17 is rejected under the same rationale as set forth in the rejection of claim 7.

Regarding claim 18: The rejection of claim 11 is incorporated in claim 18. Claim 18 is rejected under the same rationale as set forth in the rejection of claim 8.

Regarding claim 19: The rejection of claim 11 is incorporated in claim 19. Claim 19 is rejected under the same rationale as set forth in the rejection of claim 9.

Regarding claim 20: The rejection of claim 11 is incorporated in claim 20. Claim 20 is rejected under the same rationale as set forth in the rejection of claim 10.

Claims 5-6 and 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over Fang et al. (“Densely Connected Search Space for More Flexible Neural Architecture Search,” henceforth known as Fang) in view of Luo et al. (“Semi-Supervised Neural Architecture Search,” henceforth known as Luo) and in further view of Cai et al. (“Once-for-all: Train one network and specialize it for efficient deployment,” henceforth known as Cai).

Regarding claim 5: The rejection of claim 3 with prior art Fang-Luo is incorporated, and further: Cai discloses wherein the model complexity function calculates a ratio of a network architecture's floating-point operations (FLOPs) to a predetermined FLOPs constraint (Cai, Page 2, Paragraph 4, “On the ImageNet mobile setting (less than 600M MACs), OFA achieves a new SOTA 80.0% top1 accuracy with 595M MACs,” where less than 600M MACs is considered a predetermined FLOPs constraint, as a MAC is considered 1 or 2 FLOPs for purposes of measuring model complexity; see also Cai, Figure 14, “OFA can design specialized models for different hardware and different latency constraint”). References Fang and Cai are analogous art because they are from the same field of endeavor of using neural architecture search with hardware- and cost-aware CNN design. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Fang and Cai before him or her, to modify the multi-objective loss function or Viterbi/path selection of Fang to include the explicit/predetermined FLOPs constraint of Cai to allow for hard FLOPs budgets when choosing/training. The suggestion/motivation for doing so would have been “We apply our trained once-for-all network to get different specialized sub-networks for diverse hardware platforms” (Cai, Page 7, Paragraph 5) and “In particular, OFA achieves a new SOTA 80.0% ImageNet top-1 accuracy under the mobile setting (<600M MACs)” (Cai, Abstract).
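For concreteness, the claim 5 mapping above, a model complexity function expressed as the ratio of an architecture's FLOPs to a predetermined FLOPs constraint, folded into a cost-regularized multi-objective loss, can be sketched as follows. This is a minimal illustration only: the function names, the weighting constant, and the exact functional form are assumptions, not code or equations from Fang or Cai.

```python
# Illustrative sketch (assumed names and constants, not Fang's Equation 8
# or Cai's OFA code): a complexity term as the ratio of an architecture's
# FLOPs to a predetermined FLOPs constraint, combined additively with a
# task-specific loss into a single multi-objective loss.

def complexity_ratio(flops: float, flops_constraint: float) -> float:
    """Model complexity as FLOPs relative to a predetermined budget."""
    return flops / flops_constraint

def multi_objective_loss(task_loss: float, flops: float,
                         flops_constraint: float, weight: float = 0.1) -> float:
    """Task-specific loss plus a cost-based regularization term."""
    return task_loss + weight * complexity_ratio(flops, flops_constraint)

# Example: two hypothetical candidates under a 600M-FLOPs budget. The
# slightly more accurate but much costlier candidate loses overall.
small = multi_objective_loss(task_loss=0.42, flops=360e6, flops_constraint=600e6)
large = multi_objective_loss(task_loss=0.40, flops=900e6, flops_constraint=600e6)
print(round(small, 3), round(large, 3))
```

Because the complexity term is a ratio rather than a raw FLOPs count, the same weighting constant behaves consistently across different budgets, which is the practical point of normalizing against the constraint.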
Regarding claim 6: The rejection of claim 1 with prior art Fang-Luo is incorporated, and further: Cai discloses wherein selecting the target network space further comprises: choosing the target network space (Cai, Abstract, “We can quickly get a specialized sub-network by selecting from the OFA network,” where selecting a sub-network from the OFA network is considered choosing a network space) that has a floating-point operations (FLOPs) count closest to a predetermined FLOPs constraint (Cai, Abstract, “in particular, OFA achieves a new SOTA 80.0% ImageNet top-1 accuracy under the mobile setting (<600M MACs),” where selecting a network based on <600M MACs is considered choosing a network space that has a FLOPs count closest to a predetermined FLOPs constraint). References Fang and Cai are analogous art because they are from the same field of endeavor of using neural architecture search with hardware- and cost-aware CNN design. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Fang and Cai before him or her, to modify the multi-objective loss function or Viterbi/path selection of Fang to include the explicit/predetermined FLOPs constraint of Cai to allow for hard FLOPs budgets when choosing/training. The suggestion/motivation for doing so would have been “We apply our trained once-for-all network to get different specialized sub-networks for diverse hardware platforms” (Cai, Page 7, Paragraph 5) and “In particular, OFA achieves a new SOTA 80.0% ImageNet top-1 accuracy under the mobile setting (<600M MACs)” (Cai, Abstract).

Regarding claim 15: The rejection of claim 13 is incorporated in claim 15. Claim 15 is rejected under the same rationale as set forth in the rejection of claim 5.

Regarding claim 16: The rejection of claim 11 is incorporated in claim 16. Claim 16 is rejected under the same rationale as set forth in the rejection of claim 6.
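The claim 6 limitation discussed above, choosing the network space whose FLOPs count is closest to a predetermined constraint, can be illustrated with a toy selection routine. The candidate names and FLOPs figures below are invented for illustration; only the 600M-style upper bound mirrors the cited mobile-setting budget.

```python
# Illustrative sketch (assumed data, not Cai's OFA code): from a set of
# candidate architectures, choose the one whose FLOPs count is closest to
# a predetermined constraint without exceeding it, treating the budget as
# a hard upper bound.

def select_closest_under_budget(candidates: dict, budget: float) -> str:
    """Return the candidate name with the highest FLOPs count <= budget."""
    feasible = {name: f for name, f in candidates.items() if f <= budget}
    if not feasible:
        raise ValueError("no candidate satisfies the FLOPs constraint")
    # Closest to the budget from below = maximum feasible FLOPs count.
    return max(feasible, key=feasible.get)

# Hypothetical candidates (FLOPs counts are made up for illustration).
candidates = {"tiny": 150e6, "small": 360e6, "base": 595e6, "large": 900e6}
print(select_closest_under_budget(candidates, budget=600e6))
```

Under a 600M budget the routine picks the 595M-FLOPs candidate, reflecting the intuition argued in the Response to Arguments that a best-performing model under a hard budget tends to sit just below it, since more FLOPs generally mean more capacity.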
Relevant Prior Art: While not relied upon in the rejection, Examiner believes the following is relevant art that may also cover limitations in the claims: Zhao et al. (Few-shot Neural Architecture Search), a similar NAS system.

Response to Arguments

Applicant's arguments filed 11/11/2025 have been fully considered, but they are not persuasive. A breakdown of the arguments can be found below.

112: Applicant has overcome the 112 rejection from the previous office action; however, the amended language has introduced separate 112 deficiencies.

101: Applicant appears to argue that mental steps are not recited and are integral to a practical application of neural network design, and cites partitioning as an example. Examiner respectfully disagrees, as the BRI of partitioning does not require the narrow interpretation the applicant advances; partitioning a system, without a specific definition of what partitioning means, encompasses simply separating a set of information. Further, although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993). Applicant appears to be interpreting a narrower claim, as the current claims do not positively recite providing a technological improvement concerning automatic neural network design.

102/103: Applicant appears to argue that the cited reference Fang, using a “network architecture search,” is different from the claimed “network space search.” Applicant further asserts that the prior art does not discuss the amended language. Examiner respectfully disagrees, as network space search is used within network architecture search; Examiner asks Applicant to provide details as to why the “network architecture search” of Fang is not comparable to the claimed “network space search.” Examiner has added Luo, which discusses the newly amended claim language. Applicant appears to argue that finding a best-performing model with a budget of less than 600M MACs is different from the claimed “choosing a target network that has a floating-point operation count closest to a predetermined FLOPS constraint.” Examiner respectfully disagrees, as Cai's 600M MAC budget is a search for the best-performing model with the 600M MAC constraint as a hard upper bound. This discloses choosing a target network space that has a FLOPs count closest to a predetermined FLOPS constraint, as a best-performing model will be one that is close to the FLOPs budget, since more FLOPs mean more capacity (deeper/wider networks with larger kernels).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHARLES JEFFREY JONES JR, whose telephone number is (703) 756-1414. The examiner can normally be reached Monday - Friday, 8:00 - 5:00 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kakali Chaki, can be reached at 571-272-3719. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/C.J.J./Examiner, Art Unit 2122
/KAKALI CHAKI/Supervisory Patent Examiner, Art Unit 2122

Prosecution Timeline

Jun 22, 2022
Application Filed
Sep 05, 2025
Non-Final Rejection — §101, §103, §112
Nov 11, 2025
Response Filed
Feb 21, 2026
Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582959
DATA GENERATION DEVICE AND METHOD, AND LEARNING DEVICE AND METHOD
2y 5m to grant • Granted Mar 24, 2026
Patent 12380333
METHOD OF CONSTRUCTING NETWORK MODEL FOR DEEP LEARNING, DEVICE, AND STORAGE MEDIUM
2y 5m to grant • Granted Aug 05, 2025
Study what changed to get past this examiner. Based on 2 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
27%
Grant Probability
93%
With Interview (+65.9%)
4y 2m
Median Time to Grant
Moderate
PTA Risk
Based on 15 resolved cases by this examiner. Grant probability derived from career allow rate.
