DETAILED ACTION
This action is responsive to the Request for Continued Examination and claims filed on 01/02/2026. Claims 1-20 are pending in the case. Claims 1, 9, and 18 are independent claims. Claims 1, 9, and 18 are amended.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 01/26/2026 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/02/2026 has been entered.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claims are directed to an abstract idea (a mental process and/or a mathematical concept) without significantly more.
Regarding claim 1:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites determining kernel adaptation weights corresponding to weight matrices in a category set represented by a plurality of predetermined discrete values, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing kernel weights. See MPEP 2106.04(a)(2), subsection III.C. Alternatively, the broadest reasonable interpretation of the limitation can be understood to be a mathematical relationship (see MPEP 2106.04(a)(2)(I)(A)).
The claim recites determining a unified kernel by selecting two or more of the weight matrices based on the kernel adaptation weights corresponding to the weight matrices, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user making a choice and selecting matrices using judgment based on kernel weights. See MPEP 2106.04(a)(2), subsection III.C. Alternatively, the broadest reasonable interpretation of the limitation can be understood to be a mathematical relationship (see MPEP 2106.04(a)(2)(I)(A)).
The claim recites performing an operation on the selected weight matrices, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user judging and making evaluations of weight matrices. See MPEP 2106.04(a)(2), subsection III.C. Alternatively, the broadest reasonable interpretation of the limitation can be understood to be a mathematical calculation (see MPEP 2106.04(a)(2)(I)(C)).
The claim recites performing a convolution operation based on the unified kernel, which is an abstract idea (a mathematical calculation; see MPEP 2106.04(a)(2)(I)(C)).
The claim recites wherein the determining of the kernel adaptation weights comprises determining the kernel adaptation weights corresponding to weight matrices in a category set represented by powers of "2", which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user judging and choosing weights using a representation based on powers of 2 (binary encoding). See MPEP 2106.04(a)(2), subsection III.C.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 2:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites generating a plurality of kernel relevance scores corresponding to input data, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user judging a set of data and creating scores. See MPEP 2106.04(a)(2), subsection III.C.
The claim recites determining the kernel adaptation weights based on the kernel relevance scores, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing values based on a set of data. See MPEP 2106.04(a)(2), subsection III.C.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 3:
The rejection of claim 2 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein the determining of the kernel adaptation weights based on the kernel relevance scores comprises determining the kernel adaptation weights by performing Gumbel softmax sampling on the kernel relevance scores, which is an abstract idea (a mathematical calculation; see MPEP 2106.04(a)(2)(I)(C)).
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 4:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein the determining of the kernel adaptation weights comprises determining the kernel adaptation weights corresponding to the weight matrices in a category set represented by "0" and "1", which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user judging and choosing weights using a representation based on 0 and 1. See MPEP 2106.04(a)(2), subsection III.C.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 5:
The rejection of claim 4 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein the determining of the unified kernel comprises determining the unified kernel by summing weight matrices of which the kernel adaptation weights are determined to be "1", which is an abstract idea (a mathematical calculation; see MPEP 2106.04(a)(2)(I)(C)).
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 6:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein the determining of the kernel adaptation weights comprises, in a category set represented by "0" and "1", determining kernel adaptation weights that correspond to one of a plurality of weight matrices to be "1", which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user judging and choosing weights using a representation based on 0 and 1 and choosing the "1" values. See MPEP 2106.04(a)(2), subsection III.C.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 7:
The rejection of claim 6 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites that the determining of the unified kernel comprises determining a weight matrix as the unified kernel, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing an option. See MPEP 2106.04(a)(2), subsection III.C.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 8:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites determining a unified bias based on biases and the kernel adaptation weights, wherein the performing of the convolution operation comprises performing the convolution operation based on input data, the unified kernel, and the unified bias, which is an abstract idea (a mathematical calculation; see MPEP 2106.04(a)(2)(I)(C)).
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 9:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim does not recite additional limitations that would warrant a separate Step 2A Prong 1 analysis.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim recites a non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors, configure the one or more processors, which merely recites a generic computer on which to perform the abstract idea, i.e., "apply it" on a computer (see MPEP 2106.05(f)).
Subject Matter Eligibility Analysis Step 2B:
The additional element does not integrate the abstract idea into a practical application, nor does it provide significantly more than the abstract idea, because it amounts to no more than mere instructions to apply the exception using a generic computer component. See MPEP 2106.05(f).
The additional element in claim 9, considered separately and in combination with the other limitations, does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Regarding claim 10:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites determining kernel adaptation weights corresponding to weight matrices in a category set represented by a plurality of predetermined discrete values, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing kernel weights. See MPEP 2106.04(a)(2), subsection III.C. Alternatively, the broadest reasonable interpretation of the limitation can be understood to be a mathematical relationship (see MPEP 2106.04(a)(2)(I)(A)).
The claim recites determining a unified kernel by selecting two or more of the weight matrices based on the kernel adaptation weights corresponding to the weight matrices, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user making a choice and selecting matrices using judgment based on kernel weights. See MPEP 2106.04(a)(2), subsection III.C. Alternatively, the broadest reasonable interpretation of the limitation can be understood to be a mathematical relationship (see MPEP 2106.04(a)(2)(I)(A)).
The claim recites performing an operation on the selected weight matrices, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user judging and making evaluations of weight matrices. See MPEP 2106.04(a)(2), subsection III.C. Alternatively, the broadest reasonable interpretation of the limitation can be understood to be a mathematical calculation (see MPEP 2106.04(a)(2)(I)(C)).
The claim recites performing a convolution operation based on the unified kernel, which is an abstract idea (a mathematical calculation; see MPEP 2106.04(a)(2)(I)(C)).
The claim recites wherein, for the determining of the kernel adaptation weights, the one or more processors are configured to determine the kernel adaptation weights corresponding to weight matrices in a category set represented by powers of "2", which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user judging and choosing weights using a representation based on powers of 2 (binary encoding). See MPEP 2106.04(a)(2), subsection III.C.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim recites one or more processors, which merely recites a generic computer on which to perform the abstract idea, i.e., "apply it" on a computer (see MPEP 2106.05(f)).
Subject Matter Eligibility Analysis Step 2B:
The additional element does not integrate the abstract idea into a practical application, nor does it provide significantly more than the abstract idea, because it amounts to no more than mere instructions to apply the exception using a generic computer component. See MPEP 2106.05(f).
The additional element in claim 10, considered separately and in combination with the other limitations, does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Regarding claims 11-17:
Claims 11-17 are rejected under the same § 101 analysis due to the substantial similarity between the limitations and additional elements of claims 2-8 and those of claims 11-17, respectively.
Regarding claim 18:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites determining discrete valued kernel adaptation weights for weight matrices based on input data, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing kernel weights. See MPEP 2106.04(a)(2), subsection III.C. Alternatively, the broadest reasonable interpretation of the limitation can be understood to be a mathematical relationship (see MPEP 2106.04(a)(2)(I)(A)).
The claim recites determining a unified kernel by selecting two or more of the weight matrices based on the kernel adaptation weights corresponding to the weight matrices, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user making a choice and selecting matrices using judgment based on kernel weights. See MPEP 2106.04(a)(2), subsection III.C. Alternatively, the broadest reasonable interpretation of the limitation can be understood to be a mathematical relationship (see MPEP 2106.04(a)(2)(I)(A)).
The claim recites performing an operation on the selected weight matrices, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user judging and making evaluations of weight matrices. See MPEP 2106.04(a)(2), subsection III.C. Alternatively, the broadest reasonable interpretation of the limitation can be understood to be a mathematical calculation (see MPEP 2106.04(a)(2)(I)(C)).
The claim recites generating an output by performing convolution between the input data and the unified kernel, which is an abstract idea (a mathematical calculation; see MPEP 2106.04(a)(2)(I)(C)).
The claim recites wherein the determining of the discrete valued kernel adaptation weights comprises determining the kernel adaptation weights corresponding to the weight matrices in a category set represented by powers of "2", which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user judging and choosing weights using a representation based on powers of 2 (binary encoding). See MPEP 2106.04(a)(2), subsection III.C.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 19:
The rejection of claim 18 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites that the determining of the unified kernel comprises selecting one of the weight matrices as the unified kernel, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing an option. See MPEP 2106.04(a)(2), subsection III.C.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 20:
The rejection of claim 19 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein the selecting of the one of the weight matrices comprises selecting one of the weight matrices corresponding to a predetermined weight among the kernel adaptation weights, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing an option. See MPEP 2106.04(a)(2), subsection III.C.
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not recite additional elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element that, considered separately or in combination, amounts to an integration of the judicial exception into a practical application, or to significantly more than the judicial exception. The claim is not patent eligible.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Bulat et al. ("HIGH-CAPACITY EXPERT BINARY NETWORKS," hereinafter "Bulat") in view of Juefei-Xu et al. ("Local Binary Convolutional Neural Networks," hereinafter "Juefei-Xu").
Regarding claim 1:
Bulat discloses a method with dynamic convolution (Bulat, Abstract, "we propose Expert Binary Convolution, which, for the first time, tailors conditional computing to binary networks by learning to select one data-specific expert binary filter at a time conditioned on input features.").
Bulat discloses determining kernel adaptation weights (Bulat, Page 4, Equation 2, where Equation 2 is considered to determine kernel adaptation weights, as the selection applied to each kernel to determine which kernel(s) will be used for convolution, namely the aggregating function used with the gating function φ(ψ(x)), computes/selects weights for the kernels) corresponding to weight matrices in a category set (Bulat, Page 4, Equation 2, where θ are individual expert weight matrices and ϴ is a matrix containing the expert kernels, which is considered a category set) represented by a plurality of predetermined discrete values (Bulat, Page 4, Equations 2 and 3, where the aggregating function used with the gating function φ(ψ(x)) in Equation 2 is defined in Equation 3 as using a discrete selection space for the adaptation weights, specifically 1 or 0).
Bulat discloses determining a unified kernel (Bulat, Page 4, Equation 2, where Equation 2 is considered to determine a unified kernel (called an expert kernel in the prior art)) by selecting two or more of the weight matrices (Bulat, Page 3, Paragraph 7, "…we propose to learn a set of expert weights (or simply experts) {θ0, θ1,…, θN-1}, θi ∈ ℝ^(Cin×Cout×kH×kW) alongside a selector gating function which, given input x, selects only a single expert to be applied to it," where the gating function determines the expert kernel by selecting from a pool of two or more expert weight matrices, which corresponds to determining a unified kernel by selecting two or more weight matrices), based on the kernel adaptation weights corresponding to the weight matrices (Bulat, Page 4, Equation 2, where θ are individual expert weight matrices and ϴ is a matrix containing all N expert weight tensors), and performing an operation on the selected weight matrices (Bulat, Page 4, Equation 2 and Paragraph 1, "and (.)r simply reshapes its argument to a tensor of appropriate dimensions," where (.)r reshaping the final selected expert kernel to a tensor of appropriate dimensions corresponds to performing an operation on the selected weight matrices).
Bulat discloses performing a convolution operation based on the unified kernel (Bulat, Page 4, Equation 2, where Equation 2 is considered to perform a convolution operation on input x based on the unified kernel (called an expert kernel in the prior art); see also Bulat, Page 3, Paragraph 7, "In contrast to a normal convolution that applies the same weights to all input features, we propose to learn a set of expert weights (or simply experts) {θ0, θ1,…, θN-1}, θi ∈ ℝ^(Cin×Cout×kH×kW) alongside a selector gating function which, given input x, selects only a single expert to be applied to it.").
Bulat does not disclose, however Juefei-Xu discloses, wherein the determining of the kernel adaptation weights comprises determining the kernel adaptation weights corresponding to weight matrices in a category set represented by powers of "2" (Juefei-Xu, "Standard formulations of LBP are simply a weighted sum of all the bit maps using a pre-defined weight vector v = [2^7, 2^6, 2^5, 2^4, 2^3, 2^2, 2^1, 2^0]," where defining the kernel weight vector v with values 2^x corresponds to a category set represented by powers of 2).
References Bulat and Juefei-Xu are analogous art because they are from the same field of endeavor, namely binary convolutional neural networks.
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Bulat and Juefei-Xu before him or her, to modify the convolutional kernels of Bulat to include the binary kernels of Juefei-Xu in order to obtain computational savings through binary convolutions and to assist resource-constrained platforms. The suggestion/motivation for doing so is found in Juefei-Xu, Page 1, Col. 1, Paragraph 1: "…There is a growing need for deploying…these systems on resource constrained platforms like, autonomous cars, robots, smartphones, smart cameras, smart wearable devices, etc…Binary weights bear dramatic computational savings through efficient implementations of binary convolutions."
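For illustration only, and not as part of the claim mapping, the single-expert selection described above (cf. Bulat, Equations 2 and 3) can be sketched as follows; the function and variable names are hypothetical and do not appear in the reference.

```python
import numpy as np

def unify_kernel(relevance_scores, experts):
    """Illustrative sketch of one-hot expert selection (cf. Bulat, Eqs. 2-3).

    relevance_scores: one score per expert kernel, e.g. produced by a
    gating function from spatially averaged input features.
    experts: list of weight matrices (the category set).
    """
    # Discrete adaptation weights w_i in {0, 1}: a one-hot argmax selection
    weights = np.zeros(len(experts))
    weights[int(np.argmax(relevance_scores))] = 1.0
    # Weighted sum over the stacked, flattened expert kernels; with a
    # one-hot weight vector this reduces to picking a single weight
    # matrix, which is then reshaped back to kernel dimensions
    stacked = np.stack([k.ravel() for k in experts])
    unified = (weights @ stacked).reshape(experts[0].shape)
    return weights, unified

experts = [np.full((3, 3), 0.5), np.full((3, 3), -0.5)]
w, unified = unify_kernel([0.2, 1.3], experts)
# w is [0., 1.] and unified equals the second expert kernel
```

The sketch illustrates why the rejection treats the selection as a mathematical calculation: with discrete weights it reduces to a weighted sum and an argmax.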
Regarding claim 2:
The rejection of claim 1 with prior art Bulat is incorporated and further:
Bulat discloses wherein the determining of the kernel adaptation weights comprises: generating a plurality of kernel relevance scores corresponding to input data (Bulat, Page 4, Equation 5, where x is the input data and ψ(x) computes the spatially averaged input features, yielding N kernel relevance scores, where N is the total number of expert kernels), and determining the kernel adaptation weights based on the kernel relevance scores (Bulat, Page 4, Equations 2 and 3, where the φ(ψ(x)) portion of Equation 2 produces the adaptation weights based on argmax(ψ(x)), as defined in Equation 3, which indicates which expert kernel to use and is based on the kernel relevance scores).
Regarding claim 3:
The rejection of claim 2 with prior art Bulat is incorporated and further:
Bulat discloses wherein the determining of the kernel adaptation weights based on the kernel relevance scores comprises determining the kernel adaptation weights by performing Gumbel softmax sampling (Bulat, Page 3, Paragraph 1, "we note that our single expert selection mechanism is akin to the Gumbel-max trick…the Gumbel-Softmax Estimator") on the kernel relevance scores (Bulat, "To this end, we propose, for the backward pass, to use the Softmax function for approximating the gradients φ(.)," where the use of the softmax function to approximate the gradients of φ(.), producing a probability distribution across the expert kernels, is considered determining kernel adaptation weights by performing Gumbel softmax sampling on the kernel relevance scores).
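For illustration only, a Gumbel softmax relaxation of the argmax selection referenced above can be sketched as follows; the names are hypothetical, and Bulat uses the softmax only to approximate gradients in the backward pass.

```python
import numpy as np

def gumbel_softmax(relevance_scores, tau=1.0, rng=None):
    """Illustrative Gumbel softmax sample over kernel relevance scores."""
    rng = rng if rng is not None else np.random.default_rng(0)
    # Adding Gumbel(0, 1) noise and taking the argmax yields a categorical
    # sample (the Gumbel-max trick); the temperature-tau softmax below is
    # its continuous, differentiable relaxation
    g = -np.log(-np.log(rng.uniform(size=len(relevance_scores))))
    z = (np.asarray(relevance_scores, dtype=float) + g) / tau
    # Numerically stable softmax; smaller tau pushes samples toward one-hot
    e = np.exp(z - z.max())
    return e / e.sum()

w = gumbel_softmax([0.2, 1.3, 0.5])
# w is a probability vector over the expert kernels: nonnegative, sums to 1
```

At inference the hard argmax is used; the relaxation matters only for training, which is why the rejection treats it as a mathematical calculation.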
Regarding claim 4:
The rejection of claim 1 with prior art Bulat is incorporated and further:
Bulat discloses wherein the determining of the kernel adaptation weights comprises determining the kernel adaptation weights corresponding to the weight matrices in a category set represented by "0" and "1" (Bulat, Page 4, Equations 2 and 3, where the aggregating function used with the gating function φ(ψ(x)) of Equation 2 is defined in Equation 3 as using a discrete selection space for the adaptation weights, specifically 1 or 0).
Regarding claim 5:
The rejection of claim 4 with prior art Bulat is incorporated and further:
Bulat discloses wherein the determining of the unified kernel comprises determining the unified kernel by summing weight matrices of which the kernel adaptation weights are determined to be "1" (Bulat, Page 4, Equations 2 and 3, where ϴ is the stacked matrix of flattened θi, φ(ψ(x))^T is a row vector with a single "1" at position i*, and φ(ψ(x))^Tϴ computes the weighted sum over the rows of ϴ using wi ∈ {0, 1} via matrix multiplication, which is considered summing the weight matrices whose kernel adaptation weights are determined to be "1" to determine a unified kernel).
Regarding claim 6:
The rejection of claim 1 with prior art Bulat is incorporated and further:
Bulat discloses wherein the determining of the kernel adaptation weights comprises, in a category set represented by "0" and "1" (Bulat, Page 4, Equations 2 and 3, where the aggregating function used with the gating function φ(ψ(x)) of Equation 2 is defined in Equation 3 as using a discrete selection space for the adaptation weights, specifically 1 or 0), determining kernel adaptation weights that correspond to one of a plurality of weight matrices to be "1" (Bulat, Page 4, Equations 2 and 3, where ϴ is the stacked matrix of flattened θi, φ(ψ(x))^T is a row vector with a single "1" at position i*, and φ(ψ(x))^Tϴ computes the weighted sum over the rows of ϴ using wi ∈ {0, 1}, which is considered determining adaptation weights that correspond to one of a plurality of weight matrices to be "1," as the weights used correspond to the "1" in φ(ψ(x))^T).
Regarding claim 7:
The rejection of claim 6 with prior art Bulat is incorporated and further:
Bulat discloses wherein the determining of the unified kernel comprises determining the one of the plurality of weight matrices as the unified kernel (Bulat, Page 1, Paragraph 2, "During inference, a very light-weight gating function, dynamically selects a single expert for each input sample and uses it to process the input features," where using a light-weight gating function at inference to select a single expert kernel from multiple expert kernel matrices is considered determining one weight matrix from a plurality of weight matrices as the unified kernel).
Regarding claim 8:
The rejection of claim 1 with prior art Bulat is incorporated and further:
Bulat discloses determining a unified bias based on biases and the kernel adaptation weights(Bulat, "In order to minimize the reconstruction error between the full precision and binary convolution…channel-wise real-valued scaling factors are used to modulate the output of the binary convolutions…In this work, we adopted the latter, learning one scaling factor per channel via back-propagation." where learning scaling factors is considered determining a unified bias), wherein the performing of the convolution operation comprises performing the convolution operation based on input data, the unified kernel(Bulat, Page 4, Equation 2 and Equation 3, where x is the input data and (φ(ψ(x))Tϴ)r produces a unified kernel), and the unified bias(Bulat, Page 3, Equation 1, where BConv(x, θ) uses the Hadamard product with the learned scaling factor α, which is considered a unified bias)
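The channel-wise scaling described in the quoted passage can be illustrated as below; the shapes and α values are hypothetical, and the binary-convolution output is simulated rather than computed:

```python
import numpy as np

# Illustrative sketch only: channel-wise real-valued scaling factors alpha
# modulate a binary convolution's output via an elementwise (Hadamard-style)
# product, one learned scale per output channel.
C, H, W = 3, 4, 4
rng = np.random.default_rng(0)
binary_out = np.where(rng.standard_normal((C, H, W)) >= 0, 1.0, -1.0)  # values in {-1, +1}
alpha = np.array([0.5, 1.0, 2.0])             # hypothetical per-channel scales
scaled = alpha[:, None, None] * binary_out    # modulate each channel's output
assert scaled.shape == (C, H, W)
assert np.allclose(np.abs(scaled[2]), 2.0)    # channel 2 magnitudes scaled to 2.0
```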
Regarding claim 9:
The rejection of claim 1 with prior art Bulat is incorporated and further:
Bulat discloses a non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors, configure the one or more processors to perform the method of claim 1 (Bulat, Page 8, Paragraph 2, “Table 5 shows our results ImageNet” where the use of ImageNet requires the use of a computer which uses memory and a processor)
Regarding claim 10:
Bulat discloses one or more processors(Bulat, Page 8, Paragraph 2, "Table 5 shows our results ImageNet" where the use of ImageNet requires the use of a computer which uses memory and a processor) configured to: determine kernel adaptation weights(Bulat, Page 4, Equation 2, where Equation 2 is considered to determine kernel adaptation weights as the selection applied to each kernel to determine which kernel(s) will be used for convolution, as the aggregating function used with the gating function(φ(ψ(x))) computes/selects weights in kernels) corresponding to weight matrices in a category set(Bulat, Page 4, Equation 2, where θi are individual expert weight matrices and ϴ is a matrix containing the expert kernels, which is considered a category set) represented by a plurality of predetermined discrete values(Bulat, Page 4, Equations 2 and 3, where the aggregating function used with the gating function(φ(ψ(x))) in Equation 2 is defined in Equation 3 as using a discrete selection space for the adaptation weights, specifically 1 or 0)
Bulat discloses determining a unified kernel(Bulat, Page 4, Equation 2, where Equation 2 is considered to determine a unified kernel(called an expert kernel in the prior art)) by selecting two or more of the weight matrices(Bulat, Page 3, Paragraph 7, "…we propose to learn a set of expert weights (or simply experts) {θ0, θ1,…, θN-1}, θi ∈ ℝCin ×Cout ×KH × kW alongside a selector gating function which, given input x, selects only a single expert to be applied to it" where the gating function selects an expert kernel from a pool of two or more weight matrices, and determining an expert kernel from that selection corresponds to determining a unified kernel by selecting from two or more weight matrices), based on the kernel adaptation weights corresponding to the weight matrices(Bulat, Page 4, Equation 2, where θi are individual expert weight matrices and ϴ is a matrix containing all N expert weight tensors), and performing an operation on the selected weight matrices(Bulat, Page 4, Equation 2 and Paragraph 1, "and (.)r simply reshapes its argument to a tensor of appropriate dimensions" where (.)r reshaping the final selected expert kernel to a tensor of appropriate dimensions corresponds to performing an operation on the selected weight matrices)
Bulat discloses performing a convolution operation based on the unified kernel(Bulat, Page 4, Equation 2, where Equation 2 is considered performing a convolution operation based on the unified kernel, as convolution is performed with the unified kernel(called an expert kernel in the prior art) on input x(See Also: Bulat, Page 3, Paragraph 7, "In contrast to a normal convolution that applies the same weights to all input features, we propose to learn a set of expert weights (or simply experts) {θ0, θ1,…, θN-1}, θi ∈ ℝCin ×Cout ×KH × kW alongside a selector gating function which, given input x, selects only a single expert to be applied to it."))
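The select-then-convolve flow described above can be sketched as follows; the gating rule, kernel values, and input are hypothetical stand-ins for the reference's learned gating function and experts:

```python
import numpy as np

# Illustrative sketch only: a light-weight gate picks a single expert kernel
# per input, and a plain single-channel 2D convolution (valid padding) is
# then performed with the selected kernel.
def gate(x, experts):
    # hypothetical gating rule: choose by the sign of the input mean
    return experts[0] if x.mean() >= 0 else experts[1]

def conv2d_valid(x, k):
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

experts = [np.ones((3, 3)), -np.ones((3, 3))]  # two hypothetical expert kernels
x = np.ones((5, 5))                            # hypothetical input feature map
y = conv2d_valid(x, gate(x, experts))          # convolve with the selected expert
assert y.shape == (3, 3) and np.allclose(y, 9.0)
```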
Bulat does not disclose, however Juefei-Xu discloses, wherein the determining of the kernel adaptation weights comprises determining the kernel adaptation weights corresponding to weight matrices in a category set represented by powers of "2"(Juefei-Xu, "Standard formulations of LBP are simply a weighted sum of all the bit maps using a pre-defined weight vector v = [2^7, 2^6, 2^5, 2^4, 2^3, 2^2, 2^1, 2^0]" where defining the kernel weight vector v with powers of 2 corresponds to a category set represented by powers of 2)
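The quoted powers-of-two weighting can be illustrated as a weighted sum of binary maps; the bit values here are hypothetical:

```python
import numpy as np

# Illustrative sketch only: LBP-style aggregation as a weighted sum of eight
# binary maps using the fixed weight vector v = [2^7, 2^6, ..., 2^0].
v = 2 ** np.arange(7, -1, -1)                 # [128, 64, 32, 16, 8, 4, 2, 1]
bits = np.array([1, 0, 1, 0, 0, 0, 0, 1])     # hypothetical binary bit maps (scalars here)
value = int(bits @ v)                         # weighted sum over powers of 2
assert value == 128 + 32 + 1                  # 161
```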
References Bulat and Juefei-Xu are analogous art because they are from the same field of endeavor, namely the design of efficient convolutional neural network kernels with binary/quantized weights.
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Bulat and Juefei-Xu before him or her, to modify the convolutional kernels of Bulat to include the binary kernels of Juefei-Xu for the computational savings of binary convolutions on resource-constrained platforms. The suggestion/motivation for doing so is found in Juefei-Xu, Page 1, Col. 1, Paragraph 1: "…There is a growing need for deploying…these systems on resource constrained platforms like, autonomous cars, robots, smartphones, smart cameras, smart wearable devices, etc…Binary weights bear dramatic computational savings through efficient implementations of binary convolutions".
Regarding claims 11-17:
Regarding claim 11, the rejection of claim 10 is incorporated into claim 11, and further, claim 11 is rejected under the same rationale as set forth in the rejection of claim 2.
Regarding claim 12, the rejection of claim 11 is incorporated into claim 12, and further, claim 12 is rejected under the same rationale as set forth in the rejection of claim 3.
Regarding claim 13, the rejection of claim 10 is incorporated into claim 13, and further, claim 13 is rejected under the same rationale as set forth in the rejection of claim 4.
Regarding claim 14, the rejection of claim 13 is incorporated into claim 14, and further, claim 14 is rejected under the same rationale as set forth in the rejection of claim 5.
Regarding claim 15, the rejection of claim 10 is incorporated into claim 15, and further, claim 15 is rejected under the same rationale as set forth in the rejection of claim 6.
Regarding claim 16, the rejection of claim 15 is incorporated into claim 16, and further, claim 16 is rejected under the same rationale as set forth in the rejection of claim 7.
Regarding claim 17, the rejection of claim 10 is incorporated into claim 17, and further, claim 17 is rejected under the same rationale as set forth in the rejection of claim 8.
Regarding claim 18:
Bulat discloses determining kernel adaptation weights(Bulat, Page 4, Equation 2, where Equation 2 is considered to determine kernel adaptation weights as the selection applied to each kernel to determine which kernel(s) will be used for convolution, as the aggregating function used with the gating function(φ(ψ(x))) computes/selects weights in kernels) corresponding to weight matrices in a category set(Bulat, Page 4, Equation 2, where θi are individual expert weight matrices and ϴ is a matrix containing the expert kernels, which is considered a category set) represented by a plurality of predetermined discrete values(Bulat, Page 4, Equations 2 and 3, where the aggregating function used with the gating function(φ(ψ(x))) in Equation 2 is defined in Equation 3 as using a discrete selection space for the adaptation weights, specifically 1 or 0)
Bulat discloses determining a unified kernel(Bulat, Page 4, Equation 2, where Equation 2 is considered to determine a unified kernel(called an expert kernel in the prior art)) by selecting two or more of the weight matrices(Bulat, Page 3, Paragraph 7, "…we propose to learn a set of expert weights (or simply experts) {θ0, θ1,…, θN-1}, θi ∈ ℝCin ×Cout ×KH × kW alongside a selector gating function which, given input x, selects only a single expert to be applied to it" where the gating function selects an expert kernel from a pool of two or more weight matrices, and determining an expert kernel from that selection corresponds to determining a unified kernel by selecting from two or more weight matrices), based on the kernel adaptation weights corresponding to the weight matrices(Bulat, Page 4, Equation 2, where θi are individual expert weight matrices and ϴ is a matrix containing all N expert weight tensors), and performing an operation on the selected weight matrices(Bulat, Page 4, Equation 2 and Paragraph 1, "and (.)r simply reshapes its argument to a tensor of appropriate dimensions" where (.)r reshaping the final selected expert kernel to a tensor of appropriate dimensions corresponds to performing an operation on the selected weight matrices)
Bulat discloses performing a convolution operation based on the unified kernel(Bulat, Page 4, Equation 2, where Equation 2 is considered performing a convolution operation based on the unified kernel, as convolution is performed with the unified kernel(called an expert kernel in the prior art) on input x(See Also: Bulat, Page 3, Paragraph 7, "In contrast to a normal convolution that applies the same weights to all input features, we propose to learn a set of expert weights (or simply experts) {θ0, θ1,…, θN-1}, θi ∈ ℝCin ×Cout ×KH × kW alongside a selector gating function which, given input x, selects only a single expert to be applied to it."))
Bulat does not disclose, however Juefei-Xu discloses, wherein the determining of the kernel adaptation weights comprises determining the kernel adaptation weights corresponding to weight matrices in a category set represented by powers of "2"(Juefei-Xu, "Standard formulations of LBP are simply a weighted sum of all the bit maps using a pre-defined weight vector v = [2^7, 2^6, 2^5, 2^4, 2^3, 2^2, 2^1, 2^0]" where defining the kernel weight vector v with powers of 2 corresponds to a category set represented by powers of 2)
References Bulat and Juefei-Xu are analogous art because they are from the same field of endeavor, namely the design of efficient convolutional neural network kernels with binary/quantized weights.
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Bulat and Juefei-Xu before him or her, to modify the convolutional kernels of Bulat to include the binary kernels of Juefei-Xu for the computational savings of binary convolutions on resource-constrained platforms. The suggestion/motivation for doing so is found in Juefei-Xu, Page 1, Col. 1, Paragraph 1: "…There is a growing need for deploying…these systems on resource constrained platforms like, autonomous cars, robots, smartphones, smart cameras, smart wearable devices, etc…Binary weights bear dramatic computational savings through efficient implementations of binary convolutions".
Regarding claim 19:
The rejection of claim 18 with prior art Bulat is incorporated and further:
Bulat discloses wherein the determining of the unified kernel comprises selecting one of the weight matrices as the unified kernel(Bulat, Page 1, Paragraph 2, “During inference, a very light-weight gating function, dynamically selects a single expert for each input sample and uses it to process the input features” where inference to select a single kernel is considered determining one weight matrix as a unified kernel)
Regarding claim 20:
The rejection of claim 19 with prior art Bulat is incorporated and further:
Bulat discloses wherein the selecting of the one of the weight matrices comprises selecting one of the weight matrices corresponding to a predetermined weight among the kernel adaptation weights(Bulat, Page 4, Equations 2 and 3, where the aggregating function used with the gating function(φ(ψ(x))) of Equation 2 is defined in Equation 3 as using a predetermined, discrete selection space for the adaptation weights, specifically 1 or 0)
Response to Arguments
Applicant's arguments filed 01/02/2026 have been fully considered but they are not persuasive. A breakdown for the arguments can be found below.
101:
Applicant appears to argue on pages 6-9 that the claims do not recite an abstract idea, citing Ex parte Desjardins, Example 38, Example 41, and the 2019 guidance to draw comparisons between the claims and the guidance/examples, and to argue that the claims do not recite mathematical calculations/relationships/formulas and that the current claims are merely based on mathematical concepts without reciting mathematical concepts in the claims.
Examiner respectfully disagrees, as the limitation noted as performing a convolution operation based on the unified kernel is considered a mathematical calculation. Further, the BRI of the claims encompasses the mental process of choosing/judging/selecting, noted as determining kernel adaptation weights corresponding to weight matrices in a category set represented by a plurality of predetermined discrete values and determining a unified kernel by selecting two or more of the weight matrices based on the kernel adaptation weights corresponding to the weight matrices. While Applicant cites Ex parte Desjardins, Example 38, Example 41, and the 2019 guidance, Applicant does not highlight any specific limitation to which these references apply. Examiner does not see any comparison between the noted references and the claims or limitations.
Applicant appears to argue on page 10 that the claims do not recite an abstract idea, citing the 2019 guidance to draw comparisons between the claims and the guidance/examples and to argue that the claims do not recite a mental process or abstract idea, as Applicant argues that the "determining kernel weights" limitation cannot practically be performed in the human mind because the human mind is not equipped to perform such complex actions.
Examiner respectfully disagrees, as the complexity of the computations is not taken into account in relation to the mathematical concepts grouping; such limitations are categorized into the mathematical concepts grouping of abstract ideas (see Flook, 437 U.S. at 591-92, 198 USPQ2d at 198 ("the novelty of the mathematical algorithm is not a determining factor at all")). Further, limitations that cover performance in the mind with a physical aid such as pen and paper can be considered abstract ideas, and claims that require a computer may still recite a mental process (see MPEP 2106.04(a)(2).III.C).
Applicant appears to argue on pages 10-13 that the current claims present a practical application that provides an improvement in the functioning of a computer by reducing operational overhead and quantization error, citing to the specification. Applicant asserts the technical improvements are realized in the claimed features via limitations such as determining kernel adaptation weights corresponding to weight matrices in a category set represented by a plurality of predetermined discrete values.
Examiner respectfully disagrees, as the claims appear to recite a generic computer component to perform the abstract ideas ("apply it on a computer" (see MPEP 2106.05(f))) and fail to integrate the abstract idea into a practical application or amount to "significantly more," as claims that require a computer may still recite a mental process (see MPEP 2106.04(a)(2).III.C). Further, the technological improvements argued are, based on Examiner's review of the arguments, understood to be improvements provided by the claimed abstract idea itself, i.e., determining kernel adaptation weights and determining a unified kernel by selecting two or more of the weight matrices based on the kernel adaptation weights, which does not result in an improvement in technology, merely an improvement in the mental process of selecting/choosing, as the claims as presented do not recite or highlight an improvement in neural networks or hardware processors. Examiner notes MPEP 2106.05(a), which provides the requirements for how an improvement to the functioning of a computer or to any other technology or technical field is evaluated. Even further, although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993). Applicant appears to be interpreting a narrower claim, as the current claims do not positively recite providing a technological improvement concerning the models providing stability to edge devices.
102:
Applicant argues that Bulat does not teach the amended limitation "wherein the determining of the kernel adaptation weights comprises determining the kernel adaptation weights corresponding to weight matrices in a category set represented by powers of '2'."
The previously cited references alone do not disclose the added limitation; however, Examiner has added the Juefei-Xu prior art reference to disclose the added limitation.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHARLES JEFFREY JONES JR whose telephone number is (703)756-1414. The examiner can normally be reached Monday - Friday 8:00 - 5:00 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kakali Chaki can be reached at 571-272-3719. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/C.J.J./Examiner, Art Unit 2122
/KAKALI CHAKI/Supervisory Patent Examiner, Art Unit 2122