Prosecution Insights
Last updated: April 19, 2026
Application No. 17/772,157

INTEGRATED ANALYSIS METHOD, INTEGRATED ANALYSIS APPARATUS, AND COMPUTER-READABLE STORAGE MEDIUM STORING AN INTEGRATED ANALYSIS PROGRAM

Non-Final OA (§101, §103)
Filed: Apr 27, 2022
Examiner: HOANG, AMY P
Art Unit: 2143
Tech Center: 2100 — Computer Architecture & Software
Assignee: Omron Corporation
OA Round: 3 (Non-Final)
Grant Probability: 70% (Favorable)
OA Rounds: 3-4
To Grant: 3y 3m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 70% (163 granted / 232 resolved; +15.3% vs TC avg), above average
Interview Lift: +64.2% (resolved cases with interview vs. without)
Avg Prosecution: 3y 3m (31 currently pending)
Total Applications: 263 (across all art units)

Statute-Specific Performance

§101: 15.9% (-24.1% vs TC avg)
§103: 46.0% (+6.0% vs TC avg)
§102: 17.0% (-23.0% vs TC avg)
§112: 13.4% (-26.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 232 resolved cases.

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/28/2025 has been entered.

Response to Amendment

The Amendment filed on 11/28/2025 has been entered. Claim 12 is canceled. Claims 1-11 and 13-20 remain pending in the application.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Claims 1-11 and 13-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1: Claims 1-11, 13 and 16-20 are directed to a method, claim 14 is directed to an apparatus, and claim 15 is directed to a medium. Therefore, the claims are eligible under Step 1 as being directed to a process, a machine, and a manufacture, respectively.
Step 2A Prong 1: Independent claims 1, 14 and 15 recite:

executing computation of an autocorrelation matrix on local learning data for obtaining a correlation between elements in local samples comprised in the local learning data - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses the mental process of obtaining data, which is an evaluation or observation that is practically capable of being performed in the human mind with the assistance of pen and paper, or is a mathematical concept that is achievable through mathematical computation;

calculating an integration result indicating a correlation between elements of all of the local samples of all of the local learning data, by integrating the results of the computation of the autocorrelation matrices acquired from the client apparatuses - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation; and

deriving one or more principal components from the calculated integration result - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.

Dependent claim 2 recites:

acquiring average values of respective elements of all of the local samples comprised in all of the local learning data - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation;

normalizing the local samples comprised in the local learning data by subtracting the acquired average values from the values of the elements of the local samples - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation;

calculating the autocorrelation matrices of the local learning data from the normalized local samples - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation;

acquiring the results of computation comprises acquiring the calculated autocorrelation matrices - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation; and

integrating the results of computation comprises obtaining the sum of the autocorrelation matrices acquired from the client apparatuses - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.

Dependent claim 3 recites:

the local samples are weighted according to the designated importances - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation;

the average values of the elements of all of the local samples are weighted average values that are weighted according to the importances - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation; and

calculating the integration result comprises calculating, by the server apparatus, a variance-covariance matrix of all of the local learning data as the integration result, by dividing the sum of the calculated autocorrelation matrices by the sum of weights according to the importances - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.
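To make the recited computation concrete, the claims 1-3 pipeline (clients compute autocorrelation matrices on mean-normalized local samples, the server sums them, divides by the sum of weights to obtain a variance-covariance matrix, and eigendecomposes it into principal components) can be sketched as follows. The NumPy implementation and function names are illustrative assumptions, not taken from the application.

```python
import numpy as np

# Illustrative sketch of the computation recited in claims 1-3
# (hypothetical function names; not the applicant's implementation).
def client_autocorrelation(local_samples, global_mean, weights=None):
    # local_samples: (n_i, d) array; global_mean: (d,) array of the
    # acquired per-element averages over all local learning data.
    centered = local_samples - global_mean          # "normalizing the local samples"
    if weights is None:                             # claim 3: importance weights
        weights = np.ones(len(centered))
    # Weighted sum of outer products: sum_j w_j * x_j x_j^T.
    return (centered * weights[:, None]).T @ centered, weights.sum()

def server_integrate(results):
    # Integrate by summing the acquired autocorrelation matrices, divide
    # by the sum of weights to obtain a variance-covariance matrix, and
    # derive principal components by eigendecomposition.
    total = sum(m for m, _ in results)
    weight_sum = sum(w for _, w in results)
    cov = total / weight_sum                        # "variance-covariance matrix"
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]               # descending explained variance
    return eigvecs[:, order], eigvals[order]
```

With unit weights this reproduces the eigendecomposition of the pooled covariance matrix, without any client transmitting its raw samples.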
Dependent claim 4 recites: wherein acquiring average values of respective elements of all of the local samples comprises calculating the average values of the elements of all of the local samples by secret calculation using the number of the local samples and the average values of the respective elements that are obtained from each client apparatus - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.

Dependent claim 5 recites: wherein integrating the results of the computation comprises integrating the results of the computation by secret calculation.

Dependent claim 6 recites:

calculating, by the server apparatus, the integration result comprises calculating the integration result by integrating the results of computation acquired from the client apparatuses, regarding the two or more designated elements - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation; and

deriving, by the server apparatus, the one or more principal components comprises deriving the one or more principal components from the calculated integration result by performing principal component analysis, regarding the two or more designated elements - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.
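Setting aside the unspecified secret-calculation protocol, the averaging recited in claim 4 (pooling per-client sample counts and per-element averages into global averages) reduces to a count-weighted average. A minimal sketch, with hypothetical names:

```python
def pooled_means(counts_and_means):
    # counts_and_means: list of (n_i, mean_i) pairs, one per client,
    # where mean_i is the client's list of per-element averages.
    # The global per-element average is sum(n_i * mean_i) / sum(n_i);
    # only counts and averages leave each client, not raw samples.
    total_n = sum(n for n, _ in counts_and_means)
    d = len(counts_and_means[0][1])
    return [sum(n * m[j] for n, m in counts_and_means) / total_n
            for j in range(d)]
```

For example, pooling a client with 2 samples averaging [1.0, 4.0] and a client with 3 samples averaging [6.0, 9.0] yields [4.0, 7.0].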
Dependent claim 7 recites:

assigning, by the server apparatus, each client apparatus to at least one of a plurality of groups, based on a matching degree of the designated two or more elements - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses the mental process of obtaining and evaluating data, which is an evaluation or observation that is practically capable of being performed in the human mind with the assistance of pen and paper, or is a mathematical concept that is achievable through mathematical computation;

calculating, by the server apparatus, the integration result comprises calculating the integration result by integrating the results of computation acquired from the client apparatuses in the same group, regarding the two or more designated elements - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation; and

deriving, by the server apparatus, the one or more principal components comprises deriving the one or more principal components from the integration result calculated in the same group, regarding the two or more designated elements, by performing principal component analysis - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.

Dependent claim 8 recites:

assigning, by the server apparatus, each client apparatus to at least one of a plurality of groups - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses the mental process of obtaining and evaluating data, which is an evaluation or observation that is practically capable of being performed in the human mind with the assistance of pen and paper, or is a mathematical concept that is achievable through mathematical computation;

calculating, by the server apparatus, the integration result comprises calculating the integration result by integrating the results of computation acquired from the client apparatuses in the same group - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation; and

deriving, by the server apparatus, the one or more principal components comprises deriving the one or more principal components from the integration result calculated in the same group, by performing principal component analysis - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.

Dependent claim 10 recites:

acquiring attribute data regarding the local learning data from the client apparatuses - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses the mental process of obtaining data, which is an evaluation or observation that is practically capable of being performed in the human mind with the assistance of pen and paper, or is a mathematical concept that is achievable through mathematical computation;

performing clustering on the attribute data acquired from the client apparatuses - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses the mental process of obtaining data, which is an evaluation or observation that is practically capable of being performed in the human mind with the assistance of pen and paper, or is a mathematical concept that is achievable through mathematical computation; and

assigning each client apparatus to at least one of the plurality of groups based on the clustering result - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses the mental process of obtaining data, which is an evaluation or observation that is practically capable of being performed in the human mind with the assistance of pen and paper, or is a mathematical concept that is achievable through mathematical computation.

Dependent claim 16 recites: wherein acquiring average values of respective elements of all of the local samples comprises calculating the average values of the elements of all of the local samples by secret calculation using the number of the local samples and the average values of the respective elements that are obtained from each client apparatus - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.

Dependent claim 17 recites: wherein integrating the results of the computation comprises integrating the results of the computation by secret calculation - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.

Dependent claim 18 recites: wherein integrating the results of the computation comprises integrating the results of the computation by secret calculation - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.

Dependent claim 19 recites: wherein integrating the results of the computation comprises integrating the results of the computation by secret calculation - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.
Dependent claim 20 recites: calculating, by the server apparatus, the integration result comprises calculating the integration result by integrating the results of the computation of the autocorrelation matrices acquired from the client apparatuses, regarding the two or more designated elements - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation; and deriving, by the server apparatus, the one or more principal components comprises deriving the one or more principal components from the calculated integration result by performing principal component analysis only on the calculated integration results, regarding the two or more designated elements - Under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mathematical concept that is achievable through mathematical computation.

Step 2A Prong 2: This judicial exception is not integrated into a practical application because the claims recite the following additional elements:

Independent claims 1, 14 and 15: a plurality of client apparatuses, a server apparatus, an integrated analysis apparatus, a processor, a computer-readable medium storing an integrated analysis program which, when read and executed, causes a computer to perform operations - These limitations amount to components of a general purpose computer that applies a judicial exception by use of conventional computer functions (see MPEP § 2106.05(b)).

comprising image data of images of products, or measurement data obtained by measuring attributes of products - This step is recited at a high level of generality, and amounts to selecting a particular data source or type of data to be manipulated, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).
acquiring results of the computation from the client apparatuses - The "acquiring" step is recited at a high level of generality, and amounts to mere data inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

performing principal component analysis for the entirety of the local learning data based on the integration result - The "performing" step is recited at a high level of generality, and amounts to no more than a recitation of the words "apply it" (or an equivalent), or no more than mere instructions to implement an abstract idea or other exception on a computer (see MPEP § 2106.05(f)).

outputting information regarding the one or more derived principal components - The "outputting" step is recited at a high level of generality, and amounts to mere data outputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 3: receiving, by the client apparatuses, a designation of importances of the local samples - The "receiving" step is recited at a high level of generality, and amounts to mere data inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 6: receiving, by the client apparatuses, a designation of two or more elements from a plurality of elements that constitute the local samples - The "receiving" step is recited at a high level of generality, and amounts to mere data inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).
Dependent claim 9: distributing, by the server apparatus, a list indicating the plurality of groups to each client apparatus, causing the client apparatus to select one or more groups from the plurality of groups shown in the list - The "distributing" and "causing the client apparatus to select" steps are recited at a high level of generality, and amount to mere data outputting and inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)); and assigning the client apparatus to the selected one or more groups - The "assigning" step is recited at a high level of generality, and amounts to mere data assigning, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 11: wherein outputting, by the server apparatus, the information regarding the one or more principal components comprises distributing, by the server apparatus, information regarding the one or more derived principal components to the client apparatuses - The "outputting" and "distributing" steps are recited at a high level of generality, and amount to mere data outputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 13: wherein the local learning data is constituted by sensing data obtained by a sensor that observes states of subjects - This step is recited at a high level of generality, and amounts to mere data description, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 20: receiving, by the client apparatuses, a designation of two or more elements from a plurality of elements that constitute the local samples - The "receiving" step is recited at a high level of generality, and amounts to mere data inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).
Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims are thus directed to the abstract idea.

Step 2B: The claims do not include additional elements that amount to significantly more than the judicial exception. The additional elements:

Independent claims 1, 14 and 15: a plurality of client apparatuses, a server apparatus, an integrated analysis apparatus, a processor, a computer-readable medium storing an integrated analysis program which, when read and executed, causes a computer to perform operations - These limitations amount to components of a general purpose computer that applies a judicial exception by use of conventional computer functions (see MPEP § 2106.05(b)).

comprising image data of images of products, or measurement data obtained by measuring attributes of products - Viewed individually or in combination, this describes selecting a particular data source or type of data to be manipulated, similar to selecting information, based on types of information and availability of information in a power-grid environment, for collection, analysis and display, as described in MPEP § 2106.05(g).

acquiring results of the computation from the client apparatuses - The "acquiring" step is recited at a high level of generality, and amounts to mere data inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

performing principal component analysis for the entirety of the local learning data based on the integration result - The "performing" step is recited at a high level of generality, and amounts to no more than a recitation of the words "apply it" (or an equivalent), or no more than mere instructions to implement an abstract idea or other exception on a computer (see MPEP § 2106.05(f)).
outputting information regarding the one or more derived principal components - The "outputting" step is recited at a high level of generality, and amounts to mere data outputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 3: receiving, by the client apparatuses, a designation of importances of the local samples - The "receiving" step is recited at a high level of generality, and amounts to mere data inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 6: receiving, by the client apparatuses, a designation of two or more elements from a plurality of elements that constitute the local samples - The "receiving" step is recited at a high level of generality, and amounts to mere data inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 9: distributing, by the server apparatus, a list indicating the plurality of groups to each client apparatus, causing the client apparatus to select one or more groups from the plurality of groups shown in the list - The "distributing" and "causing the client apparatus to select" steps are recited at a high level of generality, and amount to mere data outputting and inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)); and assigning the client apparatus to the selected one or more groups - The "assigning" step is recited at a high level of generality, and amounts to mere data assigning, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).
Dependent claim 11: wherein outputting, by the server apparatus, the information regarding the one or more principal components comprises distributing, by the server apparatus, information regarding the one or more derived principal components to the client apparatuses - The "outputting" and "distributing" steps are recited at a high level of generality, and amount to mere data outputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 13: wherein the local learning data is constituted by sensing data obtained by a sensor that observes states of subjects - This step is recited at a high level of generality, and amounts to mere data description, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Dependent claim 20: receiving, by the client apparatuses, a designation of two or more elements from a plurality of elements that constitute the local samples - The "receiving" step is recited at a high level of generality, and amounts to mere data inputting, which is a form of insignificant extra-solution activity (see MPEP § 2106.05(g)).

Accordingly, these additional elements do not amount to significantly more than the judicial exception. As such, the claims are ineligible.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: "A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

Claims 1, 5, 11 and 13-15 are rejected under 35 U.S.C. 103 as being unpatentable over Grammenos et al. (hereinafter Grammenos; this reference is included on the IDS filed 04/26/2023), "Federated PCA with Adaptive Rank Estimation", arXiv.org, Cornell University Library, Ithaca, NY, July 18, 2019, pages 1-19, https://arxiv.org/abs/1907.08059v1, in view of Kudinov et al. (hereinafter Kudinov), US 2022/0058524 A1.

Regarding independent claim 1, Grammenos teaches an integrated analysis method comprising:

executing computation of an autocorrelation matrix, by each of a plurality of client apparatuses, on local learning data, for obtaining a correlation between elements in local samples comprised in the local learning data (Section 1, Introduction, 1st paragraph: "each of a large number of independent clients can contribute to the training of a centralised model by computing local updates with their own data"; "In a nutshell, given a matrix Y ∈ R^(d×n) of n feature vectors of dimension d, PCA aims to build a low-dimensional subspace of R^d that captures the directions of maximum variance in the data contained in Y"; Algorithm 1, Phase I: Estimate local updates; Section 3, 2nd paragraph: "In the inner-most loop, each client i ∈ [M/ℓ^(p−1)] collects or generates Yp,i in a streaming fashion and computes an estimate of SVDr(Yp,i) using SPCA or SAPCA (Algorithm 3)"; Section 3.2, 1st paragraph: "In this section we introduce SPCA and SAPCA which are the streaming algorithms clients use to compute local updates to the centralised model");

acquiring, by a server apparatus, results of the computation from the client apparatuses (Section 1, Introduction, 1st paragraph: "sending them to the client holding the centralised model for aggregation");

calculating, by the server apparatus, an integration result indicating a correlation between elements of all of the local samples of all of the local learning data, by integrating the results of the computation of the autocorrelation matrices acquired from the client apparatuses (Algorithm 1, Phase II: Merging estimations; Section 3, 2nd paragraph: "the estimates for level p are aggregated by calling merge (Algorithm 2) recursively on subspace batches of size k"; Section 3.1, Algorithm 2);

deriving, by the server apparatus, one or more principal components from the calculated integration result by performing principal component analysis for the entirety of the local learning data based on the integration result (Section 3.2: "Consider a sequence {y1, ..., yn} ⊂ R^d of feature vectors and let their concatenation at time τ ≤ n be Y[τ] = [y1 y2 ··· yτ] ∈ R^(d×τ). A block of size b ∈ N is formed by taking b contiguous columns of Y[τ]. Hence, a matrix Y[τ] with r ≤ b ≤ τ induces K = ⌈τ/b⌉ blocks. For convenience, we assume K ∈ N, so that τ = Kb ∈ N. In this case, block k ∈ [K] corresponds to the sub-matrix containing columns Sk = [(k − 1)b + 1, kb]. It is assumed that all blocks Sk are owned and observed exclusively by client i ∈ [M], but that due to resource limitations it cannot store them all. Hence, once client i has observed YSk ∈ R^(d×b) it uses it to update its estimate Ŷ[(k−1)b],r of the r principal components of Y[kb] and then releases YSk from memory.
If Ŷ0,r is the empty matrix, the r principal components of Y[τ] can be estimated by Ŷ[kb],r = SVDr([Ŷ[(k−1)b],r YSk]) ∈ R^(d×kb)"); and

outputting, by the server apparatus, information regarding the one or more derived principal components (Section 3.2, 4th paragraph: "Its output after K iterations is Ŷ[Kb],r = SVDr(Y[τ]), which, as mentioned above, contains both an estimate of the leading r principal components of Y[τ] and the projection of Y[τ] onto this estimate.").

Grammenos does not explicitly disclose a plurality of client apparatuses and a server apparatus, wherein the local learning data comprising image data of images of products, or measurement data obtained by measuring attributes of products. However, in the same field of endeavor, Kudinov teaches a plurality of client apparatuses and a server apparatus (Fig. 2; [0062]: "The present disclosure is implemented in a wireless communication network architecture and includes hardware and/or software means at the server side and hardware and/or software means at the user equipment side. As a non-limiting example, server side means may include units and/or modules which perform the operations of providing initial ML models, initializing machine learning (ML) models at the server, distributing (sending out) ML model(s) among one or more user equipment (UEs) connected to the server by a communication network, transmitting training data of initial sample from the server to the one or more UEs, receiving ML models trained on the one or more UEs from the one or more UEs, updating the personalized ML model at the server by averaging the trained ML models received from the one or more UEs"), wherein the local learning data comprising image data of images of products, or measurement data obtained by measuring attributes of products (Fig. 1; [0076]: "At operation S3, user generated data by means of user input are accumulated in each of the one or more UEs … user generated data are images which the user generates by means of one or more photo- or video cameras, provided in the UE, as well as tags which the user assigns to objects which exist in the images. Besides images from one or more cameras from the UE, object identification may also be performed by the ML model in images acquired by the UE from other sources, e.g., via a communication network from other users or by browsing websites").

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of a system comprising servers and user equipments, in which user generated data are images comprising objects, as suggested in Kudinov, into Grammenos's system, because both of these systems address machine learning models for personalizing user equipment. This modification would have been motivated by the desire to improve the quality of training personalized artificial intelligence models while preventing their "overfitting" and reducing the expenses for data transmission over network connections (Kudinov, [0009]).

Regarding dependent claim 5, the combination of Grammenos and Kudinov teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Grammenos further teaches wherein integrating the results of the computation comprises integrating the results of the computation by secret calculation (Algorithm 1, Phase II: Merging estimations; Section 3, 2nd paragraph: "the estimates for level p are aggregated by calling merge (Algorithm 2) recursively on subspace batches of size k"; Section 3.1, Algorithm 2. Examiner notes that "secret calculation" does not explicitly define what technical features are necessary for the calculation; thus the Examiner broadly interprets it as any calculation).
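As context for the cited merging step, the block-wise update quoted from Grammenos, Ŷ[kb],r = SVDr([Ŷ[(k−1)b],r YSk]), folds each newly observed block into a running rank-r estimate via a truncated SVD of the concatenation. The sketch below is a simplified illustration under stated assumptions: it keeps a single scaled-basis matrix rather than reproducing Grammenos's full merge procedure, and the function names are hypothetical.

```python
import numpy as np

def svd_r(block, r):
    # Rank-r truncated SVD: the leading r left singular vectors scaled
    # by their singular values, i.e. a rank-r summary of the block.
    u, s, _ = np.linalg.svd(block, full_matrices=False)
    return u[:, :r] * s[:r]

def streaming_estimate(blocks, r):
    # For each observed block Y_Sk, concatenate it with the current
    # estimate and re-truncate: estimate <- SVD_r([estimate | Y_Sk]).
    # The raw block can then be released from memory, as in the quote.
    estimate = None
    for block in blocks:
        stacked = block if estimate is None else np.hstack([estimate, block])
        estimate = svd_r(stacked, r)
    return estimate
```

When the data actually has rank at most r, the streamed estimate spans the same column space as the full concatenated matrix, which is the property the claim mapping relies on.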
Regarding dependent claim 11, the combination of Grammenos and Kudinov teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Grammenos further teaches wherein outputting, by the server apparatus, the information regarding the one or more principal components comprises distributing, by the server apparatus, information regarding the one or more derived principal components to the client apparatuses (Section 3.2, 4th paragraph: "the nodes in our algorithm can adjust, independently of each other, their rank estimate based on the distribution of the data seen so far. This is convenient because we expect to have nodes which observe different distributions and will likely require to adjust the number of principal components kept in order to accurately track the data distribution over time").

Regarding dependent claim 13, the combination of Grammenos and Kudinov teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Kudinov further teaches wherein the local learning data is constituted by sensing data obtained by a sensor that observes the states of subjects ([0063]: "The UE may include various input/output means, such as, without limitation, a touchscreen, one or more keys, one or more microphones, one or more photo- and/or video cameras, positioning system signal receivers, such as GPS, GLONASS, GALILEO etc., one or more sensors for determining physical parameters of the user equipment and/or its environment, such as spatial position of the user equipment, temperature, illumination levels etc.").

Regarding independent claim 14, it is an apparatus claim that corresponds to the method of claim 1. Therefore, it is rejected for the same reasons as claim 1 above.
Kudinov further teaches an integrated analysis apparatus comprising a processor to perform operations comprising: operation as an acquisition unit, operation as an integration unit, operation as an analysis unit, operation as an output unit ([0011] at least one processor configured to; [0062] The present disclosure is implemented in a wireless communication network architecture and includes hardware and/or software means at the server side and hardware and/or software means at the user equipment side. As a non-limiting example, server side means may include units and/or modules which perform the operations of providing initial ML models, initializing machine learning (ML) models at the server, distributing (sending out) ML model(s) among one or more user equipment (UEs) connected to the server by a communication network, transmitting training data of initial sample from the server to the one or more UEs, receiving ML models trained on the one or more UEs from the one or more UEs, updating the personalized ML model at the server by averaging the trained ML models received from the one or more UEs). Regarding independent claim 15, it is a medium claim corresponding to the method of claim 1. Therefore, it is rejected for the same reason as claim 1 above. Kudinov further teaches a non-transitory computer-readable medium, storing an integrated analysis program, which, when read and executed, causes a computer to perform operations ([0111] Various embodiments as set forth herein may be implemented as software (e.g., the program 440) including one or more instructions that are stored in a storage medium (e.g., internal memory 436 or external memory 438) that is readable by a machine (e.g., the electronic device 401). 
For example, a processor (e.g., the processor 420) of the machine (e.g., the electronic device 401) may invoke at least one of the one or more instructions stored in the storage medium, and execute it, with or without using one or more other components under the control of the processor) Claims 2, 4, 17 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Grammenos and Kudinov as applied in claim 1, in view of Chowdhary (hereinafter Chowdhary), US 20160001355 A1. Regarding dependent claim 2, the combination of Grammenos and Kudinov teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. The combination of Grammenos and Kudinov does not explicitly disclose executing computation for obtaining correlation comprises: acquiring average values of respective elements of all of the local samples comprised in all of the local learning data; normalizing the local samples comprised in the local learning data by subtracting the acquired average values from the values of the elements of the local samples; and calculating the autocorrelation matrices of the local learning data from the normalized local samples, acquiring the results of computation comprises acquiring the calculated autocorrelation matrices, and integrating the results of computation comprises obtaining the sum of the autocorrelation matrices acquired from the client apparatuses. 
However, in the same field of endeavor, Chowdhary teaches executing computation for obtaining correlation comprises: acquiring average values of respective elements of all of the local samples comprised in all of the local learning data ([0149] Step 1: initializing a variable (x) corresponding to the data set containing all the data points of a cluster and size of (x) can be represented in a (m×n) matrix, where (m) is the number of input properties and (n) represents the number of data points; [0150] Step 2: computing a mean value (or average value) for each input property); normalizing the local samples comprised in the local learning data by subtracting the acquired average values from the values of the elements of the local samples ([0151] Step 3: subtracting the computed mean value from each value of this input property in (x); [0152] Step 4: repeating Step 3 all input properties and forming a (m×n)′ mean matrix of the (m×n) matrix); and calculating the autocorrelation matrices of the local learning data from the normalized local samples ([0153] Step 5: computing the covariance of each element in the (m×n)′ matrix and forming a (y×z) covariance matrix), acquiring the results of computation comprises acquiring the calculated autocorrelation matrices ([0154] Step 6: calculating the Eigen vectors for each element present in the (y×z) covariance matrix to form a (y×z)′ mean-covariance matrix), and integrating the results of computation comprises obtaining the sum of the autocorrelation matrices acquired from the client apparatuses ([0154] Step 6: calculating the Eigen vectors for each element present in the (y×z) covariance matrix to form a (y×z)′ mean-covariance matrix). 
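The stepwise procedure quoted from Chowdhary (initialize the data matrix, compute and subtract per-property means, form the covariance matrix, take its eigenvectors) is textbook principal component analysis. A minimal sketch of that generic procedure, not code from the reference:

```python
import numpy as np

def pca_via_covariance(x):
    """PCA by the quoted steps: mean-center each input property (row),
    form the covariance matrix, then take its eigenvectors.
    x is an (m x n) array: m input properties, n data points."""
    centered = x - x.mean(axis=1, keepdims=True)      # Steps 2-4: subtract means
    cov = (centered @ centered.T) / (x.shape[1] - 1)  # Step 5: covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)            # Step 6: eigenvectors
    order = np.argsort(eigvals)[::-1]                 # sort by explained variance
    return eigvals[order], eigvecs[:, order]

# Two perfectly correlated properties collapse onto one principal component.
x = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0]])
eigvals, eigvecs = pca_via_covariance(x)
assert abs(eigvals[1]) < 1e-9  # second component carries no variance
```

Note that the claim recites autocorrelation matrices rather than covariance matrices; the two coincide once the samples have been centered on their mean, which is why the examiner maps the quoted covariance steps onto the claimed computation.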
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of utilizing principal component analysis methodology which converts the data points in the given cluster of correlated parameters into a set of data points of linearly uncorrelated parameters termed as principal components as suggested in Chowdhary into Grammenos and Kudinov’s system because both of these systems are addressing utilizing principal component analysis methodology for discovering linear structure or reducing dimensionality in data. This modification would have been motivated by the desire to provide a self-learning system that automatically learns and updates itself (Chowdhary, [0062]). Regarding dependent claim 4, the combination of Grammenos, Kudinov and Chowdhary teaches all the limitations as set forth in the rejection of claim 2 or 3 that is incorporated. Chowdhary further teaches wherein acquiring average values of respective elements of all of the local samples comprises calculating the average values of the elements of all of the local samples by secret calculation using the number of the local samples and the average values of the respective elements that are obtained from each client apparatus ([0150] Step 2: computing a mean value (or average value) for each input property. Examiner notes that “secret calculation” does not explicitly define what technical features are necessary for the calculation, thus Examiner broadly interprets it as any calculation). Regarding dependent claim 17, the combination of Grammenos, Kudinov and Chowdhary teaches all the limitations as set forth in the rejection of claim 2 that is incorporated. 
Chowdhary further teaches wherein integrating the results of the computation comprises integrating the results of the computation by secret calculation ([0154] Step 6: calculating the Eigen vectors for each element present in the (y×z) covariance matrix to form a (y×z)′ mean-covariance matrix. Examiner notes that “secret calculation” does not explicitly define what technical features are necessary for the calculation, thus Examiner broadly interprets it as any calculation). Regarding dependent claim 19, the combination of Grammenos, Kudinov and Chowdhary teaches all the limitations as set forth in the rejection of claim 4 that is incorporated. Chowdhary further teaches wherein integrating the results of the computation comprises integrating the results of the computation by secret calculation ([0154] Step 6: calculating the Eigen vectors for each element present in the (y×z) covariance matrix to form a (y×z)′ mean-covariance matrix. Examiner notes that “secret calculation” does not explicitly define what technical features are necessary for the calculation, thus Examiner broadly interprets it as any calculation). Regarding dependent claim 20, the combination of Grammenos, Kudinov and Chowdhary teaches all the limitations as set forth in the rejection of claim 2 that is incorporated. Chowdhary further teaches further comprising: receiving, by the client apparatuses, a designation of two or more elements from a plurality of elements that constitute the local samples ([0159] Referring to FIGS. 
5(a) and 5(b) illustrating website screen shots of an embodiment of present disclosure with an example of foundry data input and data output for a parameter rejection predictor module 50 … The foundry user is enabled to select at least line-name 510 from a drop down menu, a component-name 520 from a drop down menu relating to the desired component to be manufactured, a parameter-selector 530 to enable the user to select desired one or more parameters from parameter-list displayed in the drop down menu, a value-inputting section 540 to enable the user to input desired values corresponding to the parameters selected, a rejection-selector 550 to enable the user to select one or more rejections from the rejection type list displayed in the drop down menu, and a date section 560 to enable the user to mention date for which the user desire to compute the rejections percentage received in his/her foundry), wherein, calculating, by the server apparatus, the integration result comprises calculating the integration result by integrating the results of the computation of the autocorrelation matrices acquired from the client apparatuses, regarding the two or more designated elements ([0159] Once the user presses on the search button 570 the parameter rejection predictor module 500 initiates an internal computations of the rejections received on the date mentioned corresponding to all the user inputs received), and deriving, by the server apparatus, the one or more principal components comprises deriving the one or more principal components from the calculated integration result by performing principal component analysis only on the calculated integration results, regarding the two or more designated elements ([0159] The output generated by the parameter rejection predictor module 500 which includes an input-summary section 570 displaying the brief summary of the user inputs provided at the first place and a rejection-predictor section 590 displaying a rejection percentage and 
a confidence percentage.). Claims 3, 6-8, 10, 16 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Grammenos, Kudinov, in view of Chowdhary as applied in claim 2, further in view of Vauterin et al. (hereinafter Vauterin), US 20050014195 A1. Regarding dependent claim 3, the combination of Grammenos, Kudinov and Chowdhary teaches all the limitations as set forth in the rejection of claim 2 that is incorporated. The combination of Grammenos, Kudinov and Chowdhary does not explicitly disclose receiving, by the client apparatuses, a designation of importances of the local samples, wherein the local samples are weighted according to the designated importances, the average values of the elements of all of the local samples are weighted average values that are weighted according to the importances, and calculating the integration result comprises calculating, by the server apparatus, a variance-covariance matrix of all of the local learning data as the integration result, by dividing the sum of the calculated autocorrelation matrices by the sum of weights according to the importances. 
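The claim-3 limitation summarized above — weighting local samples by designated importances and dividing the sum of the clients' autocorrelation matrices by the sum of the weights — can be sketched as follows. This is a minimal illustration under the assumption that the shared weighted mean is already known to the clients; the function names are hypothetical:

```python
import numpy as np

def local_autocorr(samples, weights, global_mean):
    """Client-side: importance-weighted autocorrelation matrix of the
    local samples, centered on the shared (weighted) mean."""
    centered = samples - global_mean
    return (weights[:, None] * centered).T @ centered, weights.sum()

def integrate(results):
    """Server-side: variance-covariance of all data, obtained by dividing
    the sum of the autocorrelation matrices by the sum of the weights."""
    mats, wsums = zip(*results)
    return sum(mats) / sum(wsums)

rng = np.random.default_rng(0)
data = rng.normal(size=(10, 3))
w = rng.uniform(0.5, 2.0, size=10)
mean = (w[:, None] * data).sum(axis=0) / w.sum()

# Splitting the samples across two "clients" and integrating reproduces
# the weighted variance-covariance matrix of the pooled data.
split = [(data[:6], w[:6]), (data[6:], w[6:])]
integrated = integrate([local_autocorr(s, ww, mean) for s, ww in split])
pooled = ((w[:, None] * (data - mean)).T @ (data - mean)) / w.sum()
assert np.allclose(integrated, pooled)
```

The equivalence shown by the final assertion is what makes the server-side division by the sum of weights yield the same result as a centralized weighted computation.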
However, in the same field of endeavor, Vauterin teaches receiving, by the client apparatuses, a designation of importances of the local samples ([0077] each element is weighted by user-defined parameters), wherein the local samples are weighted according to the designated importances ([0077] each element is weighted by user-defined parameters), the average values of the elements of all of the local samples are weighted average values that are weighted according to the importances ([0077] calculating a composite similarity matrix by averaging the corresponding elements of the respective matrices), and calculating the integration result comprises calculating, by the server apparatus, a variance-covariance matrix of all of the local learning data as the integration result, by dividing the sum of the calculated autocorrelation matrices by the sum of weights according to the importances ([0086] Equation [2]). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of calculating a matrix with elements weighted by a user-defined value to arrive at the average as suggested in Vauterin into Grammenos, Kudinov and Chowdhary’s system because both of these systems are addressing processing of large amounts of experimental data. This modification would have been motivated by the desire to save time/cost in a method which can produce accurate consensus classifications (Vauterin, [0004]). Regarding dependent claim 6, the combination of Grammenos, Kudinov, Chowdhary and Vauterin teaches all the limitations as set forth in the rejection of claim 3 that is incorporated. Chowdhary further teaches further comprising: receiving, by the client apparatuses, a designation of two or more elements from a plurality of elements that constitute the local samples ([0159] Referring to FIGS. 
5(a) and 5(b) illustrating website screen shots of an embodiment of present disclosure with an example of foundry data input and data output for a parameter rejection predictor module 50 … The foundry user is enabled to select at least line-name 510 from a drop down menu, a component-name 520 from a drop down menu relating to the desired component to be manufactured, a parameter-selector 530 to enable the user to select desired one or more parameters from parameter-list displayed in the drop down menu, a value-inputting section 540 to enable the user to input desired values corresponding to the parameters selected, a rejection-selector 550 to enable the user to select one or more rejections from the rejection type list displayed in the drop down menu, and a date section 560 to enable the user to mention date for which the user desire to compute the rejections percentage received in his/her foundry), wherein, calculating, by the server apparatus, the integration result comprises calculating the integration result by integrating the results of computation acquired from the client apparatuses, regarding the two or more designated elements ([0159] Once the user presses on the search button 570 the parameter rejection predictor module 500 initiates an internal computations of the rejections received on the date mentioned corresponding to all the user inputs received), and deriving, by the server apparatus, the one or more principal components comprises deriving the one or more principal components from the calculated integration result by performing principal component analysis, regarding the two or more designated elements ([0159] The output generated by the parameter rejection predictor module 500 which includes an input-summary section 570 displaying the brief summary of the user inputs provided at the first place and a rejection-predictor section 590 displaying a rejection percentage and a confidence percentage.). 
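Restricting the integration and the principal component analysis to the two or more designated elements, as recited in claims 6 and 20, amounts to keeping only the rows and columns of the integrated matrix that correspond to those elements. A minimal sketch (function and variable names are hypothetical):

```python
import numpy as np

def restrict_to_elements(integrated, designated):
    """Keep only the rows/columns of the integrated autocorrelation matrix
    that correspond to the designated elements, before running PCA on it."""
    idx = np.asarray(designated)
    return integrated[np.ix_(idx, idx)]

# Designating elements 1 and 3 of a 4-element integration result.
full = np.arange(16, dtype=float).reshape(4, 4)
sub = restrict_to_elements(full, [1, 3])
assert sub.shape == (2, 2)
assert sub[0, 0] == full[1, 1] and sub[1, 1] == full[3, 3]
```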
Regarding dependent claim 7, the combination of Grammenos, Kudinov, Chowdhary and Vauterin teaches all the limitations as set forth in the rejection of claim 6 that is incorporated. Kudinov further teaches assigning, by the server apparatus, each client apparatus to at least one of a plurality of groups, based on a matching degree of the designated two or more elements ([0038] According to user generated data and other information which can be accessed (such as, e.g., brand and model of the user equipment) machine learning model type is identified, which is suitable for this user and user equipment. Personalization groups are formed based, as an example but not limited to, on the identified machine learning model type and/or type, brand or model of the user equipment, and/or user interests determined on the basis of user generated data during said waiting period for the purpose of machine learning model adaptation; [0039] According to the identified machine learning model type, the server sends a current version of the machine learning model to the user equipment. In this case, in an example embodiment, certain versions of machine learning models are only sent to users within corresponding personalization groups), wherein, calculating, by the server apparatus, the integration result comprises calculating the integration result by integrating the results of computation acquired from the client apparatuses in the same group, regarding the two or more designated elements ([0041] Each user equipment in which ML model training is completed sends its trained ML model to a server (such as a central server and/or a model aggregation server). Personalized models trained in different user equipment (e.g., within one individualization group) are aggregated at said server. Aggregation is implemented, e.g., by creating an averaged model. As a result of the aggregation, a new version of a model of a certain type is obtained. 
This new version of the model is sent to user equipments within a respective individualization group), and deriving, by the server apparatus, the one or more principal components comprises deriving the one or more principal components from the integration result calculated in the same group, regarding the two or more designated elements, by performing principal component analysis ([0046] ML model training is performed in a user equipment until a training stop condition is met in the user equipment, such as the achievement of ML models convergence ML among the user equipment, in an example embodiment within a certain individualization group. After that the trained ML models are transmitted to the server where they are aggregated (as a non-limiting example, by averaging the ML models); [0047] Alternatively or additionally, an ML model training stop criterion may include the achievement of a predetermined ML model quality characteristics value by the ML model, which may be expressed in terms of prediction accuracy or depending on the task: so, accuracy of word prediction may be evaluated in the task of predicting the next word; letter-wise or word-wise accuracy of text recognition may be evaluated in the task of recognizing handwritten text, etc.). Regarding dependent claim 8, the combination of Grammenos, Kudinov, Chowdhary and Vauterin teaches all the limitations as set forth in the rejection of claim 3 that is incorporated. Kudinov further teaches assigning, by the server apparatus, each client apparatus to at least one of a plurality of groups ([0038] According to user generated data and other information which can be accessed (such as, e.g., brand and model of the user equipment) machine learning model type is identified, which is suitable for this user and user equipment. 
Personalization groups are formed based, as an example but not limited to, on the identified machine learning model type and/or type, brand or model of the user equipment, and/or user interests determined on the basis of user generated data during said waiting period for the purpose of machine learning model adaptation; [0039] According to the identified machine learning model type, the server sends a current version of the machine learning model to the user equipment. In this case, in an example embodiment, certain versions of machine learning models are only sent to users within corresponding personalization groups), wherein, calculating, by the server apparatus, the integration result comprises calculating the integration result by integrating the results of computation acquired from the client apparatuses in the same group ([0041] Each user equipment in which ML model training is completed sends its trained ML model to a server (such as a central server and/or a model aggregation server). Personalized models trained in different user equipment (e.g., within one individualization group) are aggregated at said server. Aggregation is implemented, e.g., by creating an averaged model. As a result of the aggregation, a new version of a model of a certain type is obtained. This new version of the model is sent to user equipments within a respective individualization group), and deriving, by the server apparatus, the one or more principal components comprises deriving the one or more principal components from the integration result calculated in the same group, by performing principal component analysis ([0046] ML model training is performed in a user equipment until a training stop condition is met in the user equipment, such as the achievement of ML models convergence ML among the user equipment, in an example embodiment within a certain individualization group. 
After that the trained ML models are transmitted to the server where they are aggregated (as a non-limiting example, by averaging the ML models); [0047] Alternatively or additionally, an ML model training stop criterion may include the achievement of a predetermined ML model quality characteristics value by the ML model, which may be expressed in terms of prediction accuracy or depending on the task: so, accuracy of word prediction may be evaluated in the task of predicting the next word; letter-wise or word-wise accuracy of text recognition may be evaluated in the task of recognizing handwritten text, etc.). Regarding dependent claim 10, the combination of Grammenos, Kudinov, Chowdhary and Vauterin teaches all the limitations as set forth in the rejection of claim 8 that is incorporated. Kudinov further teaches wherein, assigning, by the server apparatus each client apparatus to at least one of a plurality of groups comprises acquiring attribute data regarding the local learning data from the client apparatuses ([0056] users are grouped in a variety of individualization groups, in particular according to the following criteria: topics of user generated text messages, user geographical location, user age, and the type of hardware on which the one or more software applications are run, and in which the one or more artificial intelligence features are used. Users may be grouped into individualization groups based, e.g., on technical parameters of user equipment: screen size, RAM size, type of processor etc.; geographical location of user equipment; user generated data content, e.g., at web pages (likes, comments, replies, posts, publications, etc.); demographic metadata (user sex, age, marital status, nationality, etc.)), performing clustering on the attribute data acquired from the client apparatuses ([0058] The number of individualization groups may be defined manually or by any suitable clustering methodology. 
Each individualization group corresponds to one ML model or one ML model type), and assigning each client apparatus to at least one of the plurality of groups based on the clustering result ([0039] According to the identified machine learning model type, the server sends a current version of the machine learning model to the user equipment. In this case, in an example embodiment, certain versions of machine learning models are only sent to users within corresponding personalization groups). Regarding dependent claim 16, the combination of Grammenos, Kudinov, Chowdhary and Vauterin teaches all the limitations as set forth in the rejection of claim 3 that is incorporated. Chowdhary further teaches wherein acquiring average values of respective elements of all of the local samples comprises calculating the average values of the elements of all of the local samples by secret calculation using the number of the local samples and the average values of the respective elements that are obtained from each client apparatus ([0150] Step 2: computing a mean value (or average value) for each input property. Examiner notes that “secret calculation” does not explicitly define what technical features are necessary for the calculation, thus Examiner broadly interprets it as any calculation). Regarding dependent claim 18, the combination of Grammenos, Kudinov, Chowdhary and Vauterin teaches all the limitations as set forth in the rejection of claim 3 that is incorporated. Chowdhary further teaches wherein integrating the results of the computation comprises integrating the results of the computation by secret calculation ([0154] Step 6: calculating the Eigen vectors for each element present in the (y×z) covariance matrix to form a (y×z)′ mean-covariance matrix. Examiner notes that “secret calculation” does not explicitly define what technical features are necessary for the calculation, thus Examiner broadly interprets it as any calculation). Claim 9 is rejected under 35 U.S.C. 
103 as being unpatentable over Grammenos, Kudinov, in view of Chowdhary, in view of Vauterin, as applied in claim 8, further in view of KIM et al. (hereinafter KIM), US 20120166394 A1. Regarding dependent claim 9, the combination of Grammenos, Kudinov, Chowdhary and Vauterin teaches all the limitations as set forth in the rejection of claim 8 that is incorporated. The combination of Grammenos, Kudinov, Chowdhary and Vauterin does not explicitly disclose wherein, assigning, by the server apparatus each client apparatus to at least one of a plurality of groups comprises distributing, by the server apparatus, a list indicating the plurality of groups to each client apparatus, causing the client apparatus to select one or more groups from the plurality of groups shown in the list, and assigning the client apparatus to the selected one or more groups. However, in the same field of endeavor, KIM teaches wherein, assigning, by the server apparatus each client apparatus to at least one of a plurality of groups comprises distributing, by the server apparatus, a list indicating the plurality of groups to each client apparatus ([0081] In accordance with embodiments, a distributed storage system may select target data nodes and store a target object and replicas thereof in the selected target data nodes based on locations of the target object and the target data nodes. The distributed storage system may include a plurality of data nodes, at least one selection agent, a client, and a proxy server. The plurality of data nodes may be configured to be grouped into a plurality of zone groups based on locations of the plurality of data nodes and configured to store a target object and replicas of the target object), causing the client apparatus to select one or more groups from the plurality of groups shown in the list ([0082] The at least one selection agent may be the client. 
In this case, the client may be configured to select the multiple target zone groups from the plurality of zone groups based on locations of the plurality of zone groups and the client in response to an inquiry from the proxy server, assign priorities to the selected multiple target zone groups and data nodes belonging to the selected multiple target zone groups, select the target data nodes by selecting one data node for each one of the selected multiple target zone groups based on the assigned priorities, store the target object in one of the target data nodes, store the replicas of the target object by performing the replication process using the replication agent, and transmit the data node list to the proxy server), and assigning the client apparatus to the selected one or more groups ([0083] The at least one selection agent may be the client and the proxy server. In this case, the client may be configured to select the multiple target zone groups from the plurality of zone groups based on locations of the plurality of zone groups in response to an inquiry from the proxy server, assign priorities to the selected multiple zone groups, receive a data node list from the proxy server, store the target object in one of the target data nodes included in the received list, store the replicas of the target object in other target data nodes included in the received data node list by performing the replication process using the replication agent, and transmit the data node list including information on the target data nodes and corresponding target zone groups that store the target object and the replicas of the target object to the proxy server. 
The proxy server may be configured to select the target data nodes by selecting one data node for each one of the selected multiple target zone groups which are selected by the client, transmit the data node list including information on the selected data nodes to the client, and update the metadata database based on the data node list. The proxy server may select one data node for each one of the selected multiple zone groups, as the target data nodes, based on an available storage capacity and an object storage history of data nodes belonging to the target zone groups). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of grouping data nodes into zone groups and providing clients an interface to select as suggested in KIM into Grammenos, Kudinov, Chowdhary and Vauterin’s system because both of these systems are addressing client and server distributed systems. This modification would have been motivated by the desire to improve time and overall performance (KIM, [0004]-[0006]). Response to Arguments Applicant's arguments filed 11/28/2025 have been fully considered. Each of applicant’s remarks is set forth, followed by examiner’s response. (1) The 35 U.S.C. 112(a) rejections to Claims 1, 14 and 15 are respectfully withdrawn in response to Applicant's amendment to these claims. (2) Regarding the 35 U.S.C. 101 rejection to claims 1-20 directed to an abstract idea without significantly more, Applicant alleges that independent claims 1, 14, and 15 are directed to an integrated analysis method (claim 1), an integrated analysis apparatus (claim 14), and a non-transitory computer-readable medium storing an integrated analysis program (claim 15). 
Each claim recites a specific combination of elements including: execution of computations of an autocorrelation matrix on local learning data, comprising product image data or product attribute measurement data, by multiple client apparatuses; acquisition of computation results by a server apparatus; calculation of an integration result by integrating the received computation results; and derivation of one or more principal components from the integration result by the server apparatus. The above-noted steps cannot practically be performed in the human mind because they involve: (i) operations on large-scale, distributed datasets, (ii) computation of autocorrelation matrices on product image data or product attribute measurement data, and (iii) eigenvalue decomposition or principal component analysis, which are computationally intensive processes requiring machine execution. See MPEP § 2106.04(a). The Federal Circuit has recognized that claims are not "mental processes" where "they could not, as a practical matter, be performed entirely in a human's mind." CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1372 (Fed. Cir. 2011); see also McRO, 837 F.3d at 1318. As to point (2), Examiner respectfully disagrees. 
The claim recites executing computation of an autocorrelation matrix, by each of a plurality of client apparatuses, on local learning data, comprising image data of images of products, or measurement data obtained by measuring attributes of products, for obtaining a correlation between elements in local samples comprised in the local learning data; calculating, by the server apparatus, an integration result indicating a correlation between elements of all of the local samples of all of the local learning data, by integrating the results of the computation of the autocorrelation matrices acquired from the client apparatuses; deriving, by the server apparatus, one or more principal components from the calculated integration result by performing principal component analysis for the entirety of the local learning data based on the integration result. As explained in the specification, the computation for obtaining correlation may include: a step of acquiring average values of respective elements of all of the local samples included in all of the local learning data; and a step of normalizing (centralizing) the local samples included in the local learning data by subtracting the acquired average values from the values of the elements of the local samples (see [0011]). In the step of calculating, the server apparatus may calculate a variance-covariance matrix of all of the local learning data as the integration result, by dividing the sum of the autocorrelation matrices by the sum of weights according to the importances ([0012]). The broadest reasonable interpretation of these limitations requires a mathematical calculation. 
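The claimed flow paraphrased above — client-side autocorrelation matrices on centered local samples, server-side summation into an integration result, then principal component analysis on that result — can be sketched as follows. Unweighted centering is assumed and all names are illustrative; this is a sketch of the recited steps, not the applicant's implementation:

```python
import numpy as np

def client_autocorr(local_samples, global_mean):
    """Client-side: autocorrelation matrix of the centered local samples."""
    centered = local_samples - global_mean
    return centered.T @ centered, len(local_samples)

def server_pca(client_results, n_components):
    """Server-side: integrate the client matrices by summation, then derive
    principal components from the integrated result."""
    mats, counts = zip(*client_results)
    cov = sum(mats) / (sum(counts) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, order]

rng = np.random.default_rng(1)
data = rng.normal(size=(100, 4))
mean = data.mean(axis=0)
clients = [data[:40], data[40:70], data[70:]]
components = server_pca([client_autocorr(c, mean) for c in clients], 2)

# The distributed result matches PCA on the pooled data (up to sign).
_, pooled = np.linalg.eigh(np.cov(data.T))
pooled = pooled[:, ::-1][:, :2]
assert np.allclose(np.abs(components.T @ pooled), np.eye(2), atol=1e-6)
```

The final assertion illustrates the point at issue under § 101: the distributed summation reproduces exactly the centralized eigendecomposition, which the Examiner characterizes as a mathematical calculation and the Applicant characterizes as a machine-bound computational architecture.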
In addition, under its broadest reasonable interpretation in light of the specification, this limitation encompasses the mental process of evaluating data and selecting data based on judgment to obtain a correlation between elements in local samples, which is observing, evaluating, and judging that is practically capable of being performed in the human mind with the assistance of pen and paper. Local learning data comprising image data of images of products, or measurement data obtained by measuring attributes of products, recited at a high level of generality, amounts to selecting a particular data source or type of data to be manipulated, which is a form of insignificant extra-solution activity.

(3) Applicant further argues that the amended claims are explicitly limited to analyzing a specific type of data: product image data or product attribute measurement data. This ties the claimed method to the concrete technical field of industrial inspection, quality control, or manufacturing process analysis. This is not merely applying math but using a specific computational architecture to solve problems within a tangible technological domain, thus integrating any underlying mathematical concepts into a practical application under Alice Step 2A, Prong 2.

As to point (3), Examiner respectfully disagrees. Local learning data comprising a specific type of data (product image data or product attribute measurement data) recited at a high level of generality amounts to selecting a particular data source or type of data to be manipulated, which is a form of insignificant extra-solution activity. This limitation, viewed individually or in combination, describes selecting a particular data source or type of data to be manipulated, similar to selecting information, based on types of information and availability of information in a power-grid environment, for collection, analysis, and display, as described in MPEP § 2106.05(g).

(4) Regarding 35 U.S.C.
103 rejections, in the remarks, Applicant argues that because, as admitted in the Office Action, Grammenos fails to disclose a plurality of client apparatuses and a server apparatus, Grammenos necessarily fails to disclose deriving, by a server apparatus, principal components from calculated integration results by performing PCA. Further, even if it can be established that client/server apparatuses are disclosed, which Applicant does not concede, Kudinov discloses only full model updates and averaging and fails to disclose any computations or PCA operations in the manner claimed. Therefore, regardless of whether Grammenos, Kudinov, Chowdhary, Vauterin, and Kim disclose that for which they are cited, the cited references fail to teach or suggest at least "executing computation of an autocorrelation matrix, by each of a plurality of client apparatuses, on local learning data, comprising image data of images of products or measurement data obtained by measuring attributes of products, for obtaining a correlation between elements in local samples comprised in the local learning data," "acquiring, by a server apparatus, results of the computation from the client apparatuses," "calculating, by the server apparatus, an integration result indicating a correlation between elements of all of the local samples of all of the local learning data, by integrating the results of the computation of the autocorrelation matrices acquired from the client apparatuses," "deriving, by the server apparatus, one or more principal components from the calculated integration result by performing principal component analysis," and "outputting, by the server apparatus, information regarding the one or more derived principal components," as recited or analogously recited in amended independent claims 14 and 15.
As to point (4), in response to Applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).

(5) Applicant alleges that Grammenos teaches that the clients compute the local PCA and send the results (principal components or compressed subspaces) for merging. Therefore, because Grammenos discloses that the clients compute the local PCA directly and send the results (principal components or compressed subspaces), Grammenos fails to disclose providing correlation matrices to a server. Further, because Grammenos discloses that aggregation is done by merging subspaces using SVD-related methods, Grammenos fails to disclose computing a central correlation matrix before PCA.

As to point (5), Examiner respectfully disagrees. Grammenos teaches that each of a large number of independent clients can contribute to the training of a centralised model by computing local updates with their own data. Each client aims to build a matrix Y of n feature vectors of dimension d. Clients use streaming algorithms to provide correlation matrices to the centralised model (i.e., a server) (Section 1, Introduction, 1st paragraph).

(6) Applicant alleges that Kudinov discloses that clients train ML models using public and private data, but fails to disclose autocorrelation matrices or PCA. As to point (6), Applicant is reminded that claims 1, 14, and 15 are rejected using the combination of Grammenos and Kudinov. Examiner notes that Grammenos teaches autocorrelation matrices and PCA. Therefore, claims 1-11 and 13-20 remain rejected as set forth above.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Applicant is required under 37 C.F.R.
§ 1.111(c) to consider these references fully when responding to this action. Chauhan et al. (US 20030004909 A1) discloses a method and system for managing information, and, in particular, methods and systems for providing an enhanced knowledge management system that integrates existing data with knowledge acquired through a knowledge management portal. It is noted that any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 U.S.P.Q. 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 U.S.P.Q. 275, 277 (C.C.P.A. 1968)).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMY P HOANG, whose telephone number is (469) 295-9134. The examiner can normally be reached M-Th 8:30 AM-5:00 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, JENNIFER WELCH, can be reached at 571-272-7212. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /AMY P HOANG/Examiner, Art Unit 2143 /JENNIFER N WELCH/Supervisory Patent Examiner, Art Unit 2143

Prosecution Timeline

Apr 27, 2022
Application Filed
Apr 26, 2025
Non-Final Rejection — §101, §103
Jul 16, 2025
Response Filed
Oct 06, 2025
Final Rejection — §101, §103
Nov 28, 2025
Request for Continued Examination
Dec 01, 2025
Response after Non-Final Action
Jan 07, 2026
Non-Final Rejection — §101, §103
Apr 07, 2026
Response Filed

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602596
APPARATUS AND METHOD FOR VALIDATING DATASET BASED ON FEATURE COVERAGE
2y 5m to grant Granted Apr 14, 2026
Patent 12572263
ACCESS CARD WITH CONFIGURABLE RULES
2y 5m to grant Granted Mar 10, 2026
Patent 12536432
PRE-TRAINING METHOD OF NEURAL NETWORK MODEL, ELECTRONIC DEVICE AND MEDIUM
2y 5m to grant Granted Jan 27, 2026
Patent 12475669
METHOD AND APPARATUS WITH NEURAL NETWORK OPERATION FOR DATA NORMALIZATION
2y 5m to grant Granted Nov 18, 2025
Patent 12461595
SYSTEM AND METHOD FOR EMBEDDED COGNITIVE STATE METRIC SYSTEM
2y 5m to grant Granted Nov 04, 2025
Based on the 5 most recent grants by this examiner.


Prosecution Projections

3-4
Expected OA Rounds
70%
Grant Probability
99%
With Interview (+64.2%)
3y 3m
Median Time to Grant
High
PTA Risk
Based on 232 resolved cases by this examiner. Grant probability derived from career allow rate.
