Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Status
The status of claims 1-13 is as follows:
Claims 1-13 were pending as of the Non-Final Rejection mailed 08/20/2025.
Claims 1 and 8 are amended as of the remarks and amendments received 11/13/2025.
Claims 3-4, 6-7, 10-11, and 13 remain as originally presented as of the remarks and amendments received 11/13/2025.
Claims 2, 5, 9, and 12 are cancelled as of the remarks and amendments received 11/13/2025.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) are:
“3D pose estimation apparatus” in claims 8, 10-11, and 13
“affinity generation unit” in claim 8
“affinity fusion unit” in claims 8, 10-11, and 13
“pose estimation unit” in claims 8 and 13.
Because these claim limitation(s) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The rejections previously made under 35 U.S.C. 112(b) are moot because the claims that were previously rejected have been cancelled.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 7 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent eligible subject matter because “A computer-readable recording medium” includes signals per se under the broadest reasonable interpretation (BRI). Examiner suggests amending the claim to recite “A non-transitory computer-readable recording medium” to overcome this issue.
35 U.S.C. 101 requires that a claimed invention must fall within one of the four eligible categories of invention (i.e., process, machine, manufacture, or composition of matter) and must not be directed to subject matter encompassing a judicially recognized exception as interpreted by the courts. MPEP 2106. Three categories of subject matter are judicially recognized exceptions to 35 U.S.C. § 101 (i.e., patent ineligible): (1) laws of nature, (2) physical phenomena, and (3) abstract ideas. MPEP 2106(II). To be patent-eligible, a claim directed to a judicial exception must, as a whole, be integrated into a practical application or directed to significantly more than the exception itself (MPEP 2106). Hence, the claim must describe a process or product that applies the exception in a meaningful way, such that it is more than a drafting effort designed to monopolize the exception.
Claims 1, 3-4, 6-8, 10-11, and 13 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. In the analysis below, the method of independent claim 1 is considered representative of independent claim 8 and claim 7. Claim 7 is not directed to one of the four statutory categories of eligible subject matter as addressed above, but would still be rejected under 35 USC 101 even if it recited eligible subject matter. Each of independent claims 1 and 8 is directed to one of the four statutory categories of eligible subject matter; thus, the claims pass Step 1 of the Subject Matter Eligibility Test (See flowchart in MPEP 2106).
Step 2A, Prong 1 Analysis
Independent claims 1 and 8 are directed to estimating a 3D pose of an object based on a graph convolution network (GCN) comprising: inputting a feature vector for a joint of the object; generating an affinity matrix according to the feature vector; generating a dynamic graph matrix by fusing the affinity matrix with a predefined static graph matrix of the graph convolution network; and estimating a 3D pose of the object for the feature vector by replacing the static graph matrix of the graph convolution network with the dynamic graph matrix, wherein generating the affinity matrix comprises, predicting a weight to be applied to a plurality of predefined expert matrices by applying a routing function to the feature vector; and generating the affinity matrix as a weighted sum of the predicted weight and the plurality of expert matrices. All of the steps of the independent claim are either generic data gathering, which is a mental process, or are mathematical concepts. Accordingly, the analysis under prong one of Step 2A of the Subject Matter Eligibility Test does not result in a conclusion of eligibility (See flowchart in MPEP 2106).
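For illustration of the mathematical character of the recited steps, the claimed sequence can be sketched numerically. This sketch is not part of the record: the array shapes, the softmax normalization, the linear routing function, and the multiplicative fusion are hypothetical assumptions for illustration only, not recitations from the claims or the specification.

```python
import numpy as np

rng = np.random.default_rng(0)
N, C, K = 17, 32, 4                        # joints, feature channels, expert matrices (hypothetical)

x = rng.standard_normal((N, C))            # feature vectors for the joints of the object
experts = rng.standard_normal((K, N, N))   # plurality of predefined expert matrices
A_static = rng.standard_normal((N, N))     # predefined static graph matrix of the GCN
W_route = rng.standard_normal((N * C, K))  # hypothetical routing-function parameters

# Predict a weight per expert matrix by applying a routing function to the feature vector
logits = x.reshape(-1) @ W_route
logits -= logits.max()                     # numerical stability
w = np.exp(logits) / np.exp(logits).sum()  # softmax normalization (an assumed choice)

# Affinity matrix as a weighted sum of the predicted weights and the expert matrices
A_affinity = np.tensordot(w, experts, axes=1)

# Dynamic graph matrix by fusing the affinity matrix with the static graph matrix
A_dynamic = A_affinity * A_static          # multiplicative fusion (one claimed variant)

# Graph convolution with the static graph matrix replaced by the dynamic graph matrix
Phi = rng.standard_normal((C, C))
y = np.tanh(A_dynamic @ x @ Phi)           # estimated pose features
print(y.shape)                             # (17, 32)
```

Each recited step reduces to ordinary matrix arithmetic, consistent with the identification of the limitations as mathematical concepts above.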
Additional elements
Independent claim 8 claims a 3D pose estimation apparatus, a graph convolution network (GCN), an affinity generation unit, an affinity fusion unit, and a pose estimation unit. Independent claim 1 does not have any additional elements.
Step 2A, Prong 2 Analysis
The above-identified elements do not integrate the judicial exception into a practical application, nor do they suggest an improvement.
The additional elements of a 3D pose estimation apparatus, a graph convolution network (GCN), an affinity generation unit, an affinity fusion unit, and a pose estimation unit amount to merely using generic computer hardware or components as a tool to perform the claimed mental process.
Using a general purpose computer to apply a judicial exception does not qualify as a particular machine and therefore, does not integrate a judicial exception into a practical application (See MPEP 2106.05(b)). Furthermore, implementing an abstract idea on a computer does not integrate a judicial exception into a practical application (See MPEP 2106.05(f)).
Moreover, the additional elements of the claims do not recite an improvement in the functioning of a computer or another technology or technical field, the claimed steps do not effect a transformation, and the claims do not apply the judicial exception in any meaningful way beyond generically linking the use of the judicial exception to a particular technological environment (See MPEP 2106.04(d)).
Further, the act of acquiring data is mere data gathering which amounts to insignificant extra-solution activity (See MPEP 2106.05(g)). Therefore, the analysis under prong two of step 2A of the Subject Matter Eligibility Test does not result in a conclusion of eligibility (See flowchart in MPEP 2106).
Step 2B
Finally, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception.
Regarding independent claims 1 and 8, as noted above, the additional elements are generic computer features which perform generic computer functions that are well-understood, routine, and conventional and do not amount to more than implementing the abstract idea with a computerized system. Thus, taken alone, the additional elements do not amount to significantly more than the above-identified judicial exception (the abstract idea).
Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation, and mere implementation on a generic computer does not add significantly more to the claims. Accordingly, the analysis under step 2B of the Subject Matter Eligibility Test does not result in a conclusion of eligibility (See flowchart in MPEP 2106).
For all the foregoing reasons, independent claims 1 and 8 do not recite eligible subject matter under 35 USC 101.
Claims 3 and 10 claim wherein generating the dynamic graph matrix comprises, generating the dynamic graph matrix by performing a multiplication modulation using an element-by-element multiplication operation of the affinity matrix and the predefined static graph matrix. The features of claims 3 and 10 are directed to the mental process and mathematical concepts since they do not preclude the mental analysis nor add anything other than more mathematical concepts to independent claims 1 and 8. Accordingly, claims 3 and 10 do not integrate the judicial exception into a practical application or amount to significantly more than the judicial exception.
Claims 4 and 11 claim wherein generating the dynamic graph matrix comprises, generating the dynamic graph matrix by performing an additional modulation using a summation operation of the affinity matrix and the predefined static graph matrix. The features of claims 4 and 11 are directed to the mental process and mathematical concepts since they do not preclude the mental analysis nor add anything other than more mathematical concepts to independent claims 1 and 8. Accordingly, claims 4 and 11 do not integrate the judicial exception into a practical application or amount to significantly more than the judicial exception.
Claims 6 and 13 claim wherein generating the dynamic graph matrix comprises, calculating a transposition affinity matrix by performing a transposition operation on the dynamic graph matrix; and calculating a regular symmetric affinity matrix using an average operation of the dynamic graph matrix and the transposition affinity matrix, wherein estimating the 3D pose of the object comprises, estimating the 3D pose of the object for the feature vector by replacing the static graph matrix with the regular symmetric affinity matrix. The features of claims 6 and 13 are directed to the mental process and mathematical concepts since they do not preclude the mental analysis nor add anything other than more mathematical concepts to independent claims 1 and 8. Accordingly, claims 6 and 13 do not integrate the judicial exception into a practical application or amount to significantly more than the judicial exception.
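The dependent-claim variants discussed above reduce to elementary matrix operations. The following numerical sketch is for illustration only; the matrix dimension and values are hypothetical, not drawn from the claims or specification.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 17                                   # number of joints (hypothetical)
A = rng.standard_normal((N, N))          # affinity matrix
S = rng.standard_normal((N, N))          # predefined static graph matrix

# Claims 3 and 10: multiplication modulation (element-by-element multiplication)
D_mul = A * S

# Claims 4 and 11: additional modulation (summation operation)
D_add = A + S

# Claims 6 and 13: transpose the dynamic graph matrix and average with the
# original to obtain a regular symmetric affinity matrix
D_sym = (D_mul + D_mul.T) / 2.0
print(np.allclose(D_sym, D_sym.T))       # True
```

Each variant is a single arithmetic operation over matrices, consistent with the characterization of these limitations as mathematical concepts.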
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1 and 3 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang et al. (Zhang, J., Wang, Y., Zhou, Z., Luan, T., Wang, Z., & Qiao, Y. (2021). Learning dynamical human-joint affinity for 3d pose estimation in videos. IEEE Transactions on Image Processing, 30, 7914-7925., hereinafter “Zhang”) in view of Hu et al. (Hu, W., Zhang, C., Zhan, F., Zhang, L., & Wong, T. T. (2021, October). Conditional directed graph convolution for 3d human pose estimation. In Proceedings of the 29th ACM international conference on multimedia (pp. 602-611)., hereinafter “Hu”).
Regarding claim 1, Zhang discloses a pose estimation method in a 3D pose estimation apparatus for estimating a 3D pose of an object based on a graph convolution network (GCN) (Zhang Page 3: “our graph convolution with KNN can adaptively exploit most relevant joints to estimate a certain joint in 3D space, which effectively reduces connection redundancy to boost performance”) comprising:
inputting a feature vector for a joint of the object (Zhang Page 3: “For frame t, the node set VS corresponds to a feature matrix of human joints in this frame, i.e., Xt = [x1t, x2t, ...., xNt] ∈ R N×Cx , where xit ∈ RCx is the feature vector of the i-th human joint and N is the number of joints”);
generating an affinity matrix according to the feature vector (Zhang Page 3: “Furthermore, the edge set ES is represented by an affinity matrix A ∈ R N×N”; Zhang Page 3-4: “To tackle the problem caused by the fixed affinity, we propose to adaptively construct edges between joints for each frame. Specifically, for a human joint i at frame t, we find a set of its K nearest joints Ωit, according to the feature matrix of human joints Xt in this frame, Ωit = KNN(xit, Xt, K), where KNN refers to the K-Nearest-Neighbor algorithm, and we compute Euclidean distance between xit and Xt to identify Ωit. When joint j belongs to Ωit, there is an edge between joint i and j. Formally, we can obtain a dynamical affinity matrix Dt”);
generating a dynamic graph matrix by fusing the affinity matrix with a predefined static graph matrix of the graph convolution network (Zhang Page 4: “To take it into account, we introduce a concise weighting mechanism to represent the importance score of joint j ∈ Ωit, Rt(i, j) = γ([xit, xjt]), where γ(·) is a nonlinear mapping with the concatenated input of [xit, xjt]. In our experiment, a fully-connected layer works well for γ(·). Finally, we obtain a weighted affinity via element-wise multiplication”, the predefined static graph is the weighting mechanism); and
estimating a 3D pose of the object for the feature vector by replacing the static graph matrix of the graph convolution network with the dynamic graph matrix (Zhang Page 4: “This leads to our dynamical spatial graph convolution, Yt = σ(BtXtΦ), where Φ ∈ RCx×Cy is a parameter matrix”).
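Zhang’s cited dynamical spatial graph convolution, Yt = σ(BtXtΦ), can be sketched as follows for illustration. The dimensions and the choice of ReLU for σ are assumptions made here, not details taken from Zhang.

```python
import numpy as np

rng = np.random.default_rng(2)
N, Cx, Cy = 17, 32, 16                   # joints, input/output channels (hypothetical)

B_t = rng.standard_normal((N, N))        # dynamical graph matrix for frame t
X_t = rng.standard_normal((N, Cx))       # joint feature matrix for frame t
Phi = rng.standard_normal((Cx, Cy))      # parameter matrix

# Yt = sigma(Bt Xt Phi); sigma assumed to be ReLU for this sketch
Y_t = np.maximum(0.0, B_t @ X_t @ Phi)
print(Y_t.shape)                         # (17, 16)
```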
Zhang does not explicitly disclose the method, wherein generating the affinity matrix comprises,
predicting a weight to be applied to a plurality of predefined expert matrices by applying a routing function to the feature vector; and
generating the affinity matrix as a weighted sum of the predicted weight and the plurality of expert matrices.
However, Hu teaches the method, wherein generating the affinity matrix comprises,
predicting a weight to be applied to a plurality of predefined expert matrices by applying a routing function to the feature vector (Hu Page 605: “We use a routing function to predict the blending weights for the connection matrix bases from the previous layer’s output”); and
generating the affinity matrix as a weighted sum of the predicted weight and the plurality of expert matrices (Hu Page 605: “We use a routing function to predict the blending weights for the connection matrix bases from the previous layer’s output”).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the weighted summation of Hu with the method of Zhang because it would improve the method by allowing the more important values of the matrix to be weighted more heavily, thereby improving the accuracy of the method. This motivation for the combination of Zhang and Hu is supported by KSR exemplary rationale (D) Applying a known technique to a known device (method, or product) ready for improvement to yield predictable results. MPEP 2141 (III).
Regarding claim 3, Zhang discloses the method, wherein generating the dynamic graph matrix comprises,
generating the dynamic graph matrix by performing a multiplication modulation using an element-by-element multiplication operation of the affinity matrix and the predefined static graph matrix (Zhang Page 4: “To take it into account, we introduce a concise weighting mechanism to represent the importance score of joint j ∈ Ωit, Rt(i, j) = γ([xit, xjt]), where γ(·) is a nonlinear mapping with the concatenated input of [xit, xjt]. In our experiment, a fully-connected layer works well for γ(·). Finally, we obtain a weighted affinity via element-wise multiplication”).
Claim(s) 4 is rejected under 35 U.S.C. 103 as being unpatentable over the Zhang and Hu combination in view of Li et al. (Li, C., Meng, Y., Chan, S. H., & Chen, Y. T. (2020, May). Learning 3d-aware egocentric spatial-temporal interaction via graph convolutional networks. In 2020 IEEE International Conference on Robotics and Automation (ICRA) (pp. 8418-8424). IEEE., hereinafter “Li”).
Regarding claim 4, the Zhang and Hu combination does not explicitly disclose the method, wherein generating the dynamic graph matrix comprises,
generating the dynamic graph matrix by performing an additional modulation using a summation operation of the affinity matrix and the predefined static graph matrix.
However, Li teaches the method, wherein generating the dynamic graph matrix comprises,
generating the dynamic graph matrix by performing an additional modulation using a summation operation of the affinity matrix and the predefined static graph matrix (Li Page 4: “Ego features are aggregated by a element-wise summation from two types of graphs”).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the summation of Li with the method of the Zhang and Hu combination because summation is another known way to combine the affinity matrix and the predefined static graph matrix instead of multiplying them. This motivation for the combination of Zhang, Hu, and Li is supported by KSR exemplary rationale (A) Combining prior art elements according to known methods to yield predictable results. MPEP 2141 (III).
Claim(s) 6 is rejected under 35 U.S.C. 103 as being unpatentable over the Zhang and Hu combination in view of Xie et al. (Xie, J., Miao, Q., Liu, R., Xin, W., Tang, L., Zhong, S., & Gao, X. (2021). Attention adjacency matrix based graph convolutional networks for skeleton-based action recognition. Neurocomputing, 440, 230-239., hereinafter “Xie”).
Regarding claim 6, Zhang discloses the method, wherein estimating the 3D pose of the object comprises,
replacing the static graph matrix of the graph convolution network with another matrix (Zhang Page 4: “This leads to our dynamical spatial graph convolution, Yt = σ(BtXtΦ), where Φ ∈ RCx×Cy is a parameter matrix”).
However, the Zhang and Hu combination does not explicitly disclose the method, wherein generating the dynamic graph matrix comprises,
calculating a transposition affinity matrix by performing a transposition operation on the dynamic graph matrix; and
calculating a regular symmetric affinity matrix using an average operation of the dynamic graph matrix and the transposition affinity matrix.
However, Xie teaches the method, wherein generating the dynamic graph matrix comprises,
calculating a transposition affinity matrix by performing a transposition operation on the dynamic graph matrix (Xie Fig. 4 description: “Through the dimension-attention_V, we obtain the attention map of vertices, and then this attention map is multiplied by its transposition to obtain an ATM”); and
calculating a regular symmetric affinity matrix using an average operation of the dynamic graph matrix and the transposition affinity matrix (Xie Page 235: “Then, these three weights further form a column vector, which performs matrix multiplication with its transposition to obtain a 3 × 3 matrix. This matrix is called an ATM”).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the transposed matrix of Xie with the method of the Zhang and Hu combination because the output would allow the model to represent the association strength between two nodes in the matrix (Xie Page 235). This motivation for the combination of Zhang, Hu, and Xie is supported by KSR exemplary rationale (G) Some teaching, suggestion, or motivation in the prior art that would have led one of ordinary skill to modify the prior art reference or to combine prior art teachings to arrive at the claimed invention. MPEP 2141 (III).
Claim(s) 7-8 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang in view of Hu and Lakshmi Naranyanan et al. (U.S. Patent Publication No 2021/0081782, hereinafter “Lakshmi”).
Regarding claim 7, the Zhang and Hu combination does not explicitly disclose a computer-readable recording medium, on which a computer program for performing the pose estimation method.
However, Lakshmi teaches a computer-readable recording medium, on which a computer program for performing the pose estimation method (Lakshmi [0031]: “The aspects discussed herein may be described and implemented in the context of non-transitory computer-readable storage medium storing computer-executable instructions. Non-transitory computer-readable storage media include computer storage media and communication media”).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to incorporate storing the method on a computer-readable medium as taught by Lakshmi with the method of the Zhang and Hu combination because it would improve the method by allowing it to be run on any computing device. This motivation for the combination of Zhang, Hu, and Lakshmi is supported by KSR exemplary rationale (D) Applying a known technique to a known device (method, or product) ready for improvement to yield predictable results. MPEP 2141 (III).
Regarding claim 8, it is rejected under the same analysis as claim 1 above along with Lakshmi’s teaching of an apparatus (Lakshmi [0056]: “Additionally, the system 200 for action prediction of FIG. 2 may include a processor 378, a memory 282, a storage drive 284, a bus 286, and a sensor 288”) comprising: an affinity generation unit (Lakshmi [0024]: “The processor may include various modules to execute various functions”, under 112(f) interpretation is a component of the processor configured to perform a specific action), an affinity fusion unit (Lakshmi [0024]: “The processor may include various modules to execute various functions”, under 112(f) interpretation is a component of the processor configured to perform a specific action), and a pose estimation unit (Lakshmi [0024]: “The processor may include various modules to execute various functions”, under 112(f) interpretation is a component of the processor configured to perform a specific action).
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the hardware of Lakshmi with the method of the Zhang and Hu combination because the hardware would be required in order to operate the method on an apparatus. This motivation for the combination of Zhang, Hu, and Lakshmi is supported by KSR exemplary rationale (A) Combining prior art elements according to known methods to yield predictable results. MPEP 2141 (III).
Regarding claim 10, it is rejected under the same analysis as claim 3 above along with Lakshmi’s teaching of the hardware.
Claim(s) 11 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang in view of Hu, Lakshmi, and Li.
Regarding claim 11, it is rejected under the same analysis as claim 4 above along with Lakshmi’s teaching of the hardware.
Claim(s) 13 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang in view of Hu, Lakshmi, and Xie.
Regarding claim 13, it is rejected under the same analysis as claim 6 above along with Lakshmi’s teaching of the hardware.
Response to Arguments
Applicant's arguments filed 11/13/2025 have been fully considered but they are not persuasive.
REJECTIONS UNDER 35 U.S.C. § 101:
Examiner respectfully disagrees with Applicant’s arguments on pages 5-8 of Applicant’s arguments and remarks that the amended independent claims overcome the 101 rejections. First, the improvement cited by Applicant on page 6 of the arguments and remarks is not present in the claims. Further, all of the claim limitations are either data gathering, mathematical concepts, or a combination of the two being implemented on generic computer hardware. Second, the neural network as currently recited in the claims is not a specialized neural network. There are no claim limitations detailing training, or any other aspects of the neural network, that make it uniquely able to carry out the mathematical processes listed in the claims. As such, it is currently a generic neural network, which is considered generic computer hardware. Therefore, the claims are not directed to a technical solution.
Moving to whether the claims recite “significantly more”, Examiner respectfully disagrees with Applicant. The expert matrices, routing function, and weighted sum functions are all mathematical concepts. Even together, the concepts do not amount to significantly more. As such, the rejections made under 35 USC 101 are maintained.
REJECTIONS UNDER 35 USC §§ 102 AND 103:
Examiner respectfully disagrees with Applicant’s arguments on pages 8-10 of Applicant’s arguments and remarks that the amendments are not taught by the prior art. Applicant cites on page 9 that “predicting a weight by applying a routing function is applied directly to the feature vector for a joint, thereby producing a weight distribution.” Applicant argues that Hu does not disclose this because the routing function is applied to intermediate features and that the claim “expressly requires that the routing function be applied to the feature vector for a joint of the object, not to generic intermediate outputs.” However, the claim language states “predicting a weight to be applied to a plurality of predefined expert matrices by applying a routing function to the feature vector.” The claim lacks the “directly to the feature vector” language that Applicant is arguing, and as such, the BRI of the claim language is broader than Applicant is arguing. Hu discloses the routing function being applied to the vector representations (Hu Page 605: “We use a routing function to predict the blending weights for the connection matrix bases from the previous layer’s output”), and since the claim does not explicitly require that the function be applied directly to the feature vector as Applicant is arguing, Hu’s teaching is within the BRI of the claim. Therefore, the rejections made under 35 USC 103 are maintained and modified for the dependent claims as required by the amendment.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Shi et al. (CN 113723185 A using the translation provided herein) discloses a method, device, and storage medium for converting a node characteristic into a joint coordinate vector and using that vector to classify the action characteristic of a limb using a classification network (Shi Abstract).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AIDAN KEUP whose telephone number is (703)756-4578. The examiner can normally be reached Monday - Friday 8:00-4:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Emily Terrell can be reached at (571) 270-3717. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AIDAN KEUP/ Examiner, Art Unit 2666 /Molly Wilburn/Primary Examiner, Art Unit 2666