Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant's claim for foreign priority based on an application filed in China on November 8, 2022. It is noted, however, that applicant has not filed a certified copy of the CN202211393732 application as required by 37 CFR 1.55.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.
When reviewing independent claim 1, and based upon consideration of all of the relevant factors with respect to the claim as a whole, claims 1-20 are held to claim an abstract idea without reciting elements that amount to significantly more than the abstract idea and are therefore rejected as ineligible subject matter under 35 U.S.C. 101. The rationale, under MPEP § 2106, for this finding is explained below:
The claimed invention (1) must be directed to one of the four statutory categories, and (2) must not be wholly directed to subject matter encompassing a judicially recognized exception, as defined below. The following two step analysis is used to evaluate these criteria.
Step 1: Is the claim directed to one of the four patent-eligible subject matter categories: process, machine, manufacture, or composition of matter?
When examining the claim under 35 U.S.C. 101, the Examiner interprets that the claim is related to a process, since the claim is directed to a group information guided smooth independent component analysis method for brain function network analysis.
Step 2a, Prong 1: Does the claim wholly embrace a judicially recognized exception, which includes laws of nature, physical phenomena, and abstract ideas, or is it a particular practical application of a judicial exception?
The Examiner interprets that the judicial exception applies since the claim 1 limitations of "preprocessing functional magnetic resonance imaging (fMRI) data of each subject and representing four-dimensional data as a two-dimensional matrix", "performing dimension reduction on data of two-dimensional matrices of all subjects at both a subject level and a group level in sequence along a temporal direction by using principal component analysis (PCA)", "performing independent component analysis (ICA) on dimension-reduced data to obtain group-level independent components (ICs) and identify FN-related group-level ICs", "calculating 13 voxel-level features for each voxel based on each reference component that is initialized by using each FN-related group-level IC", "constructing a multi-objective function by employing the voxel-level features of each component, each reference component, and an individual subject's data matrix, and then normalizing the multi-objective function", "iteratively optimizing the multi-objective function, and if the termination condition is not met, updating the reference component and then performing step 4 to step 5 again", "outputting an IC that represents a functional network (FN) for the individual subject", "calculating time course of the FN for the individual subject", and "calculating FNs and related time courses for all subjects according to step 4 to step 8" are directed to an abstract idea. The claim is related to a mathematical concept, as each step described in claim 1 is a recitation of mathematical steps performed on data. If the claim recites a judicial exception (i.e., an abstract idea enumerated in MPEP § 2106.04(a), a law of nature, or a natural phenomenon), the claim requires further analysis in Prong Two.
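The mathematical character of the recited two-level dimension reduction may be illustrated by a generic sketch (hypothetical array sizes; not applicant's implementation; the subsequent ICA step, e.g. FastICA, is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_reduce(X, n_components):
    """Reduce the number of rows (time points) of X via PCA along the
    temporal direction, keeping the top principal component series."""
    Xc = X - X.mean(axis=0, keepdims=True)              # center each voxel's series
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:n_components] * s[:n_components, None]   # n_components x S

# Hypothetical sizes: 3 subjects, N = 50 time points, S = 200 voxels.
subjects = [rng.standard_normal((50, 200)) for _ in range(3)]

# Subject-level reduction (N -> n1), concatenation along the temporal
# dimension, then group-level reduction (3*n1 -> n2), as recited.
n1, n2 = 20, 10
reduced = [pca_reduce(X, n1) for X in subjects]   # each n1 x S
grouped = np.concatenate(reduced, axis=0)         # (3*n1) x S
group_data = pca_reduce(grouped, n2)              # n2 x S
print(group_data.shape)                           # (10, 200)
```

Each operation above is purely a matrix computation on the input data, consistent with the characterization of the claimed steps as mathematical concepts.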
Step 2a, Prong 2: Does the claim recite additional elements that integrate the judicial exception into a practical application?
The Examiner interprets that the claim 1 limitations do not provide additional elements, or a combination of additional elements, that integrate the judicial exception into a practical application, since the claims generally link the use of the judicial exception to a particular technological environment or field of use; see MPEP § 2106.05(h). See also MPEP § 2106.04(a): because a judicial exception is not eligible subject matter, Bilski, 561 U.S. at 601, 95 USPQ2d at 1005-06 (quoting Chakrabarty, 447 U.S. at 309, 206 USPQ at 197 (1980)), if there are no additional claim elements besides the judicial exception, or if the additional claim elements merely recite another judicial exception, that is insufficient to integrate the judicial exception into a practical application. See, e.g., RecogniCorp, LLC v. Nintendo Co., 855 F.3d 1322, 1327, 122 USPQ2d 1377 (Fed. Cir. 2017) ("Adding one abstract idea (math) to another abstract idea (encoding and decoding) does not render the claim non-abstract"); Genetic Techs. v. Merial LLC, 818 F.3d 1369, 1376, 118 USPQ2d 1541, 1546 (Fed. Cir. 2016) (eligibility "cannot be furnished by the unpatentable law of nature (or natural phenomenon or abstract idea) itself."). For a claim reciting a judicial exception to be eligible, the additional elements (if any) in the claim must "transform the nature of the claim" into a patent-eligible application of the judicial exception, Alice Corp., 573 U.S. at 217, 110 USPQ2d at 1981, either at Prong Two or in Step 2B. If there are no additional elements in the claim, then it cannot be eligible. In such a case, after making the appropriate rejection (see MPEP § 2106.07 for more information on formulating a rejection for lack of eligibility), it is a best practice for the examiner to recommend an amendment, if possible, that would resolve eligibility of the claim.
Step 2b: If the claim does not integrate the judicial exception into a practical application, the Examiner must determine whether the claim recites additional elements that amount to significantly more than the judicial exception.
The Examiner interprets that the claim does not amount to significantly more, since the claim recites no steps or means yielding any tangible output that is not solely the result of mathematical steps performed on the initial functional magnetic resonance imaging data.
Furthermore, the generic computer components of the memory and processor, recited as performing generic computer functions that are well-understood, routine, and conventional activities, amount to no more than implementing the abstract idea with a computerized system.
Claims 2-20, depending from the independent claim, include all the limitations of the independent claim. The Examiner finds that claims 2-20 do not recite significantly more, since the claims recite:
“wherein the pre-processing of fMRI data at step 1 further comprises: removing first few time points of fMRI data and performing slice timing correction, head motion correction, spatial normalization, and spatial smoothing.” in claims 2 and 13; “wherein data representing of each subject's fMRI data at step 1 refers to converting four-dimensional fMRI data into the two-dimensional matrix, which is drawing a three-dimensional image corresponding to each time point extracted from original fMRI data within a brain mask into a row, and then concatenating row vectors along the temporal direction to obtain a matrix X (size: N x S), wherein N represents the number of time points and S represents the number of voxels.” in claims 3 and 14; “wherein the performing of PCA-based dimension reduction on the two-dimensional matrix data at both the subject level and the group level in sequence along the temporal direction at step 2 further comprises: performing PCA-based dimension reduction on the matrix X of each subject to achieve a dimension-reduced matrix (size: n1 x S) , wherein n1< N, then concatenating the dimension-reduced matrices of all subjects along the temporal dimension, and performing PCA-based dimension reduction on the concatenated data to further obtain a dimension-reduced matrix (size: n2 x S), wherein, n2 is the number of components, and n1 is either equal to or greater than n2.” in claims 4 and 15; “wherein at step 3, the FastICA algorithm or Infomax algorithm is applied for the ICA to obtain group-level components that are used for the initialization of reference components (Ri, i = 1, 2, …, n2). 
” in claims 5 and 16; “wherein 13 voxel-level features calculated based on each reference component at step 4 comprise four types of features, comprising: a first type of voxel-level features comprises activation state of each voxel in cerebrospinal fluid, an average activation state of neighbor voxels of each voxel in cerebrospinal fluid, and a mean similarity of activation states between a voxel and neighbor voxels thereof in cerebrospinal fluid; a second type of voxel-level features comprises activation state of each voxel in white matter, an average activation state of neighbor voxels of each voxel in white matter, and a mean similarity of activation states between a voxel and neighbor voxels thereof in white matter; a third type of voxel-level features comprises activation state of each voxel in brain edge, an average activation state of neighbor voxels of each voxel in brain edge, and a mean similarity of activation states between a voxel and neighbor voxels thereof in brain edge; and a fourth type of voxel-level features comprises a size of connected region of each voxel, a voxel degree of each voxel, an average voxel degree of neighbor voxels of each voxel, and a mean similarity of voxel degrees between a voxel and neighbor voxels thereof; and calculating the activation state of i-th voxel in cerebrospinal fluid, white matter, or brain edge comprises: Active_WMi=WMmask(i), Active_CSFi= CSFmask(i), and Active_BEi= BEmask(i); WMmask(*), CSFmask(*), and BEmask(*) denote that if a voxel is activated in the white matter mask, cerebrospinal fluid mask, or brain edge mask, the feature value is set to 1; otherwise, the feature value is set to 0; calculating the average activation state of neighbor voxels of i-th voxel in cerebrospinal fluid, white matter, or brain edge comprises: ActiveMean_WMi=
[equation image: media_image1.png]
, ActiveMean_CSFi=
[equation image: media_image2.png]
, and ActiveRatio_BEi=
[equation image: media_image3.png]
wherein neighi represents the spatially neighboring voxels of the i-th voxel, and neigh_leni represents the number of neighbor voxels of the i-th voxel; calculating the mean similarity of activation states between a voxel and neighbor voxels thereof in cerebrospinal fluid, white matter, or brain edge comprises: the similarity between activation states of i-th voxel and j-th voxel in white matter is:
[equation image: media_image4.png]
the mean similarity of activation states between the i -th voxel and neighbor voxels thereof in white matter is CorrMean_WMi=
[equation image: media_image5.png]
the mean similarity of activation states between the i-th voxel and neighbor voxels thereof in cerebrospinal fluid and brain edge are CorrMean_CSFi=
[equation image: media_image6.png]
and CorrMean_BEi =
[equation image: media_image7.png]
, respectively; calculating the voxel degree of i-th voxel comprises:
[equation image: media_image8.png]
wherein Zi and Zj represent the corresponding z-scores of the i-th voxel and the j-th voxel in a component; calculating the average voxel degree of neighbor voxels of the i-th voxel comprises:
[equation image: media_image9.png]
calculating the mean similarity of voxel degrees between the i -th voxel and neighbor voxels thereof comprises:
[equation image: media_image10.png]
and calculating the size of connected region of each voxel comprises: aggregating a voxel into a region by a region growing algorithm, and the region growing is based on Z-scores of voxels in component, finally taking the voxel number of the region as a feature value.”, claim 6 and 17; “wherein during the construction of the multi-objective function by employing the voxel-level features of each component, each reference component, and the individual subject's data matrix at step 5, the multi-objective function is as follows:
[equation image: media_image11.png]
an optimization target of the multi-objective function comprises independence of each component, correspondence between each component and related reference component Ri, and smoothness of each component; wherein, J(Yi) is a negative entropy of an estimated independent component
[equation image: media_image12.png]
, which is used to reflect the independence of the component; wi (size: N x 1) represents associated unmixing vector;
[equation image: media_image13.png]
represents a whitened X; v represents a Gaussian variable with zero mean and unit variance; G(*) represents any non-quadratic function, and E(*) represents the mathematical expectation; F(Yi) is used to represent the correspondence between Yi and Ri to ensure comparability of components across subjects; T(Yi) refers to a graph regularization term computed by using 13 voxel-level features; Tr(*) represents a trace of the matrix; L = D - Q represents a Laplace matrix, wherein D represents a degree matrix, and Q computed based on voxel-level features represents similarity between voxels in the component; regarding Q, an adjacency matrix is calculated to reflect the between-voxel similarity based on each type of voxel-level features, yielding Q1, Q2, Q3, and Q4, and then each Qi (i = 1, 2, 3, 4) is taken independently as Q for removing specific noises or jointly to generate Q (Q = Q1 + Q2 + Q3 + Q4) for removing all types of noises; to compute each Qi (i = 1, 2, 3, 4), first the voxel-level features within the same type are multiplied together to obtain VoxF_i (size: S x 1); then a sparse distance matrix WF is calculated based on space information of voxels, and finally Qi = WF * VoxF_i * VoxF_iT is calculated; and regarding the degree matrix D, values of diagonal elements of D are column sums of corresponding adjacency matrix Q." in claims 7 and 18; "wherein the normalizing of the multi-objective function at step 5 further comprises: normalizing the multi-objective function by an inverse tangent function and an exponential function with base e; the normalized multi-objective function is denoted as:
[equation image: media_image14.png]
, wherein, Yi0 denotes initial Yi and is i-th group-level IC.” in claims 8 and 19; “wherein iteratively optimizing the multi-objective function further comprises based on a linear weighted sum method, changing the multi-objective function to a unified objective function:
[equation image: media_image15.png]
a, b, and c are weights, wherein b =
[equation image: media_image16.png]
, a = c =
[equation image: media_image17.png]
, and q represents the current iteration time; a method for iteratively optimizing the multi-objective function is a gradient descent method, comprising:
[equation image: media_image18.png]
, wherein E[G(v)] = E[log(cosh(v))] is a constant 0.375, so K(Yi)=
[equation image: media_image19.png]
∇J(Y); since
[equation image: media_image20.png]
; because U(Yi)=
[equation image: media_image21.png]
as such, the iteration is formulated as follows:
[equation image: media_image22.png]
, for the iteration, wi is initialized with
[equation image: media_image23.png]
represents an inverse of
[equation image: media_image13.png]
; the value of wi after q iterations is:
[equation image: media_image24.png]
; wherein, di=
[equation image: media_image25.png]
is the iteration direction; the iteration step µ is determined by backtracking line search; the iteration step µ is initially set to 1, and if the total objective function value increases after an iteration, µ remains unchanged; otherwise, the iteration step µ is halved until the objective function value increases; and if
[equation image: media_image26.png]
or q reaches the maximum iteration time,
[equation image: media_image27.png]
, go to perform step 7; otherwise, the reference component Ri is updated by using
[equation image: media_image28.png]
and then go to perform step 4 to step 5.” in claim 9; “wherein the calculating time course of the FN for the individual subject at step 8 further comprises: calculating the time course (TC) corresponding to each component in the individual subject based on an estimated component Y and individual subject's data matrix X by formula:
[equation image: media_image29.png]
wherein Y-1 represents an inverse of Y and E(*) represents a mathematical expectation." in claim 10; and "wherein the calculating FNs and related time courses for all subjects at step 9 further comprises calculating each FN and related time courses for each individual subject according to step 4 to step 8." in claim 11.
Thus, claims 2-20 recite the same abstract idea and therefore are not drawn to eligible subject matter, as they are directed to the abstract idea without significantly more.
Therefore, claims 1-20 are rejected under 35 U.S.C. 101.
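The iterative optimization scheme recited in claim 9 (an iteration step initialized to 1 and halved whenever the objective fails to improve) likewise has a purely mathematical character, as a generic sketch shows (illustrative minimization form with an Armijo-style acceptance test; function names and the quadratic objective are hypothetical, not applicant's normalized multi-objective function):

```python
import numpy as np

def backtracking_descent(f, grad, w0, max_iter=100, tol=1e-8, c=1e-4):
    """Gradient descent with a backtracking (step-halving) line search:
    the step mu starts at 1 and is halved until the objective value
    improves sufficiently, mirroring the scheme described in claim 9."""
    w = w0.astype(float)
    for _ in range(max_iter):
        g = grad(w)
        if np.linalg.norm(g) < tol:      # termination condition
            break
        d = -g                           # iteration direction
        mu = 1.0                         # initial step, halved on failure
        while f(w + mu * d) > f(w) - c * mu * np.dot(g, g) and mu > 1e-12:
            mu *= 0.5
        w = w + mu * d
    return w

# Hypothetical quadratic objective with minimum at w = [1, 2].
target = np.array([1.0, 2.0])
f = lambda w: float(np.sum((w - target) ** 2))
grad = lambda w: 2.0 * (w - target)
w_star = backtracking_descent(f, grad, np.zeros(2))
print(np.round(w_star, 4))               # [1. 2.]
```

The iteration terminates either on the gradient-norm condition or at the maximum iteration count, analogous to the claimed termination test.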
Claim 20 is rejected under 35 U.S.C. 101 because the claimed invention is directed to a “computer program product comprising executable instructions” that is non-statutory subject matter. The broadest reasonable interpretation of a claim drawn to a computer-readable medium (also called machine readable medium and other such variations) typically covers forms of non-transitory tangible media and transitory propagating signals per se in view of the ordinary and customary meaning of computer readable media, particularly when the specification is silent. See MPEP 2111.01. A review of the specification reveals that “Any reference to a memory, a storage, a database or other medium used in the embodiments provided in the present disclosure may include at least one of a non-volatile memory and a volatile memory” in ¶ 0049.
A claim drawn to such a computer-readable medium that covers both transitory and non-transitory embodiments may be amended to narrow the claim to cover only statutory embodiments to avoid a rejection under 35 U.S.C. § 101 by adding the limitation "non-transitory computer-readable storage medium" to the claim.
Claim Objections
Claim 1 is objected to because of the following informalities:
Lines 9 and 12 state “FN-related group-level ICs”, however the acronym FN has not been previously defined. Line 17 defines FN to be “Functional Network” but it is unknown if this is the same as FN in lines 9 and 12.
Claim 12 is objected to because of the following informalities:
Line 4 states “any one of claim 1”. The inclusion of “any one of” appears to be included in error as this claim only depends from claim 1.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 5 and 16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claims 5 and 16 recite the limitation "the FastICA or Infomax algorithm" in line 2. There is insufficient antecedent basis for this limitation in the claims.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Ding et al (CN112002428, employed using the provided machine translation) teaches a method for processing functional MRI images to construct a whole brain individualized brain function map based on independent components networks. A distinct difference between this art and the applicant’s invention is that Ding does not appear to determine 13 different voxel-level features for each voxel as claimed in independent claim 1.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREW JONES whose telephone number is (703)756-4573. The examiner can normally be reached Monday - Friday 8:00-5:00 EST, off Every Other Friday.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Bella can be reached at (571) 272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/A.J./ Examiner, Art Unit 2667
/MATTHEW C BELLA/ Supervisory Patent Examiner, Art Unit 2667