Prosecution Insights
Last updated: April 19, 2026
Application No. 18/247,590

Method and Apparatus for Predicting Relevance Degree, and Method and Apparatus for Training Machine Learning Model

Non-Final OA: §101, §102, §103
Filed: Apr 03, 2023
Examiner: BEAN, GRIFFIN TANNER
Art Unit: 2121
Tech Center: 2100 — Computer Architecture & Software
Assignee: BOE TECHNOLOGY GROUP CO., LTD.
OA Round: 1 (Non-Final)

Grant Probability: 21% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 4y 4m
With Interview: 50%

Examiner Intelligence

Career Allow Rate: 21% (4 granted / 19 resolved; -33.9% vs TC avg)
Interview Lift: +28.4% for resolved cases with interview
Avg Prosecution: 4y 4m (45 currently pending)
Career History: 64 total applications across all art units

Statute-Specific Performance

§101: 37.7% (-2.3% vs TC avg)
§103: 40.4% (+0.4% vs TC avg)
§102: 11.2% (-28.8% vs TC avg)
§112: 9.7% (-30.3% vs TC avg)
Tech Center averages are estimates. Based on career data from 19 resolved cases.

Office Action

Rejections: §101, §102, §103
DETAILED ACTION

This Action is responsive to claims filed 03/31/2023.

Notice of Pre-AIA or AIA Status: The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority: Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement: The information disclosure statement (IDS) submitted on 09/26/2023 was filed before the mailing of the first action. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the IDS is being considered by the examiner.

Drawings: Receipt of Drawings filed 03/31/2023 is acknowledged. These Drawings are acceptable.

Status of the Claims: Claims 5, 7-8, 11, 13, 15, 19, 21-22, and 25 are cancelled by preliminary amendment. Claims 1-4, 6, 9-10, 12, 14, 16-18, 20, 23-24, and 26-27 are pending.

Claim Objections: Claims 4, 12, 18, and 26-27 are objected to because of the following informalities: Claims 4 and 18 recite "an unit matrix" rather than "a unit matrix"; Claims 12 and 26 begin with "A apparatus" rather than "An apparatus"; and the statutory category of Claim 27 does not match that of Claim 1, on which Claim 27 depends (Claim 1 recites a method, a process, while Claim 27 recites a non-transitory computer-readable storage medium, a manufacture). Consistency between statutory categories would benefit clarity. Appropriate correction is suggested.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-4, 6, 9-10, 12, 14, 16-18, 20, 23-24, and 26-27 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more, and because the claims as a whole, considering all claim elements both individually and in combination, do not amount to significantly more than the abstract idea; see Alice Corporation Pty. Ltd. v. CLS Bank International, 573 U.S. 208 (2014). In determining whether the claims are subject matter eligible, the Examiner applies the 2019 USPTO Patent Eligibility Guidelines (2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50, Jan. 7, 2019).

Step 1: Claims 1-4, 6, and 9-10 recite a method, which falls under the statutory category of a process. Claim 12 recites an apparatus, which falls under the statutory category of a machine. Claims 14, 16-18, 20, 23-24, and 26 recite an apparatus, which falls under the statutory category of a machine. Claim 27 recites a non-transitory computer-readable storage medium, which falls under the statutory category of a manufacture.

Step 2A – Prong 1: Claim 1 recites an abstract idea, law of nature, or natural phenomenon.
The limitations of "constructing a heterogeneous matrix, wherein the heterogeneous matrix comprises a first matrix representing the similarity between each two medications in a set of medications, a second matrix representing a similarity between each two diseases in a set of diseases, and a third matrix representing a relevance degree between each medication in the set of medications and each disease in the set of diseases;", "obtaining a feature vector of the each medication in the set of medications and a feature vector of the each disease in the set of diseases by using the heterogeneous matrix;", and "obtaining a predicted result of the relevance degree between the each medication and the each disease according to the first predicted value of the relevance degree and the second predicted value of the relevance degree between the each medication and the each disease.", under the broadest reasonable interpretation, cover a mental process including an observation, evaluation, judgment, or opinion that could be performed in the human mind or with the aid of pencil and paper. Constructing a matrix of values is practically performed within the human mind or with the aid of pen and paper. Obtaining a feature vector using the matrices is practically performed within the human mind or with the aid of pen and paper. Obtaining a predicted result of a relevance degree is practically performed within the human mind or with the aid of pen and paper.

Step 2A – Prong 2: The additional elements of claim 1 do not integrate the abstract idea into a practical application.

The claim recites the additional element "heterogeneous matrix," which is recognized as a generic computer component recited at a high level of generality. Although such elements execute instructions to perform the abstract idea itself, this does not serve to integrate the abstract idea into a practical application, as it merely amounts to instructions to "apply it" (see MPEP 2106.04(d)(2), indicating that mere instructions to apply an abstract idea do not amount to integrating the abstract idea into a practical application).

The additional elements of "machine learning model" and "feature vector" are recognized as non-generic computer components, but are recited at a high level of generality and are found to generally link the abstract idea to a particular technological environment or field of use (see MPEP 2106.05(h)). The additional elements of "medications" and "diseases" are likewise recited at a high level of generality and are found to generally link the abstract idea to a particular technological environment or field of use (see MPEP 2106.05(h)).

The additional elements recited in the limitations "processing the feature vector of the each medication and the feature vector of the each disease by using a first machine learning model to obtain a first predicted value of the relevance degree between the each medication and the each disease;" and "processing the heterogeneous matrix by using a second machine learning model to obtain a relevance degree matrix, wherein the relevance degree matrix comprises a second predicted value of the relevance degree between the each medication and the each disease;" amount to instructions to apply the data manipulation abstract idea mental process steps (see MPEP 2106.05(f)).

Step 2B: The additional elements of claim 1 do not amount to significantly more than the judicial exception.
The only limitation on the performance of the described method is the recitation of the "heterogeneous matrix." This element is insufficient to transform the judicial exception into a patentable invention because it is considered insignificant extra-solution activity (a generic computer system and processing resources that link the judicial exception to a particular technological environment). The claim thus recites computing components only at a high level of generality, such that it amounts to no more than mere instructions to apply the exception using generic computer components; mere instructions to apply an exception using a generic computer component cannot provide an inventive concept (see MPEP 2106.05(f)).

As discussed above, the additional elements of "machine learning model," "feature vector," "medications," and "diseases" are recited at a high level of generality and generally link the abstract idea to a particular technological environment or field of use (see MPEP 2106.05(h)), and the limitations "processing the feature vector of the each medication and the feature vector of the each disease by using a first machine learning model..." and "processing the heterogeneous matrix by using a second machine learning model..." amount to instructions to apply the data manipulation abstract idea mental process steps (see MPEP 2106.05(f)).

Taken alone or in ordered combination, these additional elements do not amount to significantly more than the above-identified abstract idea. There is no indication that the combination of elements improves the functioning of a computer or any other technology. Their collective functions merely provide conventional computer implementation.

For the reasons above, claim 1 is rejected as being directed to non-patentable subject matter under §101. This rejection applies equally to independent claims 12 and 26. Claim 12 recites similar limitations to claim 1, with the exception of "A apparatus for predicting a relevance degree, comprising: a processor; and a memory coupled to the processor, storing program instructions which, when executed by the processor, cause the processor to" (generic computer components); therefore, both claims are similarly rejected.
Claim 26 recites similar limitations to claim 1, with the exception of "A apparatus for training a machine learning model, comprising: a processor; and a memory coupled to the processor, storing program instructions which, when executed by the processor, cause the processor to;" (generic computer components), "determine a loss function according to the prediction result of the relevance degree between the each medication and the each disease;" (an abstract idea mental process step), and "train the first machine learning model and the second machine learning model by using the loss function." (instructions to apply the abstract idea; see MPEP 2106.05(f)); therefore, both claims are similarly rejected.

Dependent Claims: Claim 2 (claim 16) recites refinements to how one would perform the abstract idea mental process step(s) of the independent claims. Claim 3 (claim 17) recites refinements to the data types manipulated in the independent claims. Claim 4 (claim 18) recites abstract idea mental process steps "generating…", "generating…", "splitting…", and "generating…". Claim 6 (claim 20) recites abstract idea mental process steps "generating…" and "generating…". Claim 9 (claim 23) recites the abstract idea mental process step "splicing…" and instructions to apply that step, "processing…" (see MPEP 2106.05(f)). Claim 10 (claim 24) recites abstract idea mental process steps "constructing…", "constructing…", "constructing…", "constructing…", and "generating…". Claim 14 recites refinements to the loss function. Claim 27 recites "A non-transitory computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions that, when executed by a processor," (generic computer components).

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3, 9-10, 12, 14, 16-17, 23-24, and 26-27 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Song et al. (Prediction of Drug-Related Diseases Through Integrating Pairwise Attributes and Neighbor Topological Structures, published 06/16/2021), hereinafter Song.

In regards to claim 1: The present invention claims: "A method for predicting a relevance degree, comprising: constructing a heterogeneous matrix, wherein the heterogeneous matrix comprises a first matrix representing the similarity between each two medications in a set of medications, a second matrix representing a similarity between each two diseases in a set of diseases, and a third matrix representing a relevance degree between each medication in the set of medications and each disease in the set of diseases;"

Song teaches "The model framework, ANPred, has been proposed for drug-disease association prediction (as shown in Fig. 1)." (Page 2964) and "The construction of bi-layer drug-disease heterogeneous network and random walk sequence set. (a) Construct a bi-layer drug-disease heterogeneous network, where R and D are the drug similarity matrix and the disease similarity matrix respectively, and A is the drug-disease association matrix. (b) Neighbor node sequence set based on random walk." (Figure 2, Page 2965).
Section 2.2 details the formulation and construction of these matrices.

"obtaining a feature vector of the each medication in the set of medications and a feature vector of the each disease in the set of diseases by using the heterogeneous matrix;"

Song Section 2.3, Page 2965, describes the formulation of a feature vector from the drug-drug similarity matrix and association matrix (F1) and the disease-disease similarity matrix and association matrix (F2), resulting in a feature vector matrix combining the two feature vectors (F).

"processing the feature vector of the each medication and the feature vector of the each disease by using a first machine learning model to obtain a first predicted value of the relevance degree between the each medication and the each disease;"

Song teaches "The neighbor topology representation of the drug r_i obtained from the FAS framework of the drug is g_ri. The neighbor topology representation of the disease d_j obtained from the FAS framework of the disease is g_dj. The drug and disease node vectors were connected through the connection strategy to obtain the node pair neighbor topology matrix G (Fig. 1e). To obtain the neighbor topology vector of the deep drug-disease node pair, we feed the neighbor topology matrix G of the node pair to the convolutional neural network to obtain the output feature s_n." (Page 2968; per Fig. 1e, the drug-drug features and disease-disease features are used to obtain one output by Fig. 1e's CNN).

"processing the heterogeneous matrix by using a second machine learning model to obtain a relevance degree matrix, wherein the relevance degree matrix comprises a second predicted value of the relevance degree between the each medication and the each disease;"

Song teaches "A CNN-based module was constructed for the embedding matrix F to obtain the attribute representation of node pair. First, in order to obtain the edge information of the node pair feature matrix F, we used a padding operation on the feature matrix F to obtain a new feature matrix…where nconv is the number of circles padded around the feature matrix F. Second, the feature matrix F0 was sent to the convolutional neural network to learn the attribute representation of the drug-disease node pair." (Page 2966; per Fig. 1b, the drug-disease association matrix is learned for a second output by another CNN).

"and obtaining a predicted result of the relevance degree between the each medication and the each disease according to the first predicted value of the relevance degree and the second predicted value of the relevance degree between the each medication and the each disease."

Song teaches "The final predicted score is the weighted sum of sore1 and sore2 (Fig. 1f)" (Page 2968; per Fig. 1f, the results of the two CNNs are combined into a weighted sum for the final result).

In regards to claim 2: The present invention claims: "wherein the prediction result of the relevance degree between an ith medication and a jth disease is a weighted sum of the first predicted value of the relevance degree between the ith medication and the jth disease and the second predicted value of the relevance degree of the ith medication and the jth disease, 1≤i≤M, 1≤j≤N, M is a total number of the medications, and N is a total number of the diseases." Song teaches "The final predicted score is the weighted sum of sore1 and sore2 (Fig. 1f)" (Page 2968; per Fig. 1f, the results of the two CNNs are combined into a weighted sum for the final result).
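To make the mapped claim structure concrete, here is a minimal sketch of the pipeline both the claim and Song describe: a block heterogeneous matrix assembled from the two similarity matrices and the association matrix, per-entity feature vectors read off its rows, two stand-in predictors, and a weighted sum. This is an illustration only, assuming NumPy; the sizes and the stand-in "models" are hypothetical, not Song's ANPred and not the Applicant's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 4, 3                        # medications, diseases (illustrative sizes)

R = rng.random((M, M))             # first matrix: medication-medication similarity
D = rng.random((N, N))             # second matrix: disease-disease similarity
A = rng.random((M, N))             # third matrix: medication-disease relevance degree

# Heterogeneous matrix: an (M+N) x (M+N) block matrix of the three components.
H = np.block([[R, A],
              [A.T, D]])

# Feature vectors from the heterogeneous matrix: row i is medication i's vector
# (its similarities to all medications plus its relevance to all diseases);
# row M+j is disease j's vector -- the composition claim 3 describes.
med_feat, dis_feat = H[:M], H[M:]

# Stand-ins for the two machine learning models (the claims leave their form open):
score1 = med_feat @ dis_feat.T     # "first predicted value" per medication-disease pair
score2 = (H @ H)[:M, M:]           # "second predicted value": relevance block of a transform of H

w = 0.5                            # preset weight (hypothetical)
predicted = w * score1 + (1 - w) * score2   # claim 2's weighted sum, one score per pair
print(predicted.shape)             # (4, 3)
```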
In regards to claim 3: The present invention claims: "wherein: the feature vector of the ith medication respectively comprises the similarity between the ith medication and the each medication in the set of medications and the relevance degree between the ith medication and the each disease in the set of diseases; and the feature vector of the jth disease respectively comprises the relevance degree between the jth disease and the each disease in the set of diseases, and the similarity between the jth disease and the each medication in the set of medications." Song Section 2.3 (Pages 2965-2966) details the formulation of the feature matrices, including how the features of a given drug include the similarity between it and other drugs and its association(s) with diseases, and vice versa.

In regards to claim 9: While Song reads on claim 1, Song fails to explicitly teach "splicing the feature vector of the each medication and the feature vector of the each disease to obtain a splicing feature; and processing the splicing feature by using a first machine learning model to obtain the first predicted value of the relevance degree of the each medication and the each disease." See Song Fig. 3, specifically item F, for drug and disease features being merged (mapping to the generic recitation of "spliced") into a feature vector.

In regards to claim 10: The present invention claims: "wherein the constructing the heterogeneous matrix comprises: constructing a first matrix, wherein the first matrix comprises the similarity between the each two medications in the set of medications;" See Song Sections 2.2 and 2.3, particularly Fig. 2 R. "constructing a second matrix, wherein the second matrix comprises the similarity between the each two diseases in the set of diseases;" See Song Sections 2.2 and 2.3, particularly Fig. 2 D. "constructing a third matrix, wherein the third matrix comprises the relevance degree between the each medication in the set of medications and the each disease in the set of diseases; and generating the heterogeneous matrix by using the first matrix, the second matrix and the third matrix." See Song Sections 2.2 and 2.3, particularly Fig. 2 A, as well as the subsequent use of all three matrices in the formulation of the network and resultant feature matrices.

In regards to claim 12: Claim 12 recites similar limitations to Claim 1, with the exception of "A apparatus for predicting a relevance degree, comprising: a processor; and a memory coupled to the processor, storing program instructions which, when executed by the processor, cause the processor to" recited in Claim 12; therefore, both claims are similarly rejected.

In regards to claim 26: Claim 26 recites similar limitations to claim 1, with the exception of "A apparatus for training a machine learning model, comprising: a processor; and a memory coupled to the processor, storing program instructions which, when executed by the processor, cause the processor to;", "determine a loss function according to the prediction result of the relevance degree between the each medication and the each disease;", and "train the first machine learning model and the second machine learning model by using the loss function." Song Page 2967 details how the model(s) are trained with (similar) loss functions (Equations 17-19), as well as a loss function between the predicted outputs of the two models (Page 2968, Equations 25-28); therefore, both claims are similarly rejected.
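Claim 26's training limitations amount to: combine the two models' predictions, compute one loss against known associations, and update both models from that loss. A minimal sketch under those assumptions, using PyTorch with two linear layers as hypothetical stand-ins for the two models; nothing here reflects Song's actual losses (Equations 17-28) or the Applicant's loss function.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
M, N = 4, 3
H = torch.rand(M + N, M + N)                     # heterogeneous matrix, as sketched above
labels = torch.randint(0, 2, (M, N)).float()     # known medication-disease associations

model_1 = nn.Linear(M + N, M + N, bias=False)    # stand-in for the first model
model_2 = nn.Linear(M + N, M + N, bias=False)    # stand-in for the second model
opt = torch.optim.Adam(
    list(model_1.parameters()) + list(model_2.parameters()), lr=1e-2)

w = 0.5                                          # preset weight (hypothetical)
for step in range(100):
    med_feat, dis_feat = H[:M], H[M:]
    score1 = model_1(med_feat) @ dis_feat.T      # first predicted value per pair
    score2 = model_2(H)[:M, M:]                  # second predicted value per pair
    pred = w * score1 + (1 - w) * score2         # combined prediction result

    # One loss over the combined prediction; the backward pass updates both
    # models, i.e. "train the first ... and the second machine learning model
    # by using the loss function."
    loss = nn.functional.binary_cross_entropy_with_logits(pred, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```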
In regards to claim 14: The present invention claims: "wherein the loss function is a weighted sum of prediction results of the relevance degree of the each medication in the set of medications and the each disease in the set of diseases." See the rejections of Claims 1, 2, and 12 above and Song's Equations 25-29 for the result/weighted loss functions of the drug and disease node pairs.

In regards to claims 16-17 and 23-24: Claims 16-17 and 23-24 recite similar limitations to claims 2-3 and 10, with the exception of the processor, memory, loss-function, and training limitations of Claim 26 quoted above; therefore, both sets of claims are similarly rejected.

In regards to claim 27: The present invention claims: "A non-transitory computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions that, when executed by a processor, implement the method of 1."; therefore, both Claim 27 and Claim 1 are similarly rejected.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 4, 6, 18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Song as applied to Claims 1 and 26 above, and further in view of Yu et al. (Predicting drug–disease associations through layer attention graph convolutional network, 2021), hereinafter Yu.
In regards to claim 4: While Song teaches Claims 1 and 26 above, Song fails to explicitly teach the various matrices recited in: "generating a transformation matrix by using the heterogeneous matrix and an unit matrix; generating an embedded feature vector according to the transformation matrix, a degree matrix of the heterogeneous matrix and a feature matrix, wherein the feature matrix comprises the third matrix;"

However, Yu, in a similar field of endeavor of drug-disease associations, teaches a heterogeneous matrix, normalization of the matrix (mapping to the use of a generally recited unit matrix) (Page 3), as well as the formation of a feature embedding using said matrices (Page 3, Method Architecture). It is noted that Equation 6 of Yu is functionally identical to the equation(s) recited in Applicant's Specification, paragraphs [0008]-[0010]. Yu teaches "the development of efficient and high-accuracy computational methods for predicting drug–disease associations is of great significance." and "we propose a novel computational method named as layer attention graph convolutional network (LAGCN) for the drug–disease association prediction. Specifically, LAGCN first integrates the known drug–disease associations, drug–drug similarities and disease–disease similarities into a heterogeneous network, and applies the graph convolution operation to the network to learn the embeddings of drugs and diseases." (Abstract). It would have been obvious to one of ordinary skill in the art at the time of the Applicant's filing to combine the matrix manipulation steps of Yu with a system such as Song's to facilitate efficiency and accuracy.

"splitting the embedded feature vector into a medication embedded vector and a disease embedded vector;" See the above rejections relying on Song, and Song Figure 3, for the feature vector being split into disease and drug data.

"and generating the relevance degree matrix by using the medication embedded vector, a preset weight vector and the disease embedded vector." See the above rejections relying on Song, as well as Song Page 2966, right column, for the relevance degree matrix and weight matrix.

In regards to claim 6: The present invention claims: "wherein the generating the embedded feature vector comprises: generating a temporary feature vector according to the transformation matrix, the degree matrix of the heterogeneous matrix, the feature matrix and a first learnable weight;" See the above rejection of claim 4 for Equation 6 of Yu matching the equations used in the Applicant's Specification for the embedded feature vector and/or temporary feature vector. See also Yu Equations 9 and 10 and Page 4 for the use of trainable matrices (mapping to the use of multiple weights). "and generating the embedded feature vector according to the transformation matrix, the degree matrix of the heterogeneous matrix, the temporary feature vector and a second learnable weight." See above how the combination of Song and Yu reads on the formation of the feature embedding.
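Claims 4 and 6 read like rounds of graph-convolution propagation, which is presumably why the examiner points to Yu's Equation 6: the standard GCN/LAGCN form is roughly E = relu(Deg^(-1/2) (H + I) Deg^(-1/2) X W), with the unit matrix I giving the "transformation matrix" and the degree matrix normalizing it. A minimal NumPy sketch of that standard form follows; all sizes and names are illustrative assumptions, not Yu's code and not the equations of the Applicant's Specification.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, k = 4, 3, 16
H = rng.random((M + N, M + N))
H = (H + H.T) / 2                       # symmetric heterogeneous matrix
A = H[:M, M:]                           # third matrix: medication-disease relevance

# "Transformation matrix by using the heterogeneous matrix and an unit matrix":
# the usual GCN self-loop construction.
A_tilde = H + np.eye(M + N)

# Degree matrix of the heterogeneous matrix, used for symmetric normalization.
d_inv_sqrt = np.diag(1.0 / np.sqrt(A_tilde.sum(axis=1)))
norm = d_inv_sqrt @ A_tilde @ d_inv_sqrt

# Feature matrix "comprising the third matrix": zero except for the A blocks.
X = np.block([[np.zeros((M, M)), A],
              [A.T, np.zeros((N, N))]])

relu = lambda x: np.maximum(x, 0.0)
W1 = rng.random((M + N, k))             # first learnable weight (claim 6)
W2 = rng.random((k, k))                 # second learnable weight

T = relu(norm @ X @ W1)                 # "temporary feature vector" (first layer)
E = relu(norm @ T @ W2)                 # "embedded feature vector" (second layer)

# "Splitting the embedded feature vector" into medication and disease parts.
med_emb, dis_emb = E[:M], E[M:]

# Relevance degree matrix from the two embeddings and a preset weight.
W_pred = rng.random((k, k))             # preset weight (hypothetical)
relevance = med_emb @ W_pred @ dis_emb.T   # (M x N) second predicted values
```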
In regards to claims 18 and 20: Claims 18 and 20 recite similar limitations to Claims 4 and 6, with the exception of the processor, memory, loss-function, and training limitations of Claim 26 quoted above; therefore, both sets of claims are similarly rejected.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to GRIFFIN T BEAN, whose telephone number is (703) 756-1473. The examiner can normally be reached M-F, 7:30-4:30.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Li Zhen, can be reached at (571) 272-3768. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/GRIFFIN TANNER BEAN/
Examiner, Art Unit 2121

/Li B. Zhen/
Supervisory Patent Examiner, Art Unit 2121

Prosecution Timeline

Apr 03, 2023: Application Filed
Jan 06, 2026: Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12424302: ACCELERATED MOLECULAR DYNAMICS SIMULATION METHOD ON A QUANTUM-CLASSICAL HYBRID COMPUTING SYSTEM (granted Sep 23, 2025; 2y 5m to grant)
Patent 12314861: SYSTEMS AND METHODS FOR SEMI-SUPERVISED LEARNING WITH CONTRASTIVE GRAPH REGULARIZATION (granted May 27, 2025; 2y 5m to grant)
Patent 12261947: LEARNING SYSTEM, LEARNING METHOD, AND COMPUTER PROGRAM PRODUCT (granted Mar 25, 2025; 2y 5m to grant)
Study what changed to get past this examiner, based on the 3 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 21%
With Interview: 50% (+28.4%)
Median Time to Grant: 4y 4m
PTA Risk: Low
Based on 19 resolved cases by this examiner. Grant probability derived from career allow rate.
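As a sanity check on these figures: 4 granted of 19 resolved is 4/19 ≈ 21.1%, which rounds to the 21% shown, and the interview-adjusted figure appears to be the additive lift, 21% + 28.4 points ≈ 49.5%, displayed as 50%.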
