Prosecution Insights
Last updated: April 18, 2026
Application No. 18/842,557

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM

Non-Final OA: §102, §103, §112
Filed: Aug 29, 2024
Examiner: SALVUCCI, MATTHEW D
Art Unit: 2613
Tech Center: 2600 — Communications
Assignee: Zozo Inc.
OA Round: 1 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 12m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 72% (above average; +9.8% vs TC avg)
348 granted / 485 resolved
Interview Lift: +28.5% (resolved cases with interview)
Avg Prosecution: 2y 12m (typical timeline)
Currently Pending: 17
Total Applications: 502 (across all art units)

Statute-Specific Performance

§101: 4.6% (-35.4% vs TC avg)
§103: 60.8% (+20.8% vs TC avg)
§102: 17.0% (-23.0% vs TC avg)
§112: 14.3% (-25.7% vs TC avg)
Tech Center averages are estimates • Based on career data from 485 resolved cases
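The per-statute percentages above read as the share of this examiner's rejection citations attributable to each statute, reported as a percentage-point delta against a tech-center average. A minimal sketch of how such a mix and delta might be computed (hypothetical helper names and input shape; not the tool's actual pipeline):

```python
from collections import Counter

def statute_mix(office_actions):
    """Fraction of statute citations per statute.

    office_actions: iterable of sets like {"102", "103"}, one set per
    office action, each naming the statutes cited in that OA.
    """
    counts = Counter(s for oa in office_actions for s in oa)
    total = sum(counts.values())
    return {s: n / total for s, n in counts.items()}

def delta_vs_average(examiner_mix, tc_mix):
    """Percentage-point difference per statute vs the tech-center mix."""
    return {s: examiner_mix.get(s, 0.0) - tc_mix.get(s, 0.0)
            for s in set(examiner_mix) | set(tc_mix)}
```

For example, three OAs citing {§102, §103}, {§103}, and {§112} give §103 a 50% share; subtracting an assumed tech-center share yields the signed deltas shown above.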

Office Action

§102 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: “a generation unit,” “an application unit,” and “a providing unit” in claim 1.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

Claims 1-9 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claims 1, 8, and 9 each recites the limitation "a face image" multiple times in each of the respective first and second clauses. There is insufficient antecedent basis for this limitation in the claim as more than one of these limitations exist, rendering it unclear which is being referred to for each instance. For the purpose of examination they will all be considered different from one another. Appropriate correction and/or explanation is required.

The remaining claims depend from independent claim 1, either directly or indirectly, and are rejected accordingly.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 2, 4, 5, 8, and 9 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kwak et al. (KR 20200107474; translation attached), hereinafter Kwak.
Regarding claim 1, Kwak discloses an information processing apparatus including:

a generation unit that generates makeup information for generating a face image in which same makeup as predetermined makeup is applied to a face image based on a face image of a poster with the predetermined makeup (Paragraph [0011]: a database is constructed by extracting and mapping makeup information based on video data related to makeup methods of influencers previously uploaded to the internet, and a personalized makeup recommendation service is provided based on virtual makeup synthesis and beauty score-based evaluation using the database, thereby improving user convenience and reducing infrastructure construction costs; Paragraph [0051]: the virtual makeup synthesis unit (100) can perform image synthesis processing between a source image and a user image according to a makeup style through artificial neural network learning processing, and to this end, can perform a process of extracting the makeup style application part from the source person image and applying it to the target person image; Paragraphs [0103]-[0104]: the user can select a recommended makeup style by referring to virtual makeup composite images, and as shown in FIG. 4(C), influencer video information corresponding to the selected makeup style can be provided according to the user's selection. To this end, the recommendation service provider (900) can obtain video link information corresponding to influencer video information from the style database (600) and provide it to the user terminal (10)…as shown in FIG. 4(C), video information of an influencer’s makeup technique corresponding to the style selected by the user can be output through the user terminal);

an application unit that applies, to a face image of a user, makeup information that is generated from a face image that is selected by the user from among face images that are posted by posters who meet a predetermined condition (Paragraph [0032]: the service providing device (1000) receives and registers user information from the user terminal (10), obtains user image information and other condition setting information as user input information, and can provide recommendation information to the user terminal (10) to recommend a suitable makeup style corresponding to the user's bare face; Paragraphs [0045]-[0050]: the makeup style database (600) can store and manage the database constructed by performing source image extraction and mapping processing from the collected image information, and can provide makeup styles and source images according to the request of the conditional makeup method candidate extraction unit… the user information management unit (300) can build data for recommendation services by processing profiling (feature analysis) using face photos, classification of user types through beauty score evaluation and feedback, profiling through similar user clusters (collaborative filtering), and profiling through service usage patterns…And, the input information acquisition unit (350) can acquire input information for selecting a candidate for a recommended makeup method based on user information managed by the user information management unit (300) and user input information entered from the user terminal…For example, a user terminal (10) can transmit user input information including a user's bare face image, explicit condition information (time limit, makeup popularity, situation specification, etc.) and keyword information (search, filtering, recommendation) to a service providing device (1000), and the service providing device (1000) determines input information based on the received user input information and the management information of the user information management unit (300), and the determined input information can be transmitted to a conditional makeup candidate extraction unit (700) and a virtual makeup synthesis unit (100)…the virtual makeup synthesis unit (100) can perform one or more virtual synthesis processes on the user face image based on the makeup style and source image determined from the conditional makeup method candidates); and

a providing unit that provides, to the user, a content that indicates the face image to which the makeup information is applied by the application unit and a description content of a product that is used for the makeup (Fig. 4; Paragraphs [0103]-[0108]: the user can select a recommended makeup style by referring to virtual makeup composite images, and as shown in FIG. 4(C), influencer video information corresponding to the selected makeup style can be provided according to the user's selection…the recommendation service provider (900) can obtain video link information corresponding to influencer video information from the style database (600) and provide it to the user terminal (10)…as shown in FIG. 4(C), video information of an influencer’s makeup technique corresponding to the style selected by the user can be output through the user terminal (10)…FIGS. 4(D) and FIGS. 4(E) illustrate a cosmetic recommendation and purchase linkage function interface.
The recommendation service providing unit (900) according to an embodiment of the present invention can recommend cosmetic information based on user profiling information and face image analysis, and cosmetic information used in the aforementioned influencer video may also be recommended…the recommendation service provider (900) can recommend candidates for partial cosmetics that produce a similar effect to partial makeup in the overall makeup method by considering the composition of the entire makeup method and the makeup condition (skin, color) of each part of the face, and link to a shopping mall that sells them…the recommendation service provider (900) may provide a process of prioritizing recommendations by recognizing the keyword or cosmetic packaging/container in the image when keyword information or cosmetic packaging/container that explicitly corresponds to the cosmetic is exposed in the video…the recommendation service provider (900) can provide a purchase service corresponding to the product selected by the user or provide a purchase linkage function with the product seller site, thereby enabling the user to immediately check and purchase products that match their makeup style. To this end, the recommendation service provider (900) may be equipped with a separate purchase module and payment module, or may additionally be equipped with a shopping site linkage module).

Regarding claim 2, Kwak discloses the information processing apparatus according to claim 1, wherein the content is a content that includes information that allows, by operation of the user, the user to access to a predetermined electronic mall in which a product is available for purchase (Fig. 4; Paragraphs [0103]-[0108]: the user can select a recommended makeup style by referring to virtual makeup composite images, and as shown in FIG. 4(C), influencer video information corresponding to the selected makeup style can be provided according to the user's selection…the recommendation service provider (900) can obtain video link information corresponding to influencer video information from the style database (600) and provide it to the user terminal (10)…as shown in FIG. 4(C), video information of an influencer’s makeup technique corresponding to the style selected by the user can be output through the user terminal (10)…FIGS. 4(D) and FIGS. 4(E) illustrate a cosmetic recommendation and purchase linkage function interface. The recommendation service providing unit (900) according to an embodiment of the present invention can recommend cosmetic information based on user profiling information and face image analysis, and cosmetic information used in the aforementioned influencer video may also be recommended…the recommendation service provider (900) can recommend candidates for partial cosmetics that produce a similar effect to partial makeup in the overall makeup method by considering the composition of the entire makeup method and the makeup condition (skin, color) of each part of the face, and link to a shopping mall that sells them…the recommendation service provider (900) may provide a process of prioritizing recommendations by recognizing the keyword or cosmetic packaging/container in the image when keyword information or cosmetic packaging/container that explicitly corresponds to the cosmetic is exposed in the video…the recommendation service provider (900) can provide a purchase service corresponding to the product selected by the user or provide a purchase linkage function with the product seller site, thereby enabling the user to immediately check and purchase products that match their makeup style. To this end, the recommendation service provider (900) may be equipped with a separate purchase module and payment module, or may additionally be equipped with a shopping site linkage module).
Regarding claim 4, Kwak discloses the information processing apparatus according to claim 1, wherein the application unit applies the makeup information that is generated from a face image that is selected from among face images that are preferentially selected or displayed based on a predetermined criterion (Paragraph [0049]: a user terminal (10) can transmit user input information including a user's bare face image, explicit condition information (time limit, makeup popularity, situation specification, etc.) and keyword information (search, filtering, recommendation) to a service providing device (1000), and the service providing device (1000) determines input information based on the received user input information and the management information of the user information management unit (300), and the determined input information can be transmitted to a conditional makeup candidate extraction unit (700) and a virtual makeup synthesis unit; Paragraphs [0103]-[0104]: the user can select a recommended makeup style by referring to virtual makeup composite images, and as shown in FIG. 4(C), influencer video information corresponding to the selected makeup style can be provided according to the user's selection. To this end, the recommendation service provider (900) can obtain video link information corresponding to influencer video information from the style database (600) and provide it to the user terminal (10)…as shown in FIG. 4(C), video information of an influencer’s makeup technique corresponding to the style selected by the user can be output through the user terminal).

Regarding claim 5, Kwak discloses the information processing apparatus according to claim 4, wherein the application unit applies the makeup information that is generated from a face image that is selected from among face images that are preferentially selected or displayed in descending order of priorities, based on priorities that are determined based on at least one of face information, body information, and a product purchase history of the user (Paragraphs [0046]-[0049]: the user information management unit (300) can perform registration processing according to input information input from the user terminal (10), and map and store and manage the user's usage history information, the usage history information of similar users, and user preference information…the user information management unit (300) can build data for recommendation services by processing profiling (feature analysis) using face photos, classification of user types through beauty score evaluation and feedback, profiling through similar user clusters (collaborative filtering), and profiling through service usage patterns…the input information acquisition unit (350) can acquire input information for selecting a candidate for a recommended makeup method based on user information managed by the user information management unit (300) and user input information entered from the user terminal (10)…a user terminal (10) can transmit user input information including a user's bare face image, explicit condition information (time limit, makeup popularity, situation specification, etc.) and keyword information (search, filtering, recommendation) to a service providing device (1000), and the service providing device (1000) determines input information based on the received user input information and the management information of the user information management unit (300), and the determined input information can be transmitted to a conditional makeup candidate extraction unit (700) and a virtual makeup synthesis unit; Paragraph [0107]: the recommendation service provider (900) may provide a process of prioritizing recommendations by recognizing the keyword or cosmetic packaging/container in the image when keyword information or cosmetic packaging/container that explicitly corresponds to the cosmetic is exposed in the video).

Regarding claim 8, the limitations of this claim substantially correspond to the limitations of claim 1; thus they are rejected on similar grounds.

Regarding claim 9, the limitations of this claim substantially correspond to the limitations of claim 1 (except for the medium, which is disclosed by Kwak, Paragraph [0110]: method according to the present invention described above can be produced as a program to be executed on a computer and stored on a computer-readable recording medium, and examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.); thus they are rejected on similar grounds.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 6 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Kwak, in view of Sugaya (US Pub. 2019/0197736).

Regarding claim 6, Kwak discloses the information processing apparatus according to claim 4. Kwak does not explicitly disclose wherein the application unit applies the makeup information that is generated from a face image that is selected from among face images that are preferentially selected or displayed in descending order of priorities, based on priorities that are determined based on at least one of an access history, a purchase history, and an evaluation history of the face image.

However, Sugaya teaches virtual cosmetic application and shopping (Abstract), further comprising wherein the application unit applies the makeup information that is generated from a face image that is selected from among face images that are preferentially selected or displayed in descending order of priorities, based on priorities that are determined based on at least one of an access history, a purchase history, and an evaluation history of the face image (Fig. 3; Paragraphs [0121]-[0126]: providing unit 109 gives priority to a cosmetic where the similarity calculated by the first calculation unit 103 is higher than a threshold, based on additional information about the cosmetic or the user.
This additional information includes, for example, a price of the cosmetic, a popularity of the cosmetic, a release date of the cosmetic, a type of a color of the user's skin, a favorite cosmetic brand name, a level of a makeup skill, or a type of a makeup tool owned by the user…price of the cosmetic, the popularity of the cosmetic, and the release date of the cosmetic are included in, for example, product information that is stored in the product table 131. For example, if the user ID given to the user is “001,” the information is extracted from the product information that is stored in association with the user ID “001” in the product table 131…type of the color of the user's skin, the favorite cosmetic brand name, the level of the makeup skill, and the type of the makeup tool owned by the user are stored in, for example, the user table 133. For example, if the user ID given to the user is “001,” the information is stored in association with the user ID “001” is read from the user table 133…if the additional information includes the price of the cosmetic, the lower the price of cosmetic is, the higher the priority is. Alternatively, the higher the price of cosmetic is, the higher the priority may be. If the additional information includes the popularity of the cosmetic, the higher the popularity is, the higher the priority is. If the additional information includes the release date of the cosmetic, the newer the release date is, the higher the priority is… the additional information includes the type of the color of the user's skin, the priority of the cosmetic having the color fitting this type is higher than the priorities of other cosmetics. For example, if the type of color of the user's skin is yellow-based color, a priority of a cosmetic having the yellow-based color is higher than priorities of other cosmetics…If the additional information includes the brand name of the user's favorite cosmetic, the priority of this brand cosmetic is higher than the priorities of the other brand cosmetics. For example, if the user likes a certain cosmetic brand, a priority of this brand cosmetic increases).

Sugaya teaches that this will allow for presentation to user based on user preferences (Paragraphs [0122]-[0127]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kwak with the features of above as taught by Sugaya so as to allow for presentation to user based on user preferences as presented by Sugaya.

Regarding claim 7, Kwak discloses the information processing apparatus according to claim 4. Kwak does not explicitly disclose wherein the application unit applies the makeup information that is generated from a face image that is selected from among face images that are preferentially selected or displayed in descending order of priorities, based on priorities that are determined based on a total price of products.

However, Sugaya teaches virtual cosmetic application and shopping (Abstract), further comprising wherein the application unit applies the makeup information that is generated from a face image that is selected from among face images that are preferentially selected or displayed in descending order of priorities, based on priorities that are determined based on a total price of products (Fig. 3; Paragraphs [0121]-[0124]: providing unit 109 gives priority to a cosmetic where the similarity calculated by the first calculation unit 103 is higher than a threshold, based on additional information about the cosmetic or the user. This additional information includes, for example, a price of the cosmetic, a popularity of the cosmetic, a release date of the cosmetic, a type of a color of the user's skin, a favorite cosmetic brand name, a level of a makeup skill, or a type of a makeup tool owned by the user…price of the cosmetic, the popularity of the cosmetic, and the release date of the cosmetic are included in, for example, product information that is stored in the product table 131. For example, if the user ID given to the user is “001,” the information is extracted from the product information that is stored in association with the user ID “001” in the product table 131…type of the color of the user's skin, the favorite cosmetic brand name, the level of the makeup skill, and the type of the makeup tool owned by the user are stored in, for example, the user table 133. For example, if the user ID given to the user is “001,” the information is stored in association with the user ID “001” is read from the user table 133…if the additional information includes the price of the cosmetic, the lower the price of cosmetic is, the higher the priority is. Alternatively, the higher the price of cosmetic is, the higher the priority may be. If the additional information includes the popularity of the cosmetic, the higher the popularity is, the higher the priority is. If the additional information includes the release date of the cosmetic, the newer the release date is, the higher the priority is).

Sugaya teaches that this will allow for presentation to user based on user preferences (Paragraphs [0122]-[0127]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kwak with the features of above as taught by Sugaya so as to allow for presentation to user based on user preferences as presented by Sugaya.
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW D SALVUCCI whose telephone number is (571)270-5748. The examiner can normally be reached M-F: 7:30-4:00 PT.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, XIAO WU, can be reached at (571) 272-7761. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MATTHEW SALVUCCI/
Primary Examiner, Art Unit 2613

Prosecution Timeline

Aug 29, 2024
Application Filed
Apr 03, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597198: RAY TRACING METHOD AND APPARATUS BASED ON ATTENTION FOR DYNAMIC SCENES
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12597207: Camera Reprojection for Faces
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12579753: Phased Capture Assessment and Feedback for Mobile Dimensioning
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12561899: Vector Graphic Parsing and Transformation Engine
Granted Feb 24, 2026 (2y 5m to grant)

Patent 12548256: IMAGE PROCESSING APPARATUS FOR GENERATING SURFACE PROFILE OF THREE-DIMENSIONAL GEOMETRIC MODEL, CONTROL METHOD THEREFOR, AND STORAGE MEDIUM
Granted Feb 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 72%
With Interview: 99% (+28.5%)
Median Time to Grant: 2y 12m
PTA Risk: Low
Based on 485 resolved cases by this examiner. Grant probability derived from career allow rate.
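The headline figures in this block are simple ratios over the examiner's resolved cases. A hedged sketch of the likely arithmetic (assumed definitions; the tool's exact model is not published):

```python
def allow_rate(granted, resolved):
    """Career allow rate: share of resolved applications that granted."""
    return granted / resolved

def interview_lift(granted_iv, resolved_iv, granted_no, resolved_no):
    """Percentage-point lift in allow rate for cases with an interview,
    versus cases resolved without one."""
    return granted_iv / resolved_iv - granted_no / resolved_no
```

For example, allow_rate(348, 485) is about 0.718, displayed as the 72% grant probability above; the interview lift would be the allow-rate gap between the interview and no-interview subsets of those 485 cases.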
