Prosecution Insights
Last updated: April 19, 2026
Application No. 18/572,557

INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING DEVICE, AND IMAGE DISPLAY DEVICE

Status: Final Rejection — §103
Filed: Dec 20, 2023
Examiner: WANG, YI
Art Unit: 2619
Tech Center: 2600 — Communications
Assignee: Maxell, Ltd.
OA Round: 2 (Final)
Grant Probability: 76% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 7m
Grant Probability with Interview: 91%

Examiner Intelligence

Career Allow Rate: 76% (above average; 368 granted / 481 resolved; +14.5% vs TC avg)
Interview Lift: +14.7% (moderate; based on resolved cases with interview)
Typical Timeline: 2y 7m average prosecution; 24 currently pending
Career History: 505 total applications across all art units

Statute-Specific Performance

§101: 5.3% (-34.7% vs TC avg)
§103: 64.1% (+24.1% vs TC avg)
§102: 10.3% (-29.7% vs TC avg)
§112: 11.7% (-28.3% vs TC avg)

TC averages are estimates. Based on career data from 481 resolved cases.
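The headline figures above are simple arithmetic on the career counts shown. The sketch below reproduces them; all variable names are invented for illustration, and nothing here comes from a real API:

```python
# Hypothetical sketch of the arithmetic behind the dashboard's headline
# figures; variable names are illustrative only.

granted, resolved = 368, 481                  # career totals shown above
allow_rate = granted / resolved               # ~0.765, displayed as 76%

interview_lift = 0.147                        # +14.7% lift with interview
with_interview = allow_rate + interview_lift  # ~0.912, displayed as 91%

# Each statute line shows the examiner's figure and its delta vs the Tech
# Center average, so the implied TC average is (figure - delta).
statute = {                                   # statute: (figure, delta vs TC)
    "101": (0.053, -0.347),
    "103": (0.641, +0.241),
    "102": (0.103, -0.297),
    "112": (0.117, -0.283),
}
tc_avg = {s: fig - delta for s, (fig, delta) in statute.items()}

print(f"career allow rate: {allow_rate:.1%}")      # 76.5%
print(f"with interview:    {with_interview:.1%}")  # 91.2%
for s, avg in sorted(tc_avg.items()):
    print(f"§{s} TC average ≈ {avg:.1%}")          # each ≈ 40.0%
```

Running this shows each statute's implied Tech Center average landing near 40%, which is consistent with the deltas being measured against a common TC-wide baseline.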

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This is in response to applicant's amendment/response filed on 01/20/2026, which has been entered and made of record. Claims 1, 11, and 21 have been amended. Claims 20 and 22-30 have already been cancelled. No claim has been added. Claims 1-19 and 21 are pending in the application.

Response to Arguments

Applicant's arguments with respect to claim(s) 1, 8, and 15, and the dependent claims have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Applicant's arguments directed to the amended limitation have been addressed in the detailed rejection below with a new reference by Sekiguchi et al. Applicant submits that "While Iwasaki discloses 'the HMD control part 14 recognizes a size of the AR marker within the display area of the display part 11' (para. [0059]) from which it can be understood that the size of the 'AR marker' is recognized, as understood from the following phrase 'the HMD control part 14 controls the display of the virtual image forming apparatus by the display part 11' (id.), the display area of the HMD displays a virtual image forming apparatus, not an AR marker. . . In other words, the difference between Iwasaki and the presently claimed invention lies in the point that the AR marker described in Iwasaki (corresponding to the image of the 'screen' of an information processing device) is not actually displayed in the display area of the HMD (corresponding to the display area of an image display device)." (Remarks, pp. 16-17). The examiner disagrees with Applicant's premises and conclusion.
Iwasaki teaches recognizing a size of the "AR marker" based on the photographed screen, and further, "based on the recognized size, the HMD control part 14 controls the display of the virtual image forming apparatus by the display part 11 so that the virtual image forming apparatus becomes equal in size to the real machine as viewed from the user" (¶59). Thus the display size of the virtual image forming apparatus is recognized based on the AR marker size. The arguments regarding the dependent claims, by virtue of their dependency, are moot because the independent claims are not allowable.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1-2, 7, 11-12, 17, and 21 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sekiguchi et al. (US 20180366089 A1), and in view of Iwasaki (US 20190212962 A1).

Regarding Claim 1, Sekiguchi discloses An information processing system (¶1 reciting "a display system". Fig. 1 and Fig. 3) comprising: an image display device equipped with a camera; (Fig. 2 showing an HMD 300 equipped with an Imager 310. ¶39 reciting "The HMD 300 includes an imager 310 configured to image up a landscape") and an information processing device configured to operate and display an application that is application software (Fig. 2 showing a display apparatus 200. ¶35 reciting "In FIG. 1, a projecting apparatus 100 projects education content (a world map in the example,) on a screen 930."; and ¶38 reciting "The display apparatus 200 includes a recording unit 210 configured to store a primary information database 510 and a secondary information database 520, a controller 220 configured to perform various kinds of processes such as an output of the primary information and the secondary information, a signal output unit 230 configured to output the primary information to the projecting apparatus 100"), wherein the information processing system has an information communication cooperation function between the image display device and the information processing device (¶38 reciting "The display apparatus 200 includes . . . a communication unit 240 configured to communicate with the HMD 300"), and wherein the image display device is configured to: (¶64 reciting "provided is a head mounted display cooperative display system including . . .") photograph and read a screen of the information processing device by the camera (¶64 reciting "an imager configured to capture a camera image in a direction in which the wearer faces, and an in-camera image staring point detector configured to detect a staring point of the wearer with respect to the camera image"), recognize a display position of the read screen of the information processing device as a display area based on the photographed screen (¶44 reciting "In FIG. 5, a screen 281 is an education content selection screen and displays a plurality of icons including an icon 282 for displaying a world map.
Here, when the icon 282 is selected, a world map 283 is displayed as illustrated in a lower part of FIG. 5. In the world map 283, markers 284 indicated by hatching are displayed at the four corners. This is identification information identifying a display region of the primary information”. Further, Claim 4 reciting “specifying a display region of the primary information in the camera image”), and display an image of the read screen of the information processing device in the recognized display area. (Fig. 5 showing displaying an image of the read screen in the recognized display region 283) However, Sekiguchi does not explicitly disclose recognizes a display size of the read screen of the information processing device as a display area. Iwasaki teaches “a display device” (ABST). More specifically, ¶59 recites “Also based on the captured image data captured by the image capturing part 12, the HMD control part 14 recognizes a size of the AR marker within the display area of the display part 11. Then, based on the recognized size, the HMD control part 14 controls the display of the virtual image forming apparatus by the display part 11 so that the virtual image forming apparatus becomes equal in size to the real machine as viewed from the user.” It would have been obvious to one with ordinary skill, before the effective filing date of the claimed invention, to adapt the method of recognizing a display size of an object in the captured image as a display area (taught by Iwasaki) in the system taught by Sekiguchi and to recognize a display size of the read screen of the information processing device as a display area. The suggestions/motivations would have been that “With this feature, it is possible to confirm the size of the real machine. 
For example, it is possible to confirm whether or not the real machine can be installed at an installation site of a user's desire, or to confirm an atmosphere resulting when the real machine is installed." (¶92), and to apply a known technique to a known device (method, or product) ready for improvement to yield predictable results.

Regarding Claim 2, Sekiguchi in view of Iwasaki discloses The information processing system according to claim 1, wherein the information processing device selects an application screen which is a screen of the application or a partial area of the application screen, (Sekiguchi, Fig. 5 showing an application screen 281, and ¶44 disclosing selecting an application screen 282.) and the image display device photographs and reads the selected application screen or partial area by the camera, recognizes a display position and a display size of the read application screen or partial area as the display area, and displays an application screen image which is an image of the selected application screen or partial area in the recognized display area. (See Claim 1 rejections for detailed analysis. Also see Sekiguchi Fig. 5)

Regarding Claim 7, Sekiguchi in view of Iwasaki discloses The information processing system according to claim 2, wherein the information processing device selects the application screen or the partial area of the application screen, the image display device photographs the selected application screen or partial area by the camera, recognizes a display position and a display size of the photographed application screen or partial area as the display area, and displays an application screen image which is an image of the selected application screen or partial area in the recognized display area, (Sekiguchi, Fig. 5.
See Claims 1 and 2 rejections) the information processing device selects another application screen to be displayed on the image display device or another partial area of another application screen, and the image display device photographs the selected another application screen or the partial area of the another application screen by the camera, recognizes a display position and a display size of the photographed another application screen or partial area of the another application screen as another display area, and displays another application screen image that is an image of the another application screen or the partial area of the another application screen selected by the information processing device in the recognized another display area. (Sekiguchi, Fig. 5)

Regarding Claim 11, Sekiguchi in view of Iwasaki discloses An information processing device (Sekiguchi, Fig. 2, display apparatus 200), comprising: a processor configured to: (Sekiguchi, Fig. 2, controller 220) perform a function of operating and displaying an application that is application software, communicate information with an image display device equipped with a camera, select a screen of the information processing device displayed on the image display device, cause the image display device to recognize a display position and a display size of a screen of the information processing device photographed and read by the camera as a display area based on the photographed screen, supply an image of a screen of the information processing device displayed on the image display device to the image display device, and control the image display device to display the supplied image of the screen of the information processing device in the recognized display area. (See Claim 1 rejections)

Regarding Claim 12, Sekiguchi in view of Iwasaki discloses The information processing device according to claim 11, wherein the information processing device selects an application screen which is a screen of the application or a partial area of the application screen, causes the image display device to recognize a display position and a display size of the application screen or partial area photographed and read by the camera as the display area, supplies an application screen image, which is an image of the application screen or the partial area displayed on the image display device, to the image display device, and controls the image display device to display the application screen image that is the image of the supplied selected application screen or partial area in the recognized display area. (See Claim 2 and 11 rejections for detailed analysis.)

Regarding Claim 17, Sekiguchi in view of Iwasaki discloses The information processing device according to claim 12, wherein the information processing device selects the application screen or the partial area of the application screen to be displayed on the image display device, causes the image display device to recognize a display position and a display size of the application screen or partial area photographed and read by the camera as the display area, supplies an application screen image, which is the image of the application screen or partial area displayed on the image display device, to the image display device, and controls the image display device to display the supplied application screen image in the recognized display area, and further, the information processing device selects another application screen or another partial area of another application screen to be displayed on the image display device, causes the image display device to recognize a display position and a display size of the another application screen or another partial area of the another application screen photographed and read by the camera as another display area, supplies another application screen image, which is the image of the another application screen or partial area of the another application screen displayed on the image display device, to the image display device, and controls the image display device to display the supplied another application screen image in the recognized another display area. (See Claim 7 and 11 rejections for detailed analysis.)

Regarding Claim 21, see Claim 11 rejections for detailed analysis.

Claim(s) 3-5 and 13-15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sekiguchi et al. (US 20180366089 A1), and in view of Iwasaki (US 20190212962 A1), and further in view of Faynshteyn et al. (US 20160055641 A1).

Regarding Claim 3, Sekiguchi in view of Iwasaki discloses The information processing system according to claim 2, the image display device photographs and reads the generated and displayed specific pattern by the camera, recognizes a display position and a display size of the read specific pattern as the display area, and displays an application screen image that is an image of the selected application screen or partial area in the recognized display area. (See Claim 1 rejections) However, Sekiguchi in view of Iwasaki does not explicitly disclose wherein the information processing device selects the application screen or the partial area of the application screen, and generates and displays a specific pattern having a range of the selected application screen or partial area. It is well known in the art to generate and display a translucent pattern on a selected area. In addition, Faynshteyn teaches "As shown in FIG. 8, the rendering unit causes the display unit to display a square selector 801 mapped to the floor 803 of the captured space. The square selector 801 identifies the sample region of floor 803.
In aspects, the square selector 801 is semi-transparent to simultaneously illustrate both its bounding area and the selected pattern, as shown. In alternate embodiments, however, the square selector 801 may be displayed as a transparent region with a defined border (not shown). In further aspects, a viewing window 805 is provided to display the selected area to the user at a location on the display unit, as shown. The rendering unit translates and flattens the pixels of the sample region bounded by the square selector 801 into the viewing window 805, and, in aspects, updates the display in real-time according to the user's repositioning of the selector.” (¶64). It would have been obvious to one with ordinary skill, before the effective filing date of the claimed invention, to modify the system (taught by Sekiguchi in view of Iwasaki) to generate the selected area with translucent pattern (taught by Faynshteyn). The suggestions/motivations would have been to apply a known technique to a known device (method, or product) ready for improvement to yield predictable results. Regarding Claim 4, Sekiguchi in view of Iwasaki and Faynshteyn discloses The information processing system according to claim 3, wherein the information processing device makes the specific pattern to be generated and displayed translucent. (Faynshteyn, ¶64. The suggestions/motivations would have been the same as that of Claim 3 rejections.). Regarding Claim 5, Sekiguchi in view of Iwasaki and Faynshteyn discloses The information processing system according to claim 3, wherein the image display device generates an area frame indicating a range of the specific pattern generated and displayed by the information processing device, and displays the area frame on a display screen of the image display device. (Sekiguchi, Fig. 
5)

Regarding Claim 13, Sekiguchi in view of Iwasaki and Faynshteyn discloses The information processing device according to claim 12, wherein the information processing device selects the application screen or the partial area displayed on the image display device, generates and displays a specific pattern having a range of the selected application screen or partial area, causes the image display device to recognize the specific pattern photographed and read by the camera as the display area, supplies an application screen image, which is an image of the application screen or partial area displayed on the image display device, to the image display device, and controls the image display device to display an application screen image, which is the supplied image of the application screen or partial area, in the recognized display area. (See Claim 3 and 11 rejections for detailed analysis.)

Regarding Claim 14, Sekiguchi in view of Iwasaki and Faynshteyn discloses The information processing device according to claim 13, wherein the information processing device makes the specific pattern to be generated and displayed translucent. (See Claim 4 and 11 rejections for detailed analysis.)

Regarding Claim 15, Sekiguchi in view of Iwasaki and Faynshteyn discloses The information processing device according to claim 13, wherein the information processing device causes the image display device to generate an area frame indicating a range of the specific pattern. (See Claim 5 and 11 rejections for detailed analysis.)

Claim(s) 10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sekiguchi et al. (US 20180366089 A1), and in view of Iwasaki (US 20190212962 A1), and further in view of Cho et al. (US 20150015459 A1).

Regarding Claim 10, Sekiguchi in view of Iwasaki discloses The information processing system according to claim 2.
However, Sekiguchi in view of Iwasaki does not explicitly disclose wherein when the application screen image selected by the information processing device is displayed in the display area of the image display device, the information processing device selects whether to always display or display for each trigger notification, and in a case where the always display is selected, the image display device always displays the application screen image, and in a case where the display for each trigger notification is selected, the image display device starts or stops the display of the application screen image according to trigger notification information of start or stop of display generated by the information processing device. Cho, in Figs. 4 and 5, shows that the display of the notification for the occurrence of the event is triggered by the location information. ¶74 recites "the mobile device 100a can provide a detail notification 20a for the occurred event in the mobile device 100a. Hence, referring to FIG. 4, the mobile device 100a executes an application related to the text message and can provide the content of the received message on the executed application."; and ¶80 recites "Referring to FIG. 5, the HMD 200 can display a notification indicating a message reception in the front-light mode."

It would have been obvious to one with ordinary skill, before the effective filing date of the claimed invention, to modify the system (taught by Sekiguchi in view of Iwasaki) to display a screen according to trigger notification information (taught by Cho). The suggestions/motivations would have been to make it convenient to a user, and to apply a known technique to a known device (method, or product) ready for improvement to yield predictable results.

Claim(s) 6 and 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sekiguchi in view of Iwasaki, and further in view of Watanabe et al. (US 20180164589 A1).
Regarding Claim 6, Sekiguchi in view of Iwasaki discloses The information processing system according to claim 2. However, Sekiguchi in view of Iwasaki does not explicitly disclose wherein the information processing device activates and displays display application switching software that switches a plurality of the applications on a display screen of the information processing device, selects an icon of each application indicated on the activated and displayed display application switching software, and thereby selects the application displayed on the image display device. It is well known in the art that a display device can activate and display application switching software with an icon of each application. In addition, Watanabe teaches in Fig. 7, and recites "At Step S1, a plurality of objects OB (also referred to as icons) for executing predetermined functions when touched are displayed on the smartphone A (such a screen may be referred to as a home screen). The plurality of objects OB include an object OB1. For example, the object OB1 is an image indicating that a web browser can be activated when touched." (¶92)

It would have been obvious to one with ordinary skill, before the effective filing date of the claimed invention, to modify the device (taught by Sekiguchi in view of Iwasaki) to activate and display application switching software (taught by Watanabe). The suggestions/motivations would have been to apply a known technique to a known device (method, or product) ready for improvement to yield predictable results.
Regarding Claim 16, Sekiguchi in view of Iwasaki and Watanabe discloses The information processing device according to claim 12, wherein the information processing device activates and displays display application switching software that switches a plurality of the applications on a display screen of the information processing device, and selects an icon of each application indicated on the activated and displayed display application switching software, and thereby selects the application displayed on the image display device. (See Claim 6 and 11 rejections for detailed analysis.) Claim(s) 8-9 and 18-19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sekiguchi in view of Iwasaki, and further in view of Mugura et al. (US 20200184219 A1). Regarding Claim 8. Sekiguchi in view of Iwasaki discloses The information processing system according to claim 2. However, Sekiguchi in view of Iwasaki does not explicitly disclose wherein the image display device or the information processing device identifies a harmony degree of contrast between the application screen image inserted into the display screen of the image display device and the image of the display screen of the image display device, and the information processing device performs image quality adjustment of the application screen such that the contrast is harmonized when disharmony of the contrast is identified by the identified contrast harmonization degree. Mugura teaches “an information processing apparatus, an information processing method, a program, and a moving body that can appropriately display content on top of a scene viewed by a user.” (¶1). 
More specifically, Mugura teaches identifying a fitting contrast (corresponding to a harmony degree of contrast) and adjusting the image based on the fitting contrast, and recites “The fitting contrast calculation unit 251 calculates fitting contrast of each information superimposition appropriate frame on the basis of the context at the location of the information superimposition appropriate frame that is set as the superimposition location of the content. The fitting contrast is used to adjust the contrast of the content such that the appearance of the content becomes the same as the actual appearance at the location of the information superimposition appropriate frame.” (¶375). It would have been obvious to one with ordinary skill, before the effective filing date of the claimed invention, to modify the device (taught by Sekiguchi in view of Iwasaki) to identify a contrast and to adjust the image to be inserted based on the identified contrast (taught by Mugura). The suggestions/motivations would have been to make the appearance of superimposed content the same as the actual appearance at the surrounding location (¶375), and to apply a known technique to a known device (method, or product) ready for improvement to yield predictable results. Regarding Claim 18, Sekiguchi in view of Iwasaki and Mugura discloses The information processing device according to claim 12, wherein when the information processing device or the image display device identifies disharmony of contrast between the application screen image inserted into the screen of the image display device and the image of the display screen of the image display device, the information processing device adjusts an image quality of the application screen such that the contrast is harmonized. (See Claim 8 and 11 rejections for detailed analysis.) Regarding Claim 9. 
Sekiguchi in view of Iwasaki and Mugura discloses The information processing system according to claim 8, wherein the image display device includes an illuminance sensor configured to detect brightness around a field of view (Mugura, ¶180 reciting “The light state mode setting unit 212 may set the light state mode on the basis of the detection results of the sensors included in the data acquisition unit 102”), and the information processing device adjusts luminance of the application screen according to an output from the illuminance sensor. (Mugura, ¶176 reciting “The image analysis unit 211 analyzes a scene image obtained by imaging. As a result of the analysis of the scene image, . . ., and the brightness (luminance) of the surroundings is detected. In addition, as a result of the analysis of the scene image, the state of sunlight, the state of lighting, the state of atmosphere, and the like are also detected. Information indicating the analysis results of the image analysis unit 211 is supplied to the light state mode setting unit 212 and the object detection unit 213 and is also supplied to the display control unit 203.” Further, ¶177 reciting “The light state mode setting unit 212 sets a light state mode on the basis of information supplied from the image analysis unit 211. For example, one of a “daytime mode,” a “dusk mode,” and a “night mode” is set as the light state mode.” The suggestions/motivations would have been the same as that of Claim 8 rejections.) Regarding Claim 19, Sekiguchi in view of Iwasaki and Mugura discloses The information processing device according to claim 18, wherein the information processing device adjusts luminance of the application screen according to an output from an illuminance sensor of the image display device configured to detect brightness around a field of view. (See Claim 9 and 11 rejections for detailed analysis.) 
Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to YI WANG whose telephone number is (571)272-6022. The examiner can normally be reached 9am - 5pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jason Chan, can be reached at (571)272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /YI WANG/Primary Examiner, Art Unit 2619

Prosecution Timeline

Dec 20, 2023: Application Filed
Oct 31, 2025: Non-Final Rejection — §103
Jan 09, 2026: Examiner Interview Summary
Jan 09, 2026: Applicant Interview (Telephonic)
Jan 20, 2026: Response Filed
Mar 07, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12579758 — DEVICES, METHODS, AND GRAPHICAL USER INTERFACES FOR INTERACTING WITH VIRTUAL OBJECTS USING HAND GESTURES
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12579752 — SYSTEM AND METHOD FOR CREATING AND FURNISHING DIGITAL MODELS OF INDOOR SPACES
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12579708 — CHARACTER DISPLAY METHOD AND APPARATUS, ELECTRONIC DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12573009 — IMAGE PROCESSING METHOD, IMAGE GENERATING METHOD, APPARATUS, DEVICE, AND MEDIUM
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12562084 — AUGMENTED REALITY WINDOW
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 76%
With Interview: 91% (+14.7%)
Median Time to Grant: 2y 7m
PTA Risk: Moderate

Based on 481 resolved cases by this examiner. Grant probability derived from career allow rate.
