Prosecution Insights
Last updated: April 19, 2026
Application No. 19/207,678

PROXIMITY DETECTION ON HUMAN MACHINE INTERFACE

Final Rejection §103
Filed: May 14, 2025
Examiner: ILUYOMADE, IFEDAYO B
Art Unit: 2624
Tech Center: 2600 — Communications
Assignee: VERTIV CORPORATION
OA Round: 2 (Final)
Grant Probability: 74% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 10m
With Interview: 83%

Examiner Intelligence

Career Allow Rate: 74% (464 granted / 630 resolved; +11.7% vs TC avg) — above average
Interview Lift: +9.2% on resolved cases with interview (moderate lift)
Typical Timeline: 2y 10m average prosecution (27 applications currently pending)
Career History: 657 total applications across all art units
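The headline figures above are simple arithmetic on the career counts. A minimal sketch, assuming (as the cards suggest) that the 74% rate is 464 granted of 630 resolved and that the +9.2% interview lift is an additive percentage-point adjustment:

```python
# Reproduce the dashboard's headline numbers from the raw counts shown above.
# Assumption: the interview lift is an additive percentage-point adjustment.
granted, resolved = 464, 630
interview_lift_pp = 9.2

allow_rate = granted / resolved * 100        # career allow rate, in percent
with_interview = allow_rate + interview_lift_pp

print(round(allow_rate))       # 74  (career allow rate)
print(round(with_interview))   # 83  (grant probability with interview)
```

Both rounded results match the cards, which supports reading the 83% as allow rate plus lift rather than an independently modeled figure.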

Statute-Specific Performance

§101: 1.4% (-38.6% vs TC avg)
§103: 56.8% (+16.8% vs TC avg)
§102: 29.7% (-10.3% vs TC avg)
§112: 6.1% (-33.9% vs TC avg)

Compared against Tech Center average estimates • Based on career data from 630 resolved cases
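The "vs TC avg" deltas pin down the baseline the comparisons use. A quick check (hedged: this assumes each delta is simply the examiner's rate minus the Tech Center average) shows every statute's implied baseline works out to the same 40.0% estimate:

```python
# Back out the implied Tech Center average from each statute's rate and delta.
rates  = {"101": 1.4, "103": 56.8, "102": 29.7, "112": 6.1}       # examiner, %
deltas = {"101": -38.6, "103": 16.8, "102": -10.3, "112": -33.9}  # vs TC avg, %

implied_tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(implied_tc_avg)  # {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```

The identical 40.0% across all four statutes suggests the dashboard applies a single Tech Center estimate rather than per-statute baselines.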

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. The amendment filed on 02/05/2026 has been entered. Claims 9-16, 18 and 20 have been canceled. Claims 21-29 have been added. Claims 1-8, 17, 19, and 21-29 are pending.

Response to Arguments

Applicant's arguments filed 02/05/2026 have been fully considered but they are not persuasive. Regarding claims 1-5, 7, 17 and 19, applicant argues on pages 7-9 of the Remarks that Lee and Wang fail to teach or suggest that any data obtained by the proximity sensor can be used to identify a gesture, as recited, and that there is no teaching or suggestion of the display being controlled "based on [a] gesture," as recited.

In response, the examiner respectfully disagrees. Lee describes a plurality of proximity values used to recognize gesture movement in figs. 2-3 and paragraph 98: "If the light quantity data received by the proximity sensor 230 is greater than a predetermined value (i.e., if the proximity sensor 230 receives a larger amount of light quantity data), the proximity sensor 230 can recognize that the target object 300 is in a proximity state indicating that the target object 300 approaches the mobile terminal. Similarly, when light quantity data is less than a predetermined value (i.e., when the proximity sensor 230 receives a smaller amount of light quantity data), the proximity sensor 230 can recognize that the target object 300 is in a separation state indicating that the target object 300 is moving away from the mobile terminal." Paragraph 102 further describes the display being controlled by the recognized gesture: "when the proximity sensor 230 sequentially detects the first proximity, the first separation, the second proximity, and the second separation of the target object 300 approaching or moving away from the proximity sensor 230 within a predetermined time, the display unit 210 and/or the depth camera 2230 can be activated." The examiner asserts that the combination of Lee and Wang describes the above limitations.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-8, 17, 19, 23, 25 and 28 are rejected under 35 U.S.C. 103 as being unpatentable over Lee (US Pub. 20220157082) in view of Wang et al. (US Provisional 63581534 as US Pub. 20250085800).

Regarding claim 1, Lee discloses: a device (at least refer to fig. 1 and paragraph 47, describing the mobile terminal 100), comprising: a human machine interface (HMI) comprising a display (at least refer to fig. 1 and paragraph 52, describing that the display unit may construct a mutual layer structure along with a touch sensor, or may be formed integrally with the touch sensor, such that the display unit can be implemented as a touchscreen; the touchscreen may serve as a user input unit 123 that provides an input interface to be used between the mobile terminal 100); a proximity detector configured to detect a plurality of proximity values of an object, wherein the plurality of proximity values represent movement of the object (at least refer to figs. 2-3 and paragraph 98, as quoted above); and a control unit configured to identify a gesture based on the plurality of proximity values, and to control the display based on the gesture (at least refer to figs. 2-3 and paragraph 102, as quoted above).

Lee does not disclose: an ambient light sensor configured to repeatedly measure an intensity of ambient light. Wang teaches: an ambient light sensor configured to repeatedly measure an intensity of ambient light (at least refer to figs. 1-2 and paragraph 58, describing that, for example, motion sensor data, ambient light sensor data, display touch sensor data, and/or other desired data may be used in determining whether the proximity sensor has been triggered by a user's finger or head; paragraph 57 describes that ambient light sensor measurements may be lower when the device is brought to the user's head (e.g., the ambient light sensor may be at least partially covered, reducing the amount of ambient light that reaches the sensor) than when the user touches the display, and therefore the ambient light sensor measurements may be used in combination with the IMU data to determine whether the display should remain on or be deactivated).

The two references are analogous art because they relate to the same field of invention of electronic devices with sensors. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate an ambient light sensor for measuring light intensity, as taught by Wang, into the display device disclosed by Lee. The motivation to combine the reference of Wang is to measure the amount of light energy on the device.

Regarding claim 17, Lee discloses: a control method of a display of a human machine interface (HMI) (at least refer to figs. 1-2 and paragraphs 100, 52, describing a method for enabling the proximity sensor 230 to sense the proximity of the target object 300 in the situation where the display 210 is in an inactive state; paragraph 52 describes that the touchscreen may serve as a user input unit 123 that provides an input interface to be used between the mobile terminal 100), the control method comprising: detecting a plurality of proximity values of an object, wherein the plurality of proximity values represent movement of the object (at least refer to figs. 2-3 and paragraph 98, as quoted above for claim 1); identifying a gesture based on the plurality of proximity values; and controlling the display based on the gesture (at least refer to figs. 2-3 and paragraph 102, as quoted above for claim 1). Lee does not disclose: measuring an intensity of ambient light in an environment of the HMI. Wang teaches: measuring an intensity of ambient light in an environment of the HMI (at least refer to figs. 1-2 and paragraphs 58, 57, as cited above for claim 1). Regarding the rejection of claim 17, refer to the motivation of claim 1.

Regarding claim 2, Lee discloses: wherein the proximity detector is configured to emit a signal and to detect a reflected signal reflected from the object, and comprises a circuit configured to determine the plurality of proximity values based on at least one of a time of the reflected signal and an intensity of the reflected signal (at least refer to figs. 3-4 and paragraphs 98, 102, describing that the proximity sensor 230 may emit light to the target object 300, and may distinguish the proximity of the target object 300 using light quantity data reflected from the target object 300; see also the quotations of paragraphs 98 and 102 above). Lee does not disclose: an application specific integrated circuit (ASIC) configured to determine the plurality of proximity values based on at least one of a time of the reflected signal and an intensity of the reflected signal. Wang teaches: an application specific integrated circuit (ASIC) (at least refer to fig. 1 and paragraph 19, describing that processing circuitry in controller 16 may be used to control the operation of device 10, and that the processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio chips, or application specific integrated circuits). Regarding the rejection of claim 2, refer to the motivation of claim 1.

Regarding claim 4, Lee does not disclose: wherein the control unit is further configured to deactivate the display based on a predetermined delay having passed after a proximity value which is outside of a proximity value range. Wang teaches this limitation (at least refer to fig. 6 and paragraphs 49, 56, describing that the device (e.g., a controller in the device) may wait until these determinations have been made before proceeding to finger determination 76; paragraph 56 describes that, alternatively, if it is determined that the user is not briefly attempting to interact with the display using their finger, it may be determined that the user is holding the device to their head (e.g., to make a phone call), and the display may be turned off/deactivated, as indicated by box 82). Regarding the rejection of claim 4, refer to the motivation of claim 1.

Regarding claim 5, Lee discloses: wherein the control unit is further configured to control the device to thereby provide an interface for adjusting one or more configurable settings (at least refer to fig. 2 and paragraph 100, describing a method for enabling the proximity sensor 230 to sense the proximity of the target object 300 in a manner that the display 310 switches to the active state and outputs a graphical interface 211; the graphical interface 211 may refer to an interface that connects a motion of the user's hand sensed by the depth camera 230 to a specific function of the mobile terminal).

Regarding claim 6, Lee discloses: wherein the one or more configurable settings include a sensitivity setting (at least refer to fig. 2 and paragraph 100, as cited above for claim 5). Lee does not disclose: a deactivation setting. Wang teaches: a deactivation setting (at least refer to fig. 2 and paragraph 58, describing that the ambient light sensor measurements may be used in combination with the IMU data to determine whether the display should remain on or be deactivated). Regarding the rejection of claim 6, refer to the motivation of claim 1.

Regarding claim 7, Lee discloses: a bezel disposed around an outer periphery of the display, wherein the proximity detector is positioned behind a wall of the bezel (at least refer to fig. 1b and paragraph 61, describing that the proximity sensor 141, the illumination sensor 142, the optical output unit, the first camera 121a, and the first manipulation unit 123a may be disposed at a front surface of a body frame of the mobile terminal 100), and is further configured to determine the plurality of proximity values using infrared light (at least refer to fig. 1b and paragraphs 92-93, describing that after the depth information is calculated, the calculated depth information is synthesized with a photograph taken by the image sensor, resulting in the 3D-based imaging result; to this end, a laser infrared (IR) projector for emitting a laser beam having a specific pattern, an infrared depth sensor, an image sensor, a 3D processor, etc. can be used). Lee does not explicitly disclose: determining the plurality of proximity values using infrared light. Wang teaches this limitation (at least refer to figs. 2-3 and paragraph 32, describing that proximity sensor 30 may emit light 36, such as using a light-emitting diode or other light source, and that light 36 may be infrared light, near-infrared light, or other suitable light). Regarding the rejection of claim 7, refer to the motivation of claim 1.

Regarding claim 8, Lee does not explicitly disclose: wherein the wall of the bezel is comprised of a plastic material transparent to infrared light. Wang teaches this limitation (at least refer to figs. 2-3 and paragraph 32, describing that proximity sensor 30 in device 10 may operate through layer 32; layer 32 may include a transparent layer of device 10, such as a transparent cover layer that overlaps a display, a layer with a transparent opening, and/or one or more display layers, as examples; in some illustrative examples, proximity sensor 30 may operate through a display). Regarding the rejection of claim 8, refer to the motivation of claim 1.

Regarding claim 19, Lee does not explicitly disclose: deactivating the display based on a predetermined delay having passed after a proximity value which is outside of a proximity value range. Wang teaches this limitation (at least refer to fig. 6 and paragraphs 49, 56, as cited above for claim 4). Regarding the rejection of claim 19, refer to the motivation of claim 1.

Regarding claim 23, Lee discloses: wherein the control unit is further configured to activate the display in response to a proximity value being within a proximity value range (at least refer to figs. 2-3 and paragraph 102, as quoted above for claim 1).

Regarding claim 25, Lee discloses: wherein the detecting the plurality of proximity values comprises: emitting a signal and detecting a reflected signal reflected from the object; and determining the plurality of proximity values based on at least one of a time of the reflected signal and an intensity of the reflected signal (at least refer to figs. 3-4 and paragraphs 98, 102, as cited above for claim 2).

Regarding claim 28, Lee discloses: activating the display in response to a proximity value being within a proximity range (at least refer to figs. 2-3 and paragraph 102, as quoted above for claim 1).

Claims 3, 21, 22, 26 and 27 are rejected under 35 U.S.C. 103 as being unpatentable over Lee (US Pub. 20220157082) in view of Wang et al. (US Provisional 63581534 as US Pub. 20250085800), in further view of LÖTHGREN et al. (US Pub. 20180196934).

Regarding claim 3, Lee and Wang do not disclose: wherein the control unit is further configured to adjust a brightness of the display based on the intensity of the ambient light. LÖTHGREN teaches this limitation (at least refer to figs. 4-5C and paragraphs 57, 106, describing that the touch display 430 is turned on with at least reduced, or alternatively normal, brightness and is therefore capable of presenting certain information to the user 1; paragraph 106 describes that the mobile communication terminal 400 uses the ambient light sensor 212 (which is conventionally used for automatic screen brightness adjustment of the touch display 430, 202) as said proximity sensor 420). The three references are analogous art because they relate to the same field of invention of electronic devices with sensors. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate brightness adjustment of the display as taught by LÖTHGREN with the ambient light sensor taught by Wang and the display device disclosed by Lee. The motivation to combine the reference of LÖTHGREN is to provide visual comfort, better visibility, and battery saving, allowing users to adapt the screen to different environments.

Regarding claim 21, Lee discloses: wherein the control unit is configured to control a brightness of the display based on the gesture (at least refer to figs. 2-3 and paragraph 102, as quoted above for claim 1). Lee and Wang do not disclose: controlling a brightness of the display based on the gesture. LÖTHGREN teaches this limitation (at least refer to figs. 5A-5C and paragraphs 54, 57, 66, describing that the touch display 430 is typically turned off, both as regards its brightness and its capability of detecting touch; paragraph 57 describes that the touch display 430 is turned on with at least reduced, or alternatively normal, brightness and is therefore capable of presenting certain information to the user 1; paragraph 66 describes that the controller 410 may detect an activation of the mobile communication terminal 400 and in response cause a switch from the inactive mode 610 into the lock screen mode 620, where the activation may for instance be an actuation of the power button 224 or the home key 218). Regarding the rejection of claim 21, refer to the motivation of claim 3.

Regarding claim 22, Lee and Wang do not disclose: wherein the gesture comprises a swipe of a user's hand. LÖTHGREN teaches this limitation (at least refer to fig. 6 and paragraph 69, describing that the user command may, for instance, be an unlock command resulting from actuation of the unlock control 540, and thus pertain to, for instance, a swipe motion pattern on the touch display 430). Regarding the rejection of claim 22, refer to the motivation of claim 3.

Regarding claim 26, Lee discloses: wherein the controlling the display comprises controlling a brightness of the display based on the gesture (at least refer to figs. 2-3 and paragraph 102, as quoted above for claim 1). Lee and Wang do not disclose: controlling a brightness of the display based on the gesture. LÖTHGREN teaches this limitation (at least refer to figs. 5A-5C and paragraphs 54, 57, 66, as cited above for claim 21). Regarding the rejection of claim 26, refer to the motivation of claim 3.

Regarding claim 27, Lee and Wang do not disclose: wherein the gesture comprises a swipe of a user's hand. LÖTHGREN teaches this limitation (at least refer to fig. 6 and paragraph 69, as cited above for claim 22). Regarding the rejection of claim 27, refer to the motivation of claim 3.

Allowable Subject Matter

Claims 24 and 29 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to IFEDAYO B ILUYOMADE, whose telephone number is (571) 270-7118. The examiner can normally be reached Monday-Friday. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Matthew Eason, can be reached at 571-270-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/IFEDAYO B ILUYOMADE/
Primary Examiner, Art Unit 2624
03/16/2026

Prosecution Timeline

May 14, 2025 — Application Filed
Dec 20, 2025 — Non-Final Rejection, §103
Feb 05, 2026 — Response Filed
Mar 17, 2026 — Final Rejection, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596432 — Methods Circuits Devices Systems Applications and Functionally Associated Machine Executable Code for Digital Device Display Adjustment
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12586513 — DISPLAY DEVICE
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12561026 — DISPLAY DEVICE
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12562095 — DISPLAY DEVICE
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12554129 — EYE IMAGING IN HEAD WORN COMPUTING
Granted Feb 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 74%
With Interview: 83% (+9.2%)
Median Time to Grant: 2y 10m
PTA Risk: Moderate
Based on 630 resolved cases by this examiner. Grant probability derived from career allow rate.
