Prosecution Insights
Last updated: April 19, 2026
Application No. 18/424,060

ELECTRONIC DEVICE FOR DISPLAYING IMAGE AND OPERATION METHOD THEREOF

Status: Final Rejection (§102)
Filed: Jan 26, 2024
Examiner: PATEL, SHIVANG I
Art Unit: 2615
Tech Center: 2600 — Communications
Assignee: Samsung Electronics Co., Ltd.
OA Round: 2 (Final)
Grant Probability: 74% (Favorable)
Expected OA Rounds: 3-4
Median Time to Grant: 2y 4m
With Interview: 93%

Examiner Intelligence

Career Allow Rate: 74%, above average (309 granted / 415 resolved; +12.5% vs TC avg)
Interview Lift: +18.5% for resolved cases with interview
Avg Prosecution: 2y 4m (22 applications currently pending)
Career History: 437 total applications across all art units

Statute-Specific Performance

§101: 10.3% (-29.7% vs TC avg)
§103: 57.8% (+17.8% vs TC avg)
§102: 16.7% (-23.3% vs TC avg)
§112: 13.5% (-26.5% vs TC avg)

Based on career data from 415 resolved cases; deltas are measured against the Tech Center average estimate.
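The headline figures above follow from simple arithmetic on the examiner's case counts. The sketch below is a hypothetical reconstruction, not the tool's actual implementation; the variable names and the treatment of the Tech Center deltas as simple differences are assumptions.

```python
# Hypothetical sketch of how the examiner statistics above could be derived
# from raw counts. Field names and the delta convention are assumptions.

granted = 309    # applications allowed by this examiner
resolved = 415   # allowed + abandoned (pending cases excluded)

allow_rate = granted / resolved    # ~0.745, shown on the dashboard as 74%

# Statute-specific allowance rates and their reported deltas vs the
# Tech Center average (delta = examiner rate - TC average).
statute_rates = {"101": 0.103, "103": 0.578, "102": 0.167, "112": 0.135}
deltas = {"101": -0.297, "103": 0.178, "102": -0.233, "112": -0.265}
tc_avg = {s: statute_rates[s] - d for s, d in deltas.items()}

print(f"career allow rate: {allow_rate:.0%}")
for s, r in statute_rates.items():
    print(f"§{s}: {r:.1%} ({r - tc_avg[s]:+.1%} vs TC avg)")
```

Note that 309/415 is 74.5%, so the dashboard's 74% is the truncated or rounded display value of the underlying ratio.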

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments, see page 11, filed 12/23/2025, with respect to the title objection have been fully considered and are persuasive. Applicant has amended the title to be more descriptive. The title objection has been withdrawn.

Applicant's arguments, see pages 11-14, filed 12/23/2025, with respect to the 35 USC §102(a)(2) rejection of claims 1-20 have been fully considered but were not persuasive. Applicant has amended the claims and argues that the previously cited references fail to disclose the amended claim limitations. Applicant argues "receive a third image associated with the second image through functions to which power is not blocked by the first processor, in the second mode in which the first processor is in sleep state where power is blocked to at least any one of a plurality of function blocks to be executed by the first processor." Applicant points to paragraphs [0102]-[0103] of the instant application specification as support for the amended claim language. In response, the examiner points to paragraph [0027] of Jeon, which discloses a main processor and an auxiliary processor that operate independently, the auxiliary processor using lower power for a designated function. The examiner also points to paragraph [0096] of Jeon, which discloses a power management module that may adjust power to the processor. Jeon's lower-power mode for designated functions and power management teaches the argued claim limitations. The 35 USC §102(a)(2) rejection of claims 1-20 is maintained.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention. Claim(s) 1-20 is/are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Jeon et al (US 20190069244 A1). Regarding claim 1, Jeon discloses an electronic device ([0026] electronic device), comprising: a display ([0026] display device 160); at least one sensor module ([0026] sensor module 176); a time module configured to receive time information ([0063] time-related data); memory ([0026] memory 130); a first processor ([0027] main processor); a second processor ([0027] auxiliary processor); and a sensor hub configured to transmit sensor data obtained by means of the sensor module in a second mode to at least any one of the first processor and the second processor ([0027] the processor 120 may include a main processor 121 (e.g., central processing unit or application processor) and an auxiliary processor 123 (e.g., a graphics processing device, an image signal processor, a sensor hub processor, or a communications processor) that is operated independently of the main processor 121 and additionally or alternatively uses lower power than the main processor 121 or is specialized for a designated function.), wherein the memory store one or more computer 
programs including computer-executable instructions that, when executed by the first processor ([0027] general purpose computer accesses code for implementing the processing shown herein, the execution of the code transforms the general purpose computer into a special purpose computer for executing the processing), cause the electronic device to generate and output a first image in a first mode ([0061] state of being turned on (or powered on) in a first mode (or normal mode)), and wherein the memory store the one or more computer programs including computer-executable instructions that, when executed by the second processor, cause the electronic device to: generate and output a second image in the second mode running at lower power than the first mode ([0061] a state of being turned off (or powered off) in a second mode (or watch only mode).), receive a third image associated with the second image through functions to which power is not blocked by the first processor, in the second mode in which the first processor is in a sleep state where power is blocked to at least any one of a plurality of function blocks to be executed by the first processor ([0068] a sleep mode or an inactive/deactivated state, the component in question may be in a state where it cannot perform at least some of the functions that it can perform in normal mode or active state., [0027] the processor 120 may include a main processor 121 (e.g., central processing unit or application processor) and an auxiliary processor 123 (e.g., a graphics processing device, an image signal processor, a sensor hub processor, or a communications processor) that is operated independently of the main processor 121 and additionally or alternatively uses lower power than the main processor 121 or is specialized for a designated function), blend at least any one of the sensor data received from the sensor module and the time information received through the time module with the third image to generate the second 
image ([0086] display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data and the time-related data.), and display the second image on the display ([0086] display 360 may display the watch face image). Regarding claim 2, Jeon discloses wherein the third image is associated with at least a portion of the first image received from the first processor ([0086] display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data and the time-related data.) Regarding claim 3, Jeon discloses wherein the one or more computer programs further comprise computer-executable instructions that, when executed by the second processor, cause the electronic device to select and display at least any one of a plurality of image objects corresponding to the time information on the display, upon the second mode ([0084] the watch face image may include a clock face image(e.g., face and at least one of letters, numbers, symbols, images, etc., formed on the face)), and wherein the plurality of image objects comprise at least any one of a plurality of hour image objects, a plurality of minute image objects, and a plurality of second image objects ([0084] at least one indicator image, such as an hour hand image, a minute hand image, and/or a second hand image.) Regarding claim 4, Jeon discloses select and update any one of the plurality of minute image objects to the display every first period, in the second mode ([0088] depending on the signal/command/information (e.g., a wake up signal, a sleep signal, information associated with the first motion, information associated with the second motion, DDI ON command, or DDI OFF command) from the sensor module 376 or depending on the expiration of the timer.) 
and select and update any one of the plurality of hour image objects to the display every second period different from the first period ([0084] the watch face image may include a clock face image(e.g., face and at least one of letters, numbers, symbols, images, etc., formed on the face) and at least one indicator image, such as an hour hand image, a minute hand image, and/or a second hand image.). Regarding claim 5, Jeon discloses wherein the at least one sensor module senses at least any one of exercise information or biometric information of a user of the electronic device ([0086] the biometric information-related data from the sensor module), and wherein the one or more computer programs further comprise computer-executable instructions that, when executed by the second processor, cause the electronic device to blend and display a sensor image generated based on the sensed at least any one of exercise information or biometric information of the user, the time information, and a background image corresponding to the third image on the display in the second mode ([0088] in the second mode, the display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data, the time-related data, and/or the biometric information-related data which are stored in the internal memory of the display, depending on the signal/command/information (e.g., a wake up signal, a sleep signal, information associated with the first motion). 
Regarding claim 6, Jeon discloses wherein the one or more computer programs further comprise computer-executable instructions that, when executed by the second processor cause the electronic device to: select and update a sensor image corresponding to the exercise information and the biometric information of the user to the display every specified period, in the second mode ([0093] biometric signal may include an electric signal (e.g., an electrocardiogram signal, a pulse wave signal, etc.) output from the biosensor, and the biometric information may include at least one of user's identification information, physical information, emotion information, health information, disease information, exercise information, activity information, stress information, and sleep information) Regarding claim 7, Jeon discloses wherein the one or more computer programs further comprise computer-executable instructions that, when executed by the second processor, cause the electronic device to blend and display at least any one of the sensor data and the time information on a background image corresponding to the third image on the display ([0086] display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data and the time-related data.). 
Regarding claim 8, Jeon discloses wherein the second processor includes: a meta parser configured to extract draw information for displaying the second image from pieces of meta data delivered from at least any one of the first processor and the sensor module ([0092] sensor module 376 may transmit information associated with the motion to the processor 320 in the first mode, and may transmit the information associated with the motion to the display 360 in the second mode.), and a data composer configured to generate the second image based on the extracted draw information ([0260] a screen (or graphical user interface (GUI)) including at least one graphical element 1410 (e.g., text, image, video, or a combination of some/all thereof) in a first mode.), and wherein the meta data comprises a watch position at which the time information will be displayed, a position of the sensor data, and the third image associated with the first image ([0261] The home screen may include the at least one graphical element 1410 (e.g., a shortcut icon for executing frequently used applications, time, weather, and biometric information).). Regarding claim 9, Jeon discloses wherein the sensor hub comprises: a sensor service configured to process the sensor data received from the sensor module ([0064] The processor 320 may transmit, to the display 360, the image data, the time-related data, and/or the biometric information-related data based on the information associated with the first motion (or in response to the reception of the information associated with the first motion)); and an inter-process communication (IPC) driver configured to deliver the sensor data processed from the sensor service to the second processor ([0066] the processor 320 may receive information associated with a second motion from the sensor module). 
Regarding claim 10, Jeon discloses wherein the first mode is an active mode ([0061] state of being turned on (or powered on) in a first mode (or normal mode)), and wherein the second mode is a low power display mode ([0061] a state of being turned off (or powered off) in a second mode (or watch only mode).), Regarding claim 11, Jeon discloses an electronic device ([0026] electronic device), comprising: a display ([0026] display device 160); at least one sensor module configured to sense activity information of a user who uses the electronic device ([0026] sensor module 176); memory ([0026] memory 130); a first processor ([0027] main processor); a second processor ([0027] auxiliary processor); and a sensor hub configured to transmit sensor data obtained by means of the sensor module to the first processor in a first mode and transmit the sensor data to at least any one of the first processor and the second processor in a second mode ([0027] the processor 120 may include a main processor 121 (e.g., central processing unit or application processor) and an auxiliary processor 123 (e.g., a graphics processing device, an image signal processor, a sensor hub processor, or a communications processor) that is operated independently of the main processor 121 and additionally or alternatively uses lower power than the main processor 121 or is specialized for a designated function.), wherein the memory store one or more computer programs including computer-executable instructions that, when executed by the first processor ([0027] general purpose computer accesses code for implementing the processing shown herein, the execution of the code transforms the general purpose computer into a special purpose computer for executing the processing), cause the electronic device to generate and output a first image in the first mode ([0061] state of being turned on (or powered on) in a first mode (or normal mode)), and wherein the memory store the one or more computer programs including 
computer-executable instructions that, when executed by the second processor cause the electronic device to: generate and output a second image in the second mode running at lower power than the first mode ([0061] a state of being turned off (or powered off) in a second mode (or watch only mode).), receive a third image associated with the second image through functions to which power is not blocked by the first processor, in the second mode in which the first processor is in a sleep state where power is blocked to at least any one of a plurality of function blocks to be executed by the first processor ([0068] a sleep mode or an inactive/deactivated state, the component in question may be in a state where it cannot perform at least some of the functions that it can perform in normal mode or active state., [0027] the processor 120 may include a main processor 121 (e.g., central processing unit or application processor) and an auxiliary processor 123 (e.g., a graphics processing device, an image signal processor, a sensor hub processor, or a communications processor) that is operated independently of the main processor 121 and additionally or alternatively uses lower power than the main processor 121 or is specialized for a designated function), blend the received third image with the sensor data received from the sensor module to generate the second image ([0086] display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data and the time-related data.), and display the second image on the display ([0086] display 360 may display the watch face image). 
Regarding claim 12, Jeon discloses a time module configured to receive time information, wherein the at least one sensor module senses at least any one of exercise information and biometric information of the user of the electronic device ([0086] the biometric information-related data from the sensor module), and wherein the one or more computer programs further comprise computer-executable instructions that, when executed by the second processor, cause the electronic device to: upon the second mode, update at least any one of the sensor data generated based on the sensed information of the user and the time information every certain period ([0088] in the second mode, the display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data, the time-related data, and/or the biometric information-related data which are stored in the internal memory of the display, depending on the signal/command/information (e.g., a wake up signal, a sleep signal, information associated with the first motion)., and blend and display the updated at least any one of the sensor data and the third image on the display ([0086] display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data and the time-related data.). 
Regarding claim 13, Jeon discloses wherein the second processor comprises: a meta parser configured to extract draw information for displaying the second image from pieces of meta data delivered from at least any one of the first processor and the sensor module ([0092] sensor module 376 may transmit information associated with the motion to the processor 320 in the first mode, and may transmit the information associated with the motion to the display 360 in the second mode.), and a data composer configured to blend at least any one of a sensor image and a watch image on a background image based on the extracted draw information to generate the second image ([0260] a screen (or graphical user interface (GUI)) including at least one graphical element 1410 (e.g., text, image, video, or a combination of some/all thereof) in a first mode.), and wherein the meta data comprises a watch position at which the time information will be displayed, a position of the sensor data, and the third image associated with the first image ([0261] The home screen may include the at least one graphical element 1410 (e.g., a shortcut icon for executing frequently used applications, time, weather, and biometric information).). Regarding claim 14, Jeon discloses wherein the sensor hub comprises: a sensor service configured to process the sensor data received from the sensor module ([0064] The processor 320 may transmit, to the display 360, the image data, the time-related data, and/or the biometric information-related data based on the information associated with the first motion (or in response to the reception of the information associated with the first motion)); and an inter-process communication (IPC) driver configured to deliver the sensor data processed from the sensor service to the second processor ([0066] the processor 320 may receive information associated with a second motion from the sensor module). 
Regarding claim 15, Jeon discloses wherein the first mode is an active mode ([0061] state of being turned on (or powered on) in a first mode (or normal mode)), and wherein the second mode is a low power display mode ([0061] a state of being turned off (or powered off) in a second mode (or watch only mode).), Regarding claim 16, Jeon discloses a method at an electronic device ([0026] electronic device), the method comprising: generating and outputting, by a first processor of the electronic device, a first image in a first mode ([0061] state of being turned on (or powered on) in a first mode (or normal mode)); generating and outputting, by a second processor of the electronic device, a second image, in a second mode running at lower power than the first mode ([0061] a state of being turned off (or powered off) in a second mode (or watch only mode).), receiving, by a time module, time information ([0063] time-related data); sensing, by at least one sensor module, activity information of a user who uses the electronic device, in the first mode and the second mode ([0086] the biometric information-related data from the sensor module),; transmitting, by a sensor hub, the activity information of the user, the activity information being sensed in the first mode, to the first processor and transmitting, by the sensor hub, the activity information of the user, the activity information being sensed in the second mode, to the second processor ([0088] in the second mode, the display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data, the time-related data, and/or the biometric information-related data which are stored in the internal memory of the display, depending on the signal/command/information (e.g., a wake up signal, a sleep signal, information associated with the first motion).; receive a third image associated with the second image through functions to which power is not blocked by the first 
processor, in the second mode in which the first processor is in a sleep state where power is blocked to at least any one of a plurality of function blocks to be executed by the first processor ([0068] a sleep mode or an inactive/deactivated state, the component in question may be in a state where it cannot perform at least some of the functions that it can perform in normal mode or active state., [0027] the processor 120 may include a main processor 121 (e.g., central processing unit or application processor) and an auxiliary processor 123 (e.g., a graphics processing device, an image signal processor, a sensor hub processor, or a communications processor) that is operated independently of the main processor 121 and additionally or alternatively uses lower power than the main processor 121 or is specialized for a designated function), blending, by the second processor, at least any one of sensor data received from the sensor module and the time information received through the time module with the third image to generate the second image ([0086] display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data and the time-related data.); and displaying, by the second processor, the second image on a display ([0086] display 360 may display the watch face image). 
Regarding claim 17, Jeon discloses wherein the at least one sensor module senses at least any one of exercise information and biometric information of the user of the electronic device ([0086] the biometric information-related data from the sensor module), and wherein the method further comprises: upon the second mode, blending and displaying, by the second processor, the sensor data generated based on the sensed information of the user, watch information, and the third image on the display ([0088] in the second mode, the display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data, the time-related data, and/or the biometric information-related data which are stored in the internal memory of the display, depending on the signal/command/information (e.g., a wake up signal, a sleep signal, information associated with the first motion). Regarding claim 18, Jeon discloses wherein the generating and outputting of the second image in the second mode comprises: selecting and updating a sensor image corresponding to at least any one of exercise information and biometric information of the user to the display every certain period ([0088] in the second mode, the display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data, the time-related data, and/or the biometric information-related data which are stored in the internal memory of the display, depending on the signal/command/information (e.g., a wake up signal, a sleep signal, information associated with the first motion).
Regarding claim 19, Jeon discloses wherein the generating and outputting of the second image in the second mode comprises: extracting draw information for displaying the second image from pieces of meta data delivered from at least any one of the first processor and the sensor module ([0092] sensor module 376 may transmit information associated with the motion to the processor 320 in the first mode, and may transmit the information associated with the motion to the display 360 in the second mode.),; and blending at least any one of the sensor data and the time information on the third image based on the extracted draw information to generate the second image ([0260] a screen (or graphical user interface (GUI)) including at least one graphical element 1410 (e.g., text, image, video, or a combination of some/all thereof) in a first mode.), and wherein the meta data comprises a watch position at which the time information will be displayed, a position of the sensor data, and the third image associated with the first image ([0261] The home screen may include the at least one graphical element 1410 (e.g., a shortcut icon for executing frequently used applications, time, weather, and biometric information).). 
Regarding claim 20, Jeon discloses one or more non-transitory computer-readable storage media storing one or more computer programs including computer-executable instructions that, when executed by one or more processors, cause an electronic device to perform operations ([0274] implemented in hardware, firmware or via the execution of software or computer code that can be stored in a recording medium such as a CD ROM, a Digital Versatile Disc (DVD), a magnetic tape, a RAM, a floppy disk, a hard disk, or a magneto-optical disk or computer code downloaded over a network originally stored on a remote recording medium or a non-transitory machine readable medium and to be stored on a local recording medium), the operations comprising: generating and outputting, by a first processor of the one or more processors a first image in a first mode ([0061] state of being turned on (or powered on) in a first mode (or normal mode)); generating and outputting, by a second processor of the one or more processors, a second image, in a second mode running at lower power than the first mode ([0061] a state of being turned off (or powered off) in a second mode (or watch only mode).), receiving, by a time module, time information ([0063] time-related data); sensing, by at least one sensor module, activity information of a user who uses the electronic device, in the first mode and the second mode ([0086] the biometric information-related data from the sensor module),; transmitting, by a sensor hub, the activity information of the user, the activity information being sensed in the first mode, to the first processor and transmitting, by the sensor hub, the activity information of the user, the activity information being sensed in the second mode, to the second processor ([0088] in the second mode, the display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data, the time-related data, and/or the biometric 
information-related data which are stored in the internal memory of the display, depending on the signal/command/information (e.g., a wake up signal, a sleep signal, information associated with the first motion).; receive a third image associated with the second image through functions to which power is not blocked by the first processor, in the second mode in which the first processor is in a sleep state where power is blocked to at least any one of a plurality of function blocks to be executed by the first processor ([0068] a sleep mode or an inactive/deactivated state, the component in question may be in a state where it cannot perform at least some of the functions that it can perform in normal mode or active state., [0027] the processor 120 may include a main processor 121 (e.g., central processing unit or application processor) and an auxiliary processor 123 (e.g., a graphics processing device, an image signal processor, a sensor hub processor, or a communications processor) that is operated independently of the main processor 121 and additionally or alternatively uses lower power than the main processor 121 or is specialized for a designated function), and blending, by the second processor, at least any one of sensor data received from the sensor module and the time information received through the time module with the third image to generate the second image ([0086] display 360 may display the watch face image and/or the biometric information-related data based on at least a portion of the image data and the time-related data.); and displaying, by the second processor, the second image on a display ([0086] display 360 may display the watch face image). Conclusion Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). 
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHIVANG I PATEL whose telephone number is (571)272-8964. The examiner can normally be reached M-F, 9 am-5 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Alicia Harrington, can be reached at 571-272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /SHIVANG I PATEL/Primary Examiner, Art Unit 2615

Prosecution Timeline

Jan 26, 2024
Application Filed
Sep 24, 2025
Non-Final Rejection — §102
Nov 14, 2025
Interview Requested
Nov 25, 2025
Examiner Interview Summary
Nov 25, 2025
Applicant Interview (Telephonic)
Dec 23, 2025
Response Filed
Mar 06, 2026
Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602847: SYSTEMS AND METHODS FOR LAYERED IMAGE GENERATION (granted Apr 14, 2026; 2y 5m to grant)
Patent 12599838: APPARATUS AND METHODS FOR RECORDING AND REPORTING ABUSIVE ONLINE INTERACTIONS (granted Apr 14, 2026; 2y 5m to grant)
Patent 12592004: IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD (granted Mar 31, 2026; 2y 5m to grant)
Patent 12591947: DISTORTION-BASED IMAGE RENDERING (granted Mar 31, 2026; 2y 5m to grant)
Patent 12584296: Work Machine Display Control System, Work Machine Display System, Work Machine, Work Machine Display Control Method, And Work Machine Display Control Program (granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 74%
With Interview: 93% (+18.5%)
Median Time to Grant: 2y 4m
PTA Risk: Moderate

Based on 415 resolved cases by this examiner; grant probability is derived from the career allow rate.
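The interview-adjusted figure is consistent with treating the "+18.5% interview lift" as an additive percentage-point adjustment to the career allow rate. The sketch below makes that assumption explicit; the tool's actual projection model is not published, so this is an illustrative reconstruction only.

```python
# Hypothetical reconstruction of the interview-adjusted grant projection.
# Assumption: the "+18.5% interview lift" is added to the career allow
# rate in percentage points, capped at 100%.

base_probability = 309 / 415   # career allow rate, ~74%
interview_lift = 0.185         # lift reported for resolved cases with interview

with_interview = min(base_probability + interview_lift, 1.0)

print(f"without interview: {base_probability:.0%}")
print(f"with interview:    {with_interview:.0%}")   # ~93%
```

Under this assumption, 74.5% + 18.5 points gives roughly 93%, matching the displayed value; a multiplicative or regression-based model would produce a different intermediate number.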
