Prosecution Insights
Last updated: April 19, 2026
Application No. 17/720,799

METHOD FOR MAINTAINING SYSTEMS, IN PARTICULAR MACHINES IN WAREHOUSES

Status: Final Rejection — §103
Filed: Apr 14, 2022
Examiner: CAIN, AARON G
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Dematic GmbH
OA Round: 6 (Final)

Grant Probability: 40% (Moderate)
Expected OA Rounds: 7-8
Time to Grant: 3y 3m
With Interview: 66%

Examiner Intelligence

Career Allow Rate: 40% (grants 40% of resolved cases; 52 granted / 130 resolved; -12.0% vs TC avg)
Interview Lift: strong, +26.1% for resolved cases with an interview
Typical Timeline: 3y 3m average prosecution; 42 applications currently pending
Career History: 172 total applications across all art units
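The headline figures above are simple arithmetic on the raw counts. A minimal sketch of that arithmetic (the with-interview allow rate of 66.1% is taken as given from the dashboard's "66% With Interview" readout; it is not derivable from the counts shown here):

```python
# Reproduce the dashboard's examiner statistics from the raw counts above.
granted = 52
resolved = 130

# Career allow rate: granted cases as a share of all resolved cases.
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")

# Interview lift: the with-interview allow rate minus the career baseline,
# in percentage points. 0.661 is assumed from the dashboard readout.
with_interview_rate = 0.661
lift_pts = (with_interview_rate - allow_rate) * 100
print(f"Interview lift: +{lift_pts:.1f} pts")
```

This matches the page's 40% baseline and +26.1% lift, confirming the lift is reported in percentage points over the career rate rather than as a relative increase.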

Statute-Specific Performance

§101: 4.3% (-35.7% vs TC avg)
§103: 57.4% (+17.4% vs TC avg)
§102: 19.7% (-20.3% vs TC avg)
§112: 17.7% (-22.3% vs TC avg)

Tech Center averages are estimates. Based on career data from 130 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/30/2025 has been entered.

Response to Arguments

Applicant's arguments, see pages 6-12, filed 12/30/2025, regarding the rejection of claims 1, 3, 8-10, 12, and 16-20 under 35 U.S.C. 103 over Jöhnssen et al. EP 3534593 A1 ("Jöhnssen") in combination with Jackson US 20200019237 A1 ("Jackson") have been fully considered, but they are not persuasive.

First, applicant argues that Jöhnssen does not teach "the central controller to compare an optical fingerprint stored in the central controller with an instantaneous image of the shape of the respective system, and wherein the optical fingerprint comprises a stored image of the shape of the respective system itself." However, Jöhnssen is not relied upon to teach the language regarding "the shape of" the respective system. Instead, Jackson is relied upon to teach the concept of a controller capturing the image of the shape of the respective target. In the case of Jackson, the image is of a target object. A system that relies on a captured image of the shape of the respective system instead of a QR code is a simple modification of known technology to produce predictable results. As to the claim language itself, applicant appears to have misunderstood the rejection. The examiner is not rewriting the claim language by stating that Jackson teaches "a stored image of the shape of the target object".
The bolded language of the rejection refers to what Jackson teaches. While Jackson does not directly teach "a stored image of the shape of the system", as claimed in claim 1, Jöhnssen teaches the concept of an image of the system, at least in broader terms; it simply does not explicitly teach an image of the shape of the system. It is the combination of references together that teaches the claimed invention. One cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).

Second, applicant argues on pages 9-11 that Jackson is non-analogous art. However, Jackson describes an invention in the field of robotic systems identifying objects with cameras, which is clearly related to the subject matter of both Jöhnssen and the claimed invention, and is therefore analogous art. Applicant even acknowledges on page 11 that Jackson teaches that the user 100 is remote [from] robot 108, which further demonstrates in the cited language how the image of the shape of the target object is obtained via the camera of a mobile computer, such as the smartphone in FIG. 4B of Jackson. Of course, Jöhnssen also teaches that the image of the respective system is obtained via the camera of the mobile computer, as discussed in further detail below.

Finally, Jöhnssen's recitation of capturing a QR code very clearly covers the concept of ensuring the service technician has visual contact with the corresponding system before allowing the technician to take over control of the respective system. The gaze-tracking device of Jackson does not need to teach the elements already covered by Jöhnssen to maintain the rejection.
For this reason, the rejection of claims 1 and 10 is maintained; likewise, the rejection of claim 18 is maintained, as are the rejections of dependent claims 3, 8-9, 12, 16-17, and 19-20.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1, 9-10, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Jöhnssen et al. EP 3534593 A1 ("Jöhnssen") in combination with Jackson US 20200019237 A1 ("Jackson").

Regarding Claim 1.
Jöhnssen teaches a method for maintaining, commissioning and checking systems in warehouses (paragraph 3), wherein a service technician has a view of a respective system and makes wireless contact with a controller of the respective system via a mobile computer in order to take over control thereof (paragraph 6), said method comprising: taking over of control of the respective system by the mobile computer is permitted by a central controller only if the service technician can have visual contact with the corresponding system (paragraph 6), for which purpose a determination of the position and/or the orientation of the mobile computer of the service technician with respect to the respective system is effected via optical recognition (paragraph 7, the QR code of the system itself scanned by the portable device) of a fingerprint of the system (paragraphs 8 and 11; paragraph 16 describes an acoustic code, which reads on an acoustic fingerprint (pages 3 and 4 of Jöhnssen)); wherein the determination of the position and/or orientation of the mobile computer of the service technician via optical recognition of the system is effected by means of a camera of the mobile computer which enables the central controller to compare an optical fingerprint stored in the central controller with an instantaneous image of the respective system obtained via the camera of the mobile computer (paragraph 7, the QR code of the system itself scanned by the portable device; for example, a plant operator with his smartphone would be able to scan a QR code displayed on the panel of the stationary operating device), and wherein the optical fingerprint comprises a stored image of the respective system itself (paragraph 7, the QR code located on a panel of the system itself scanned by the portable device; note that an image of a QR code on the machine reads on a stored image of the respective system itself, as the claim language does not say it has to be a stored image of the entire respective system; any image of a part or whole of the system can read on the claim language).

Jöhnssen does not teach: wherein the central controller compares an optical fingerprint stored in the central controller with an instantaneous image of the shape of a target object, and wherein the stored optical fingerprint comprises a stored image of the shape of the target object.

However, Jackson teaches: wherein the central controller compares an optical fingerprint stored in the central controller with an instantaneous image of the shape of a target object, and wherein the stored optical fingerprint comprises a stored image of the shape of the target object (FIGS. 4A-4B depict a robot scanning objects, a user gazing at an exemplary virtual representation of an object, and a side view of the user selecting the detected object [paragraphs 12-14]. In the case of FIGS. 4B and 4C, an object is shown at 416, wherein the outline of the object is stored along with the image of the table, which also has a visible outline captured for the shape of the table [paragraphs 49-50]).

It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the invention of Jöhnssen such that the central controller compares an optical fingerprint stored in the central controller with an instantaneous image of the shape of a target object, and such that the stored optical fingerprint comprises a stored image of the shape of the target object, as taught by Jackson, because it would have been obvious to try. Having the optical fingerprint be more than a QR code and instead be a full image of the target system would be an obvious modification of known elements in the art to produce a predictable result.

Regarding Claim 9.
Jöhnssen in combination with Jackson teaches the method as claimed in claim 1. Jöhnssen also teaches: wherein the systems in question come from the field of conveying technology, storage technology, picking technology and/or sorting technology (paragraphs 1-3).

Regarding Claim 10.

Jöhnssen teaches a method for maintaining, commissioning and checking systems in warehouses (paragraph 3), wherein a service technician has a view of a respective system and makes wireless contact with a controller of the respective system via a mobile computer in order to take over control thereof (paragraph 6), said method comprising: determining the position and/or the orientation of the mobile computer of the service technician (paragraph 6) with respect to the respective system via optical recognition (paragraph 7, the QR code of the system itself scanned by the portable device) and/or acoustic recognition of a characteristic of the respective system (paragraphs 8 and 11; paragraph 16 describes an acoustic code, which reads on an acoustic fingerprint (pages 3 and 4 of Jöhnssen)); and taking over of control of the respective system by the mobile computer is permitted by a central controller only if the service technician can have visual contact with the corresponding system based on said determining the position and/or orientation of the mobile computer (paragraph 6); and wherein said optical recognition comprises optically recognizing the characteristic of the respective system via a camera of the mobile computer which enables the central controller to compare an optical fingerprint stored in the central controller with an instantaneous image of the shape of the respective system obtained via the camera of the mobile computer (paragraph 7, the QR code of the system itself scanned by the portable device; for example, a plant operator with his smartphone would be able to scan a QR code displayed on the panel of the stationary operating device), and wherein the optical fingerprint comprises a stored image of the shape of the respective system itself (paragraph 7, the QR code located on a panel of the system itself scanned by the portable device; note that an image of a QR code on the machine reads on a stored image of the respective system itself, as the claim language does not say it has to be a stored image of the entire respective system; any image of a part or whole of the system can read on the claim language).

Jöhnssen does not teach: wherein the central controller compares an optical fingerprint stored in the central controller with an instantaneous image of the shape of a target object, and wherein the stored optical fingerprint comprises a stored image of the shape of the target object.

However, Jackson teaches: wherein the central controller compares an optical fingerprint stored in the central controller with an instantaneous image of the shape of a target object, and wherein the stored optical fingerprint comprises a stored image of the shape of the target object (FIGS. 4A-4B depict a robot scanning objects, a user gazing at an exemplary virtual representation of an object, and a side view of the user selecting the detected object [paragraphs 12-14]. In the case of FIGS. 4B and 4C, an object is shown at 416, wherein the outline of the object is stored along with the image of the table, which also has a visible outline captured for the shape of the table [paragraphs 49-50]).

It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the invention of Jöhnssen such that the central controller compares an optical fingerprint stored in the central controller with an instantaneous image of the shape of a target object, and such that the stored optical fingerprint comprises a stored image of the shape of the target object, as taught by Jackson, because it would have been obvious to try. Having the optical fingerprint be more than a QR code and instead be a full image of the target system would be an obvious modification of known elements in the art to produce a predictable result.

Regarding Claim 17.

Jöhnssen in combination with Jackson teaches the method as claimed in claim 10. Jöhnssen also teaches: wherein the systems in question come from the field of conveying technology, storage technology, picking technology and/or sorting technology (paragraphs 1-3).

Claim(s) 3, 8, 12, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Jöhnssen et al. EP 3534593 A1 ("Jöhnssen") in combination with Jackson US 20200019237 A1 ("Jackson") as applied to claims 1 and 10 above, and further in view of Pluemer US 20160091398 A1 ("Pluemer").

Regarding Claim 3.

Jöhnssen in combination with Jackson teaches the method as claimed in claim 1. Jöhnssen also teaches: wherein the determination of the position and/or orientation of the mobile computer of the service technician via acoustic recognition is effected by means of a microphone of the mobile computer which enables the central controller to compare an acoustic fingerprint stored in the central controller with instantaneous acoustic recordings of the normal operating noises of the respective system (paragraph 16 (page 4 of the provided translation) describes an acoustic code, specifically, "[t]he mobile HMI client plays an acoustic code which the stationary HMI client records with its microphones. The stationary HMI client plays an acoustic code which the mobile HMI client records with its microphones", which reads on an acoustic fingerprint (pages 3 and 4 of Jöhnssen)).

Jöhnssen does not teach: wherein the acoustic fingerprint comprises a stored recording of the normal operating noises of the respective system and not an acoustic sound other than the normal operating noises of the respective system.

However, Pluemer teaches: wherein the acoustic fingerprint comprises a stored recording of the normal operating noises of the respective system and not an acoustic sound other than the normal operating noises of the respective system (the system 40 includes a reference database 52 that contains recorded and digitized sound patterns (acoustic fingerprints) from the machine 42 of known operating conditions and their associated sound patterns; referencing U.S. Pat. No. 5,804,726, a device is disclosed for testing mechanical devices using acoustic signature analysis of the sonic signature of such devices by filtering out the frequencies of interest, recreating such frequencies from their respective harmonics, and then correlating the recreated frequencies to pre-determined known characteristics or parameters developed from valid devices to predict the long term operability of other unknown devices in a noisy environment like a factory or an airport wherein other sources of sound and/or vibration are present [paragraph 3]).
It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the invention of Jöhnssen such that the acoustic fingerprint comprises a stored recording of the normal operating noises of the respective system and not an acoustic sound other than the normal operating noises of the respective system, as taught by Pluemer, so as to base the acoustic fingerprint on the normal operating sounds of the system, because those sounds will be present when the user attempts to take over control of the respective system, and so including the normal operating sounds as part of the acoustic fingerprint will prevent the acoustic fingerprint from being rejected due to background noise.

Regarding Claim 8.

Jöhnssen in combination with Jackson and Pluemer teaches the method as claimed in claim 3. Jöhnssen also teaches: wherein the identification of the service technician is also effected by means of acoustic recognition (paragraphs 8 and 11; paragraph 16 describes an acoustic code, which reads on an acoustic fingerprint (pages 3 and 4 of Jöhnssen)).

Regarding Claim 12.

Jöhnssen in combination with Jackson teaches the method as claimed in claim 10. Jöhnssen also teaches: wherein the determination of the position and/or orientation of the mobile computer of the service technician via acoustic recognition is effected by means of a microphone of the mobile computer which enables the central controller to compare an acoustic fingerprint stored in the central controller with instantaneous acoustic recordings of the normal operating noises of the respective system (paragraph 16 (page 4 of the provided translation) describes an acoustic code, specifically, "[t]he mobile HMI client plays an acoustic code which the stationary HMI client records with its microphones. The stationary HMI client plays an acoustic code which the mobile HMI client records with its microphones", which reads on an acoustic fingerprint (pages 3 and 4 of Jöhnssen)).

Jöhnssen does not teach: wherein the acoustic fingerprint comprises a stored recording of the normal operating noises of the respective system and not an acoustic sound other than the normal operating noises of the respective system.

However, Pluemer teaches: wherein the acoustic fingerprint comprises a stored recording of the normal operating noises of the respective system and not an acoustic sound other than the normal operating noises of the respective system (the system 40 includes a reference database 52 that contains recorded and digitized sound patterns (acoustic fingerprints) from the machine 42 of known operating conditions and their associated sound patterns; referencing U.S. Pat. No. 5,804,726, a device is disclosed for testing mechanical devices using acoustic signature analysis of the sonic signature of such devices by filtering out the frequencies of interest, recreating such frequencies from their respective harmonics, and then correlating the recreated frequencies to pre-determined known characteristics or parameters developed from valid devices to predict the long term operability of other unknown devices in a noisy environment like a factory or an airport wherein other sources of sound and/or vibration are present [paragraph 3]).

It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the invention of Jöhnssen such that the acoustic fingerprint comprises a stored recording of the normal operating noises of the respective system and not an acoustic sound other than the normal operating noises of the respective system, as taught by Pluemer, so as to base the acoustic fingerprint on the normal operating sounds of the system, because those sounds will be present when the user attempts to take over control of the respective system, and so including the normal operating sounds as part of the acoustic fingerprint will prevent the acoustic fingerprint from being rejected due to background noise.
Regarding Claim 16.

Jöhnssen in combination with Jackson and Pluemer teaches the method as claimed in claim 12. Jöhnssen also teaches: further comprising identifying the service technician by means of acoustic recognition (paragraphs 8 and 11; paragraph 16 describes an acoustic code, which reads on an acoustic fingerprint (pages 3 and 4 of Jöhnssen)).

Claim(s) 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Jöhnssen et al. EP 3534593 A1 ("Jöhnssen") in combination with Pluemer US 20160091398 A1 ("Pluemer").

Regarding Claim 18.

Jöhnssen teaches a method for maintaining, commissioning and checking systems in warehouses (paragraph 3), wherein a service technician has a view of a respective system and makes wireless contact with a controller of the respective system via a mobile computer in order to take over control thereof (paragraph 6), said method comprising: determining the position and/or the orientation of the mobile computer held by the service technician with respect to the respective system via optical recognition (paragraph 7, the QR code of the system itself scanned by the portable device) and acoustic recognition of a characteristic of the respective system (paragraphs 8 and 11; paragraph 16 describes an acoustic code, which reads on an acoustic fingerprint (pages 3 and 4 of Jöhnssen)); and taking over of control of the respective system by the mobile computer is permitted by a central controller only if the service technician can have visual contact with the corresponding system based on said determining the position and/or orientation of the mobile computer (paragraph 6); wherein said acoustic recognition comprises acoustically recognizing the characteristic of the respective system via a microphone of the mobile computer which enables the central controller to compare an acoustic fingerprint stored in the central controller with instantaneous acoustic recordings of normal operating noises of the respective system (paragraph 16 (page 4 of the provided translation) describes an acoustic code, specifically, "[t]he mobile HMI client plays an acoustic code which the stationary HMI client records with its microphones. The stationary HMI client plays an acoustic code which the mobile HMI client records with its microphones", which reads on an acoustic fingerprint (pages 3 and 4 of Jöhnssen)); and wherein said optical recognition comprises optically recognizing the characteristic of the respective system via a camera of the mobile computer which enables the central controller to compare an optical fingerprint stored in the central controller with an instantaneous image of the shape of the respective system obtained via the camera of the mobile computer (paragraph 7, the QR code of the system itself scanned by the portable device; for example, a plant operator with his smartphone would be able to scan a QR code displayed on the panel of the stationary operating device), and wherein the optical fingerprint comprises a stored image of the shape of the respective system itself (paragraph 7, the QR code located on a panel of the system itself scanned by the portable device; note that an image of a QR code on the machine reads on a stored image of the respective system itself, as the claim language does not say it has to be a stored image of the entire respective system; any image of a part or whole of the system can read on the claim language).

Jöhnssen does not teach: wherein the acoustic fingerprint comprises a stored recording of the normal operating noises of the respective system and not an acoustic sound other than the normal operating noise of the respective system, and wherein said acoustically recognizing via the microphone comprises monitoring both the frequency and the volume of the instantaneous noises of the respective system.

However, Pluemer teaches: wherein the acoustic fingerprint comprises a stored recording of the normal operating noises of the respective system and not an acoustic sound other than the normal operating noise of the respective system (the system 40 includes a reference database 52 that contains recorded and digitized sound patterns (acoustic fingerprints) from the machine 42 of known operating conditions and their associated sound patterns), and wherein said acoustically recognizing via the microphone comprises monitoring both the frequency and the volume of the instantaneous noises of the respective system (referencing U.S. Pat. No. 5,804,726, a device is disclosed for testing mechanical devices using acoustic signature analysis of the sonic signature of such devices by filtering out the frequencies of interest, recreating such frequencies from their respective harmonics, and then correlating the recreated frequencies to pre-determined known characteristics or parameters developed from valid devices to predict the long term operability of other unknown devices in a noisy environment like a factory or an airport wherein other sources of sound and/or vibration are present [paragraph 3]).

It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the invention of Jöhnssen with these features as taught by Pluemer so as to base the acoustic fingerprint on the normal operating sounds of the system, because those sounds will be present when the user attempts to take over control of the respective system, and so including the normal operating sounds as part of the acoustic fingerprint will prevent the acoustic fingerprint from being rejected due to background noise.

Regarding Claim 19.

Jöhnssen in combination with Pluemer teaches the method as claimed in claim 18. Jöhnssen also teaches: wherein the identification of the service technician is also effected by means of acoustic recognition (paragraphs 8 and 11; paragraph 16 describes an acoustic code, which reads on an acoustic fingerprint (pages 3 and 4 of Jöhnssen)).

Regarding Claim 20.

Jöhnssen in combination with Pluemer teaches the method as claimed in claim 18. Jöhnssen also teaches: wherein the systems in question come from the field of conveying technology, storage technology, picking technology and/or sorting technology (paragraphs 1-3).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AARON G CAIN whose telephone number is (571) 272-7009. The examiner can normally be reached Monday to Friday, 7:30am - 4:30pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Wade Miles, can be reached at (571) 270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AARON G CAIN/
Examiner, Art Unit 3656

Prosecution Timeline

Apr 14, 2022: Application Filed
May 28, 2024: Non-Final Rejection — §103
Aug 29, 2024: Response Filed
Sep 27, 2024: Final Rejection — §103
Nov 18, 2024: Interview Requested
Nov 25, 2024: Applicant Interview (Telephonic)
Nov 25, 2024: Examiner Interview Summary
Dec 03, 2024: Request for Continued Examination
Dec 04, 2024: Response after Non-Final Action
Jan 07, 2025: Non-Final Rejection — §103
Apr 13, 2025: Response Filed
May 14, 2025: Final Rejection — §103
Jul 16, 2025: Response after Non-Final Action
Aug 18, 2025: Request for Continued Examination
Aug 27, 2025: Response after Non-Final Action
Oct 01, 2025: Non-Final Rejection — §103
Dec 30, 2025: Response Filed
Feb 04, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12573302: METHOD FOR INFRASTRUCTURE-SUPPORTED ASSISTING OF A MOTOR VEHICLE (2y 5m to grant; granted Mar 10, 2026)
Patent 12558790: METHOD AND COMPUTING SYSTEMS FOR PERFORMING OBJECT DETECTION (2y 5m to grant; granted Feb 24, 2026)
Patent 12552019: MACHINE LEARNING METHOD AND ROBOT SYSTEM (2y 5m to grant; granted Feb 17, 2026)
Patent 12544144: DENTAL ROBOT AND ORAL NAVIGATION METHOD (2y 5m to grant; granted Feb 10, 2026)
Patent 12541205: MOVEMENT CONTROL SUPPORT DEVICE AND METHOD (2y 5m to grant; granted Feb 03, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 7-8
Grant Probability: 40%
With Interview: 66% (+26.1%)
Median Time to Grant: 3y 3m
PTA Risk: High

Based on 130 resolved cases by this examiner. Grant probability derived from career allow rate.
