Prosecution Insights
Last updated: April 19, 2026
Application No. 19/082,404

METHOD AND APPARATUS FOR DETECTING FOREIGN OBJECTS USING VARIABLE MONITORING AREA IN WIRELESS CHARGING SYSTEM

Non-Final OA (§102, §103)
Filed
Mar 18, 2025
Examiner
YESHAW, ESAYAS G
Art Unit
2849
Tech Center
2800 — Semiconductors & Electrical Systems
Assignee
ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
OA Round
1 (Non-Final)
Grant Probability: 87% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87% (563 granted / 648 resolved; +18.9% vs TC avg), above average
Interview Lift: +12.6% among resolved cases with interview (moderate lift)
Typical Timeline: 2y 4m average prosecution; 48 applications currently pending
Career History: 696 total applications across all art units

Statute-Specific Performance

§101: 0.6% (-39.4% vs TC avg)
§103: 51.1% (+11.1% vs TC avg)
§102: 35.2% (-4.8% vs TC avg)
§112: 8.6% (-31.4% vs TC avg)
Based on career data from 648 resolved cases; deltas are relative to the Tech Center average estimate.
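As a consistency check on the figures above, the headline 87% allow rate is 563/648 rounded, and subtracting each statute's delta from the examiner's rate recovers the implied Tech Center baseline. A minimal sketch (variable names are illustrative, not part of the dashboard):

```python
# Career allow rate from the reported counts (563 granted / 648 resolved).
granted, resolved = 563, 648
allow_rate = round(100 * granted / resolved, 1)  # 86.9, displayed as 87%

# Each statute row reports the examiner's rate and a delta vs. the Tech
# Center average; subtracting the delta recovers the implied TC baseline.
statute_rates = {"101": (0.6, -39.4), "103": (51.1, 11.1),
                 "102": (35.2, -4.8), "112": (8.6, -31.4)}
tc_baseline = {s: round(rate - delta, 1)
               for s, (rate, delta) in statute_rates.items()}
# Here every row happens to imply the same 40.0% baseline.
```

Note that the four statute percentages (0.6 + 51.1 + 35.2 + 8.6) do not sum to 100, so they read as the share of office actions invoking each statute rather than a partition of rejections.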

Office Action

Grounds of rejection: §102, §103
DETAILED ACTION

The Office action is in response to the original application filed on 3-18-25. Claims 1-20 are pending in the application and have been examined.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted was filed before the mailing of a first Office action on the merits. The submission is in compliance with the provisions of 37 CFR 1.97(b)(3). Accordingly, the information disclosure statement is being considered by the examiner.

Priority

Receipt is acknowledged of certified copies of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-2, 4-12 and 14-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by US 2023/0336034 to Ward et al. ("Ward").

Regarding claim 1, Ward discloses a foreign object detection method (see figure 1, figure 10, para. 0009, 0058, 0095, 0109-0110) comprising: acquiring a transmission current value (para. 0123, POD controller 1214 also sets the voltage and current delivered by the power supply 1201 during an impedance measurement using the digital datalink 1215; and para. 0121, test current levels may also be varied up to -10% of the nominal power transfer current (e.g., under 10 AMPS for a 100 AMP WPT system)) of a wireless power transmitter (fig. 12); calculating a magnitude of a magnetic field (para. 0055, the impedance detection system used herein is based on measuring techniques used in eddy current Nondestructive Testing (NDT); NDT relies on the concept described in Faraday's Law of Induction in which a time varying (alternating) magnetic field induces eddy currents on a coupled conductive object) based on the acquired transmission current value (para. 0123 and para. 0121); setting a monitoring area formed between the wireless power transmitter and a wireless power receiver (para. 0011, observing the area surrounding the charging position of the GTA using the at least one camera may include monitoring the area surrounding the charging position when the WPT station is in use for people, animals) according to the calculated magnitude of the magnetic field (para. 0160, impedance measurements, when a vehicle is not present and the GTA is inactive, are kept below a threshold (which varies according to individual coil assembly size, ferrite placement, number of windings) so as to prevent excessive generation of magnetic flux above the regulated amount); acquiring an image and determining whether a foreign object is present in the image (fig. 10, foreign object 1004 has been detected by the close-in imaging system in the surface of the GTS 1001); generating information about the foreign object when it is determined that the foreign object is present in the image (para. 0147, information from database 1504 may be used in the centering step 1503 to compare the incoming imagery to prior, known good imagery (i.e., without foreign object(s))); and determining whether the foreign object is included in the monitoring area based on the information about the foreign object (para. 0011, observing the area surrounding the charging position of the GTA using the at least one camera may include monitoring the area surrounding the charging position when the WPT station is not in use for people, animals, or vehicles that could interfere with charging by the WPT station).

Regarding claim 2, Ward discloses stopping transmission power of the wireless power transmitter upon determining that the foreign object is included in the monitoring area (para. 0118, use of the wide-area imaging system would ensure that the immediate area was clear prior to coil energization; the wide-area observation system would also trigger immediate de-energization of the GTA(s) if a person, animal, or vehicle approached the GTS under examination).

Regarding claim 4, Ward discloses that the information about the foreign object comprises an angle (709) between a centerline of a camera (fig. 8, 807) capturing an area including the monitoring area (figs. 7-8) and a center point of a virtual foreign object projected onto a reference plane (810), and a distance between the camera and the foreign object (para. 0104), and that the determining of whether the foreign object is included in the monitoring area is based on the angle and the distance (para. 0012, analyzing images from the at least one camera may include receiving image data based on recent or edited images or representative models of at least one of the charging position or the area surrounding the charging position and comparing the recent or edited images or representative models to images captured while observing the charging position or the areas surrounding the charging position to identify image differences indicative of a foreign object).

Regarding claim 5, Ward discloses that the determining of whether the foreign object is present in the image comprises: comparing corresponding pixels (para. 0110, the optical system relies on very small numbers of pixels (down to a single pixel) for foreign object detection) between the acquired image and a pre-stored image without any foreign object (para. 0156, the resultant foreground image may be stored in the database 1604 for future use in the analysis step 1607); detecting the number of pixels among the compared pixels that have different values (para. 0148, the multiple images are averaged on a pixel (or block of pixels) basis; since areas outside the GTA are irrelevant to close-in operation, such areas will be filtered out before processing, reducing processing load; information in database 1504 may be used to determine the areas of interest from the prior identified pixels of interest); and determining that the foreign object is present in the image when the number of detected pixels is greater than or equal to a predetermined number of pixels (para. 0149, pixel subtraction at step 1506 may be used to compare the image, now consisting of two or more combined images, with a prior image (also comprised of several averaged images) showing the area of observation without a foreign object; by comparing the previous image to present images via a pixel map, the system can decide whether FOD is present or not).

Regarding claim 6, Ward discloses that the camera is a distortion-free lens (para. 0004, digital cameras use solid-state sensors (e.g., CCD (charge-coupled device) or CMOS (complementary metal oxide semiconductor)) to capture light collected through a lens and to convert the captured light into electronic image data; the digital image contains a certain number of pixels, with each pixel being mapped onto a planar grid, and each pixel has its own tonal value that determines the image's hue or color), and that the generating of the information about the foreign object (para. 0147, information from database 1504 may be used in the centering step 1503 to compare the incoming imagery to prior, known good imagery (i.e., without foreign object(s))) comprises: calculating the distance between the camera and an actual foreign object based on the number of pixels (para. 0110, the optical system relies on very small numbers of pixels (down to a single pixel) for foreign object detection) of the virtual foreign object projected onto the reference plane in the image, the number of pixels of the actual foreign object, and a distance between the camera and the reference plane; and calculating the angle between the centerline of the camera and the center point of the virtual foreign object projected onto the reference plane in the image, based on the camera's field of view and the number of pixels in a reference direction of the image.
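The pixel-comparison test that the examiner maps onto claims 5 and 15 (compare the acquired image against a known-good image, count differing pixels, apply a predetermined threshold) can be sketched as follows. The function and parameter names are illustrative, not drawn from the application or from Ward:

```python
def foreign_object_present(image, reference, min_pixels, tolerance=0):
    """Sketch of the claims 5/15 test: count pixels that differ from a
    pre-stored image of the scene without any foreign object, and report
    a detection when the count reaches a predetermined number of pixels.
    Both images are equal-sized 2-D lists of grayscale values."""
    differing = sum(
        1
        for img_row, ref_row in zip(image, reference)
        for p, q in zip(img_row, ref_row)
        if abs(p - q) > tolerance
    )
    return differing >= min_pixels
```

For example, with a 3x3 all-zero reference and an image in which two pixels changed, `foreign_object_present(img, ref, min_pixels=2)` is True while `min_pixels=3` yields False; the `tolerance` parameter stands in for the sensor-noise margin a real system would need.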
Regarding claim 7, Ward discloses that the generating of the information about the foreign object (para. 0147, information from database 1504 may be used in the centering step 1503 to compare the incoming imagery to prior, known good imagery (i.e., without foreign object(s))) comprises: inputting the image containing the foreign object into an artificial neural network to obtain a type of the foreign object; and determining a value of a pre-measured actual length corresponding to the type of the obtained foreign object (para. 0087, a mathematical check of the lengths of the detected edges and parallelness can be used to verify successful edge detection prior to shifting of the captured image in imaging processing) as the number of pixels of the actual foreign object using a value of a pre-measured actual length for each type (para. 0087).

Regarding claim 8, Ward discloses that the camera is a distorted lens (para. 0004, digital cameras use solid-state sensors (e.g., CCD (charge-coupled device) or CMOS (complementary metal oxide semiconductor)) to capture light collected through a lens and to convert the captured light into electronic image data; the digital image contains a certain number of pixels, with each pixel being mapped onto a planar grid, and each pixel has its own tonal value that determines the image's hue or color), and that the generating of the information about the foreign object (para. 0147) comprises: calculating the distance between the camera and the foreign object using an interpolation method based on a distance between the camera and the reference plane and the pre-measured length of the virtual foreign object projected onto the reference plane for each type of foreign object (fig. 10, 1004).

Regarding claim 9, Ward discloses that a lens of the camera (para. 0004), the monitoring area, and the reference plane are positioned in a straight line, and the wireless power transmitter is spaced apart by a predetermined distance from the reference plane.

Regarding claim 10, Ward discloses generating a foreign object detection alarm upon determining that the foreign object is included in the monitoring area (para. 0011, observing the area surrounding the charging position of the GTA using the at least one camera may include monitoring the area surrounding the charging position when the WPT station is in use for people, animals).

Regarding claim 11, Ward discloses a foreign object detection apparatus (see figure 1, figure 10, para. 0009, 0058, 0095, 0109-0110) comprising: a processor (para. 0163, software may be executed on a digital signal processor, ASIC, microprocessor, or other type of processor operating on a computer system, such as a personal computer, server, or other computer system, turning such computer system into a specifically programmed machine) configured to acquire a transmission current value (para. 0123, POD controller 1214 also sets the voltage and current delivered by the power supply 1201 during an impedance measurement using the digital datalink 1215; and para. 0121, test current levels may also be varied up to -10% of the nominal power transfer current (e.g., under 10 AMPS for a 100 AMP WPT system)) of a wireless power transmitter (fig. 12), calculate a magnitude of a magnetic field (para. 0055, the impedance detection system used herein is based on measuring techniques used in eddy current Nondestructive Testing (NDT); NDT relies on the concept described in Faraday's Law of Induction in which a time varying (alternating) magnetic field induces eddy currents on a coupled conductive object) based on the acquired transmission current value (para. 0123 and para. 0121), set a monitoring area formed between the wireless power transmitter and a wireless power receiver (para. 0011, observing the area surrounding the charging position of the GTA using the at least one camera may include monitoring the area surrounding the charging position when the WPT station is in use for people, animals) according to the calculated magnitude of the magnetic field (para. 0160, impedance measurements, when a vehicle is not present and the GTA is inactive, are kept below a threshold (which varies according to individual coil assembly size, ferrite placement, number of windings) so as to prevent excessive generation of magnetic flux above the regulated amount), acquire an image and determine whether a foreign object is present in the image (fig. 10, foreign object 1004 has been detected by the close-in imaging system in the surface of the GTS 1001), generate information about the foreign object (para. 0147, information from database 1504 may be used in the centering step 1503 to compare the incoming imagery to prior, known good imagery (i.e., without foreign object(s))) when it is determined that the foreign object is present in the image (fig. 10), and determine whether the foreign object is included in the monitoring area based on the information about the foreign object (para. 0147).

Regarding claim 12, Ward discloses that transmission power of the wireless power transmitter is cut off upon determining that the foreign object is included in the monitoring area (para. 0026, the station controller may perform at least one of disabling the GTA in response to an alert from the FOD controller, reset local directional signals to the GTA, inform the reservation system that the GTA is off-line when a foreign object detection on the GTA is confirmed, and call for local maintenance or remote maintenance of the GTA).

Regarding claim 14, Ward discloses a camera, wherein the image is captured by the camera, and the information about the foreign object (para. 0147) comprises an angle between the centerline of the camera that captures an area including the monitoring area and a center point of the virtual foreign object projected onto a reference plane, and the distance between the camera and the foreign object, and the processor determines whether the foreign object is included in the monitoring area based on the angle and the distance.

Regarding claim 15, Ward discloses that, when determining whether the foreign object is present in the image (fig. 10, foreign object 1004 has been detected by the close-in imaging system in the surface of the GTS 1001), the processor compares corresponding pixels between the acquired image and a pre-stored image (para. 0156, the resultant foreground image may be stored in the database 1604 for future use in the analysis step 1607) without any foreign object, detects the number of pixels among the compared pixels that have different values (para. 0148, the multiple images are averaged on a pixel (or block of pixels) basis; since areas outside the GTA are irrelevant to close-in operation, such areas will be filtered out before processing, reducing processing load; information in database 1504 may be used to determine the areas of interest from the prior identified pixels of interest), and determines that the foreign object is present in the image when the number of detected pixels is greater than or equal to a predetermined number of pixels (para. 0149, pixel subtraction at step 1506 may be used to compare the image, now consisting of two or more combined images, with a prior image (also comprised of several averaged images) showing the area of observation without a foreign object; by comparing the previous image to present images via a pixel map, the system can decide whether FOD is present or not).

Regarding claim 16, Ward discloses that the camera is a distortion-free lens (para. 0004, digital cameras use solid-state sensors (e.g., CCD (charge-coupled device) or CMOS (complementary metal oxide semiconductor))), and the processor calculates the distance between the camera and an actual foreign object (para. 0012, analyzing images from the at least one camera may include receiving image data based on recent or edited images or representative models of at least one of the charging position or the area surrounding the charging position and comparing the recent or edited images or representative models to images captured while observing the charging position or the areas surrounding the charging position to identify image differences indicative of a foreign object) based on the number of pixels of the virtual foreign object projected onto the reference plane in the image (para. 0110, the optical system relies on very small numbers of pixels (down to a single pixel) for foreign object detection) between the acquired image and a pre-stored image without any foreign object (para. 0156, the resultant foreground image may be stored in the database 1604 for future use in the analysis step 1607), the number of pixels of the actual foreign object, and a distance between the camera and the reference plane (para. 0012), and calculates the angle between the centerline of the camera and the center point of the virtual foreign object projected onto the reference plane in the image, based on the camera's field of view and the number of pixels in a reference direction of the image (para. 0096, determined by the elevation (height above pavement level) 704, the angle of inclination 709, and the optical characteristics of the lens 706).

Regarding claim 17, Ward discloses an artificial neural network, wherein the processor inputs the image containing the foreign object into the artificial neural network to obtain a type of the foreign object, and determines a value of a pre-measured actual length corresponding to the type of the obtained foreign object (para. 0087, a mathematical check of the lengths of the detected edges and parallelness can be used to verify successful edge detection prior to shifting of the captured image in imaging processing) as the number of pixels (para. 0110, the optical system relies on very small numbers of pixels (down to a single pixel) for foreign object detection) of the actual foreign object using a value of a pre-measured actual length for each type (para. 0087).
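The distance-and-angle computation recited in claims 6 and 16 can be read as ordinary pinhole-camera geometry: apparent pixel size falls off linearly with distance, and pixel offset maps linearly onto the field of view. The sketch below is an assumption about that geometry (the office action does not reproduce the application's exact formulas), and all names are illustrative:

```python
def object_distance(ref_plane_distance, ref_plane_pixels, object_pixels):
    # Similar triangles for a distortion-free lens: an object's apparent
    # size in pixels falls off linearly with distance from the camera,
    # so d = D_ref * n_ref / n_obj.
    return ref_plane_distance * ref_plane_pixels / object_pixels

def offset_angle_deg(offset_pixels, image_pixels, fov_deg):
    # Angle between the camera centerline and the object's center point,
    # assuming pixels map linearly onto the field of view along the
    # reference direction of the image.
    return fov_deg * offset_pixels / image_pixels
```

For example, an object whose known length spans 50 pixels, when that same length spans 100 pixels at a 2 m reference plane, sits about 4 m from the camera; a 120-pixel offset in a 1920-pixel-wide, 60-degree field of view is 3.75 degrees off the centerline.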
Regarding claim 18, Ward discloses that the camera is a distortion lens (para. 0110, the optical system relies on very small numbers of pixels (down to a single pixel) for foreign object detection) between the acquired image and a pre-stored image without any foreign object (para. 0156, the resultant foreground image may be stored in the database 1604 for future use in the analysis step 1607), and the processor calculates the distance between the camera and the foreign object using an interpolation method based on a distance between the camera and the reference plane (para. 0012, analyzing images from the at least one camera may include receiving image data based on recent or edited images or representative models of at least one of the charging position or the area surrounding the charging position and comparing the recent or edited images or representative models to images captured while observing the charging position or the areas surrounding the charging position to identify image differences indicative of a foreign object) and the pre-measured length of the virtual foreign object projected onto the reference plane for each type of foreign object (fig. 10, 1004).

Regarding claim 19, Ward discloses that a lens of the camera (para. 0004, digital cameras use solid-state sensors (e.g., CCD (charge-coupled device) or CMOS (complementary metal oxide semiconductor)) to capture light collected through a lens and to convert the captured light into electronic image data; the digital image contains a certain number of pixels, with each pixel being mapped onto a planar grid, and each pixel has its own tonal value that determines the image's hue or color), the monitoring area, and the reference plane are positioned in a straight line, and the wireless power transmitter is spaced apart by a predetermined distance from the reference plane (para. 0012).

Regarding claim 20, Ward discloses that the processor generates a foreign object detection alarm upon determining that the foreign object is included in the monitoring area (para. 0011, initiating the failsafe operation to at least one of reduce or remove power from the WPT station, signal a driver or occupant of the vehicle, or set visual or audio alarms to alert the at least one person, animal, or vehicle).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over US 2023/0336034 to Ward et al. ("Ward") in view of US 11,742,703 to Sakita et al. ("Sakita").
Regarding claim 3, Ward discloses all the claim limitations as set forth in the rejection of the claims above, but does not disclose that the setting of the monitoring area comprises setting or updating the monitoring area using three-dimensional spatial variables based on a shape of a coil of the wireless power transmitter and the magnitude of the magnetic field. However, Sakita discloses the setting of the monitoring area comprising setting or updating the monitoring area using three-dimensional spatial variables based on a shape of a coil (Col. 16, lines 10-13: in FIG. 7, XYZ coordinate axes similar to those shown in FIG. 1 are shown, so that the state of the wireless power transmission device 11 and wireless power reception device 12 can be easily monitored before the start of power transmission and a monitoring process can be performed without interfering with an operation of the device; cross-section of the power transmission coil) of the wireless power transmitter and the magnitude of the magnetic field. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ward by adding XYZ coordinate axes as part of its configuration, as taught by Sakita, in order to view a floor plane on the ground surface (which may be perpendicular to, or at any angle with respect to, both coils) and update the reference image before power transfer starts.

Regarding claim 13, Ward discloses all the claim limitations as set forth in the rejection of the claims above, but does not disclose that the processor sets or updates the monitoring area by defining the monitoring area as a three-dimensional spatial variable based on a shape of a coil. However, Sakita discloses the processor setting or updating the monitoring area by defining the monitoring area as a three-dimensional spatial variable based on a shape of a coil (Col. 16, lines 10-13, as quoted above). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ward by adding XYZ coordinate axes as part of its configuration, as taught by Sakita, in order to view a floor plane on the ground surface (which may be perpendicular to, or at any angle with respect to, both coils) and update the reference image before power transfer starts.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Bell et al., US 9,893,538 B1: a wireless charging system configured to generate and transmit power waves that, due to physical waveform characteristics, converge at a predetermined location in a transmission field to generate a pocket of energy. Receivers associated with an electronic device being powered by the wireless charging system may extract energy from these pockets of energy and then convert that energy into usable electric power for the electronic device associated with a receiver. The pocket of energy may manifest as a three-dimensional field (e.g., transmission field) where energy may be harvested by a receiver positioned within or nearby the pocket of energy. Video sensors capture actual video images of fields of view within the transmission field, and a processor identifies selected objects, selected events, and/or selected locations within the captured video images.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ESAYAS G YESHAW, whose telephone number is (571) 270-1959. The examiner can normally be reached Mon-Sat, 9AM-7PM.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Menna Youssef, can be reached at (571) 270-3684. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ESAYAS G YESHAW/
Examiner, Art Unit 2836

/Menatoallah Youssef/
SPE, Art Unit 2849

Prosecution Timeline

Mar 18, 2025
Application Filed
Feb 25, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603457: PLUG-IN CONTACT DEVICE FOR PREVENTING AN ARC WHEN SEPARATING A DIRECT CURRENT CONNECTION (granted Apr 14, 2026; 2y 5m to grant)
Patent 12603565: Integrated Multi-Port Generator-Rectifier Device and Method (granted Apr 14, 2026; 2y 5m to grant)
Patent 12587109: POWER SUPPLY INCLUDING A RECONFIGURABLE ACTIVE FRONT END CONVERTER (granted Mar 24, 2026; 2y 5m to grant)
Patent 12562322: CIRCUIT BREAKER (granted Feb 24, 2026; 2y 5m to grant)
Patent 12552543: EMERGENCY ENERGY RESERVE SOLUTION FOR BATTERY ELECTRIFIED AIRCRAFT (granted Feb 17, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 87%
With Interview: 99% (+12.6%)
Median Time to Grant: 2y 4m
PTA Risk: Low
Based on 648 resolved cases by this examiner. Grant probability derived from career allow rate.
