Prosecution Insights
Last updated: April 19, 2026
Application No. 18/405,618

MULTIPATH AND FALSE DETECTION MITIGATION

Final Rejection: §103, §112

Filed: Jan 05, 2024
Examiner: ABRAHAM, JOHN BISHOY SAM
Art Unit: 3646
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: GM Cruise Holdings LLC
OA Round: 2 (Final)

Grant Probability: 71% (Favorable)
OA Rounds: 3-4
To Grant: 2y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 71% — grants above average (5 granted / 7 resolved; +19.4% vs TC avg)
Interview Lift: +40.0% — strong (resolved cases with vs. without interview)
Avg Prosecution: 2y 4m (typical timeline)
Total Applications: 44 across all art units (37 currently pending)

Statute-Specific Performance

§101: 13.7% (-26.3% vs TC avg)
§103: 44.1% (+4.1% vs TC avg)
§102: 19.4% (-20.6% vs TC avg)
§112: 22.3% (-17.7% vs TC avg)

Tech Center average values are estimates • Based on career data from 7 resolved cases

Office Action

Rejections: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant’s arguments, see Page 11, lines 14-20, filed 01/22/2026, with respect to the 35 U.S.C. §101 rejection of claims 15-18 have been fully considered and are persuasive. The 35 U.S.C. §101 rejection of claims 15-18 has been withdrawn.

Applicant’s arguments, see Page 12, lines 6-12, filed 01/22/2026, with respect to the 35 U.S.C. §102 rejection have been fully considered but they are not persuasive. Applicant’s arguments with respect to claim(s) 1-5, 7-18 and 20-22 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Additional reference is made to Uesato (US 20100134344), which explicitly teaches the subject matter added to claims 1, 15 and 20 (the comparison of signal phase and/or magnitude to distinguish multipath from target returns), which is not explicitly taught by Yoffe (US 20220214425).

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(d):

(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph:

Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C. 112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed.
A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

Claim 22 is rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Claim 22 does not add a further limitation to claim 15 since the limitation of “a comparison of a first phase of the first windowed signal and a second phase of the second windowed signal” is already in claim 15, see claim 15, lines 15-16. Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4.
Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 1-2, 5, 7-12, 14-17 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Yoffe et al. (US 20220214425) in view of Uesato (US 20100134344).

Regarding claim 1, Yoffe teaches a radar sensor system ([0180] FIG. 8 is a schematic block diagram illustration of a radar frontend 804 and a radar processor 834, in accordance with some demonstrative aspects) comprising: at least one transmit antenna ([0181] In some demonstrative aspects, radar frontend 804 may be implemented as part of a MIMO radar utilizing a MIMO radar antenna 881 including a plurality of Tx antennas 814) configured to transmit a radar signal into an environment of the radar sensor system; at least one receive antenna ([0181] and a plurality of Rx antennas 816 configured to receive a plurality of Rx RF signals) configured to receive a return signal from the environment of the radar sensor system responsive to the radar signal; and radar processing circuitry ([0308] In some demonstrative aspects, radar processor 834 may be configured to determine, for example, based on the input information from the input 832, an AoA estimation profile for estimation of the AoA spectrum information of the range-Doppler bin, e.g., as described below) that is configured to
perform acts comprising: generating a radar frame ([0348] In some demonstrative aspects, as shown in FIG. 9, AoA spectrum estimation processor 900 may be configured to process an input, for example, a Beam-Forming (BF) input 931 including radar data of a plurality of RD bins) having a detection point based on the return signal ([0349] In one example, BF input 931 may include radar data of a radar frame, which may be determined based on the radar Rx data 811); applying a first window (Fig. 9, #912 BF algo 1) to the return signal on a beamforming stage to form a first windowed signal ([0355] In some demonstrative aspects, as shown in FIG. 9, AoA spectrum estimation processor 900 may include a memory 938 to store algorithm-specific information corresponding to the plurality of AoA estimation algorithms 910.); applying a second window different from the first window ([0355] For example, memory 938 may maintain algorithm-specific information defining a plurality of sets of algorithm-specific metrics for the plurality of AoA estimation algorithms 910.) to the return signal on the beamforming stage to form a second windowed signal (Fig.
9, #912 BF algo 2); determining if the detection point corresponds to a true target based on a comparison of the first windowed signal and the second windowed signal ([0329] In some demonstrative aspects, the one or more algorithm-specific metrics may include one or more target detection metrics corresponding to one or more types of targets supported for detection by the AoA spectrum estimation algorithm.); and outputting ([0130] In some demonstrative aspects, radar processor 309 may be configured to provide the determined radar information to a system controller 310 of device/system 301) an indication that the detection point corresponds to a true target ([0189] In some demonstrative aspects, radar processor 834 may be configured to generate radar information 813, for example, based on the radar signals communicated by MIMO radar antenna 881, e.g., as described below. For example, radar processor 104 (FIG. 1), radar processor 210 (FIG. 1), radar processor 309 (FIG. 3), radar processor 402 (FIG. 4), and/or radar processor 503 (FIG. 5), may include one or more elements of radar processor 834, and/or may perform one or more operations and/or functionalities of radar processor 834.).

Yoffe fails to explicitly teach that determining if the detection point corresponds to a true target is responsive to a score exceeding a threshold where the score is determined based on a comparison of the first windowed signal and the second windowed signal, the comparison including a comparison of a first phase of the first windowed signal and a second phase of the second windowed signal and a comparison of a first magnitude of the first windowed signal and second magnitude of the second windowed signal.

However, Uesato teaches a radar sensor system (Abstract: An electronic scanning radar device that detects an azimuth angle of a target based on a phase difference between a first pair of received waves) which determines a score based on comparison of the first windowed signal and the second windowed signal ([0054] FIG. 6 is a flowchart explaining a procedure for operation of the signal processing portion 24. First, the signal processing portion 24 performs FFT (Fast Fourier Transform) processing of the beat signals, and detects the frequency corresponding to the frequency difference between transmission waves and reception waves in the rising-frequency intervals and falling-frequency intervals of the transmission wave, that is, the beat frequency, as well as the phase difference .phi. between the pair of received waves W21 and W22 (S10).) the comparison including a comparison of a first phase of the first windowed signal and a second phase of the second windowed signal ([0054] Then, the signal processing portion 24 calculates the azimuth angle .theta. of the target from the phase difference .phi. of the received waves (S12).) and determining that the detection point corresponds to a true target responsive to the score exceeding a threshold ([0065] And, based on whether the level of the composite wave is equal to or greater than the threshold value T2, the signal processing portion 24 performs a true/false judgment of the azimuth angle .theta. (S84). At this time, if the level is equal to or greater than the threshold value T2, the detected azimuth angle .theta. can be judged to have been detected from waves received from a target which actually exists in the true/false judgment range, so that the detection result is judged to be true.). Yoffe and Uesato are both considered to be analogous to the claimed invention because they are in the same field of endeavor of vehicle radar technology. 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoffe by including the signal phase comparison of Uesato to yield a predictable result of using the phase difference of the signals in addition to a magnitude threshold to prevent false detections due to phase wrapping as noted by Uesato ([0019]-[0020]).

Regarding claim 2, Yoffe as modified by Uesato teaches the radar sensor system of claim 1, accordingly the rejection of claim 1 above is incorporated. Yoffe further teaches determining if the detection point corresponds to a true target based on a characteristic of the environment ([0274] For example, the environment information may be received at input 832, for example, from a camera, a rain sensor, a GPS system, e.g., including maps and/or an urban database, a sonar, e.g., to detect a very close range clutter, and/or any other additional or alternative sensor.).

Regarding claim 5, Yoffe as modified by Uesato teaches the radar sensor system of claim 1, accordingly the rejection of claim 1 above is incorporated. Yoffe further teaches generating at least one of range data or Doppler data based on the return signal ([0162] In some demonstrative aspects, the result of the second FFT may provide, e.g., when aggregated over the antennas, a range/Doppler (R/D) map 505. The R/D map may have FFT peaks 506, for example, including peaks of FFT output values (in terms of absolute values) for certain range/speed combinations, e.g., for range/Doppler bins. For example, a range/Doppler bin may correspond to a range bin and a Doppler bin.), and determining if the detection point corresponds to a true target based on the at least one of the range data or the Doppler data ([0162] For example, radar processor 503 may consider a peak as potentially corresponding to an object, e.g., of the range and speed corresponding to the peak's range bin and speed bin.).
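The limitation the combination is applied against in claim 1 (two different windows applied at the beamforming stage, then a score built from a phase comparison and a magnitude comparison of the two windowed outputs, tested against a threshold) can be illustrated with a minimal sketch. The function name, the choice of rectangular and Hann windows, and the scoring rule are illustrative assumptions; this is not the claimed method as filed, only a toy of the technique the claim language describes.

```python
import numpy as np

def multipath_score(snapshot, n_beams=64, threshold=0.5):
    """Beamform one array snapshot with two different tapers and score the
    stability of the resulting spectra at the detected peak. A genuine point
    target responds consistently under both windows; a multipath ghost tends
    to shift in phase and magnitude. All names/values here are illustrative."""
    n = len(snapshot)
    w1 = np.ones(n)          # first window: rectangular (assumed)
    w2 = np.hanning(n)       # second window: Hann taper (assumed)
    angles = np.linspace(-np.pi / 2, np.pi / 2, n_beams)
    # Conventional beamformer: steer both windowed snapshots over an angle grid.
    steer = np.exp(-1j * np.pi * np.outer(np.arange(n), np.sin(angles)))
    s1 = (w1 * snapshot) @ steer
    s2 = (w2 * snapshot) @ steer
    k = np.argmax(np.abs(s1))            # detection point: peak beam of window 1
    # Phase comparison of the first and second windowed signals at the peak.
    phase_diff = np.abs(np.angle(s1[k] * np.conj(s2[k])))
    # Magnitude comparison, normalized for the taper's coherent-gain loss.
    mag_ratio = np.abs(s2[k]) / (np.abs(s1[k]) + 1e-12)
    score = (1 - phase_diff / np.pi) * min(mag_ratio * n / w2.sum(), 1.0)
    return score, score > threshold      # True target if score clears threshold
```

For a clean single plane wave both windowed spectra agree at the peak, so the score is near 1 and the detection is kept.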

Regarding claim 7, Yoffe as modified by Uesato teaches the radar sensor system of claim 1, accordingly the rejection of claim 1 above is incorporated. Yoffe further teaches the acts further comprising ([0503] Reference is made to FIG. 18, which schematically illustrates a graph 1800 depicting a plurality of AoA spectrums, in accordance with some demonstrative aspects.) determining the second window ([0507] In some demonstrative aspects, a curve 1804 depicts an AoA spectrum, for example, based on the IAA algorithm using the steering matrix including K same-angle steering vectors, followed by a plurality selected different-angle steering vectors, e.g., Ks*Ks different-angle steering vectors, which are selected based on the plurality of peaks 1812. [0508] In some demonstrative aspects, radar processor 834 (FIG. 8) may determine the AoA spectrum, e.g., curve 1804, to include first K elements of the IAA spectrum.) based on the first metric ([0504] For example, a curve 1802 depicts an AoA spectrum based on the 1D BF algorithm. [0505] For example, as shown in FIG. 18, a plurality of peaks 1812, which may be above a predefined threshold 1815, may be identified, e.g., by radar processor 834 (FIG. 8), for example, for selecting the different-angle steering vectors to be included in the IAA dictionary. [0506] In some demonstrative aspects, radar processor 834 (FIG. 8) may determine Ks selected angle values based on the plurality of peaks 1812, e.g., as described above).

Regarding claim 8, Yoffe as modified by Uesato teaches the radar sensor system of claim 1, accordingly the rejection of claim 1 above is incorporated. Yoffe further teaches wherein each ([0510] Reference is made to FIG. 19, which schematically illustrates a graph 1900 depicting AoA spectrums using different counts of different-angle steering vectors, in accordance with some demonstrative aspects.) of the first metric (Fig.
19; curve 1902; [0511] For example, a curve 1902 depicts an AoA spectrum according to an IAA algorithm using all possible 121*121=14641 possible steering vectors corresponding to all possible Tx-Rx combinations for a set of 121 possible angle values.) and the second metric (Fig. 19; curve 1904; [0512] For example, a curve 1904 depicts an AoA spectrum according to an IAA algorithm using 121 same-angle steering vectors and 18*18=364 selected different-angle steering vectors.) comprise a plurality of parameters having a plurality of parameter weights ([0694] In one example, the IAA algorithm may include an iterative estimation of a power spectrum, denoted pk, for example, based on the estimated covariance matrix, and a weight vector, denoted wk, corresponding to the steering matrix).

Regarding claim 9, Yoffe as modified by Uesato teaches the radar sensor system of claim 1, accordingly the rejection of claim 1 above is incorporated. Yoffe further teaches wherein: the first metric includes at least one of a first elevation angle, a first azimuth angle ([0506] In some demonstrative aspects, radar processor 834 (FIG. 8) may determine Ks selected angle values based on the plurality of peaks 1812, e.g., as described above.), or a first magnitude of the detection point ([0505] For example, as shown in FIG. 18, a plurality of peaks 1812, which may be above a predefined threshold 1815, may be identified, e.g., by radar processor 834 (FIG. 8), for example, for selecting the different-angle steering vectors to be included in the IAA dictionary.)
based on the first windowed signal ([0504] For example, a curve 1802 depicts an AoA spectrum based on the 1D BF algorithm.), and the second metric includes at least one of a second elevation, a second azimuth angle ([0507] In some demonstrative aspects, a curve 1804 depicts an AoA spectrum, for example, based on the IAA algorithm using the steering matrix including K same-angle steering vectors, followed by a plurality selected different-angle steering vectors, e.g., Ks*Ks different-angle steering vectors, which are selected based on the plurality of peaks 1812.), or a second magnitude of the detection point ([0509] In some demonstrative aspects, as shown in FIG. 18, the resulting AoA spectrum based on the IAA algorithm using the Ks selected angles may include two amplitude peaks 1820, which are above −20 dB corresponding to a radar object.) based on the second windowed signal ([0508] In some demonstrative aspects, radar processor 834 (FIG. 8) may determine the AoA spectrum, e.g., curve 1804, to include first K elements of the IAA spectrum.).

Regarding claim 10, Yoffe as modified by Uesato teaches the radar sensor system of claim 1, accordingly the rejection of claim 1 above is incorporated. Yoffe further teaches wherein the radar frame includes a plurality of detection points ([0199] In some demonstrative aspects, the radar information 813 may include Point Cloud 1 (PC1) information, for example, including raw point cloud estimations, e.g., Range, Radial Velocity, Azimuth and/or Elevation.), the acts further comprising clustering ([0200] In some demonstrative aspects, the radar information 813 may include Point Cloud 2 (PC2) information, which may be generated, for example, based on the PC1 information. For example, the PC2 information may include clustering information, tracking information, e.g., tracking of probabilities and/or density functions, bounding box information, classification information, orientation information, and the like.)
a portion of the detection points to form an evaluated object ([0951] For example, based on the mapping of the range-Doppler values 3306 to the fast-moving-close-range-object region, radar processor 834 (FIG. 8) may be configured to process radar data corresponding to range-Doppler values 3306 as radar data corresponding to range-Doppler values corresponding to fast moving close objects, e.g., cars).

Regarding claim 11, Yoffe as modified by Uesato teaches the radar sensor system of claim 10, accordingly the rejection of claim 10 above is incorporated. Yoffe further teaches wherein the first metric and the second metric each include at least one of: a quantity of the detection points corresponding to the evaluated object ([0230] In some demonstrative aspects, radar processor 834 may be configured to select the AoA spectrum estimation algorithm, for example, based on one or more requirements with respect to the AoA spectrum, for example, whether a soft output, e.g., an AoA spectrum, or a hard output, e.g., including a plurality of target detections.); a density of the detection points corresponding to the evaluated object; a size of the evaluated object; a geometry of the evaluated object; a stability of angles of the detection points of the evaluated object based on the first windowed signal relative to angles of the detection points of the evaluated object based on the second windowed signal ([0830] In some demonstrative aspects, different objects in the environment may have different characteristic features.
For example, a first type of an object, for example, an infrastructure object, e.g., a building, may be characterized by a shape and a location, while a second type of object, for example, vehicle may be characterized by a speed, a direction, a shape and a location.); or a stability of amplitudes of portions of the first windowed signal corresponding to the detection points of the evaluated object relative to amplitudes of portions of the second windowed signal corresponding to the detection points of the evaluated object (Fig. 17, ref no 1702 and 1704; [0479] In some demonstrative aspects, as shown in FIG. 17, curve 1704 includes, for a radar object, two amplitude peaks 1712, which are above −20 dB. In contrast, curve 1702 includes a large number of amplitude peaks, e.g., corresponding to MP signals. These multiple peaks of curve 1702 may not enable an accurate detection of the radar object.).

Regarding claim 12, Yoffe as modified by Uesato teaches the radar sensor system of claim 10, accordingly the rejection of claim 10 above is incorporated. Yoffe further teaches the acts further comprising: clustering a second portion of the detection points to form a second evaluated object ([0952] In some demonstrative aspects, as shown in FIG. 33, a second plurality of range-Doppler values 3304 may be detected in a slow-moving-object region. For example, radar processor 834 (FIG. 8) may be configured to map range-Doppler values 3304 to the slow-moving-object region. For example, based on the mapping of the range-Doppler values 3304 to the slow-moving-object region, radar processor 834 (FIG.
8) may be configured to process radar data corresponding to range-Doppler values 3304 as radar data corresponding to range-Doppler values corresponding to slow moving objects, e.g., pedestrians.); and determining if the detection point corresponds to a true target based on a comparison of at least one of an angle or a Doppler of the first evaluated object and of at least one of an angle or a Doppler of the second evaluated object (Fig. 13-14).

Regarding claim 14, Yoffe as modified by Uesato teaches the radar sensor system of claim 1, accordingly the rejection of claim 1 above is incorporated. Yoffe further teaches wherein the radar sensor system is in communication with an autonomous vehicle (Fig. 3; [0130] In some demonstrative aspects, radar processor 309 may be configured to provide the determined radar information to a system controller 310 of device/system 301. For example, system controller 310 may include a vehicle controller), the acts further comprising causing a mechanical system of the autonomous vehicle ([0212] In some demonstrative aspects, radar processor 834 may be configured to process radar data in a vehicle, e.g., an autonomous vehicle, for example, vehicle 100 (FIG. 1).) to be controlled based upon the indication ([0131] In some demonstrative aspects, system controller 310 may be configured to control one or more controlled system components 311 of the system 301, e.g. a motor, a brake, steering, and the like, e.g. by one or more corresponding actuators.).

Regarding claims 15 and 22, Yoffe teaches a method of determining if a detection point from a radar sensor system corresponds to a true target, the method comprising: generating a radar frame ([0348] In some demonstrative aspects, as shown in FIG.
9, AoA spectrum estimation processor 900 may be configured to process an input, for example, a Beam-Forming (BF) input 931 including radar data of a plurality of RD bins) comprising the detection point based on a return signal received by the radar sensor system from an environment of the radar sensor system ([0349] In one example, BF input 931 may include radar data of a radar frame, which may be determined based on the radar Rx data 811 (FIG. 8), e.g., as described above.), the return signal being received responsive to a radar signal transmitted into the environment by the radar sensor system (Fig. 8, Tx antennas 814; [0184] In some demonstrative aspects, radar frontend 804 may include one or more radios configured to generate and transmit the Tx RF signals via Tx antennas 814; and/or to process the Rx RF signals received via Rx antennas 816, e.g., as described below.), wherein the radar sensor system is in communication with an autonomous vehicle (Fig. 3; [0130] In some demonstrative aspects, radar processor 309 may be configured to provide the determined radar information to a system controller 310 of device/system 301. For example, system controller 310 may include a vehicle controller); applying a first window (Fig. 9, #912 BF algo 1) to the return signal on a beamforming stage to form a first windowed signal ([0355] In some demonstrative aspects, as shown in FIG. 9, AoA spectrum estimation processor 900 may include a memory 938 to store algorithm-specific information corresponding to the plurality of AoA estimation algorithms 910.); applying a second window different from the first window ([0355] For example, memory 938 may maintain algorithm-specific information defining a plurality of sets of algorithm-specific metrics for the plurality of AoA estimation algorithms 910.) to the return signal on the beamforming stage to form a second windowed signal (Fig.
9, #912 BF algo 2); determining if the detection point corresponds to a true target ([0329] In some demonstrative aspects, the one or more algorithm-specific metrics may include one or more target detection metrics corresponding to one or more types of targets supported for detection by the AoA spectrum estimation algorithm.); outputting ([0130] In some demonstrative aspects, radar processor 309 may be configured to provide the determined radar information to a system controller 310 of device/system 301) an indication that the detection point corresponds to a true target ([0189] In some demonstrative aspects, radar processor 834 may be configured to generate radar information 813, for example, based on the radar signals communicated by MIMO radar antenna 881, e.g., as described below. For example, radar processor 104 (FIG. 1), radar processor 210 (FIG. 1), radar processor 309 (FIG. 3), radar processor 402 (FIG. 4), and/or radar processor 503 (FIG. 5), may include one or more elements of radar processor 834, and/or may perform one or more operations and/or functionalities of radar processor 834.); and controlling a mechanical system of the autonomous vehicle ([0212] In some demonstrative aspects, radar processor 834 may be configured to process radar data in a vehicle, e.g., an autonomous vehicle, for example, vehicle 100 (FIG. 1).) based on the indication ([0131] In some demonstrative aspects, system controller 310 may be configured to control one or more controlled system components 311 of the system 301, e.g. a motor, a brake, steering, and the like, e.g. by one or more corresponding actuators.). 

Yoffe fails to teach determining a score based on a comparison of the first windowed signal and the second windowed signal, the comparison including a comparison of a first phase of the first windowed signal and a second phase of the second windowed signal and a first magnitude of the first windowed signal and second magnitude of the second windowed signal and determining that the detection point corresponds to a true target responsive to the score exceeding a threshold.

However, Uesato teaches a radar sensor system (Abstract: An electronic scanning radar device that detects an azimuth angle of a target based on a phase difference between a first pair of received waves) which determines a score based on comparison of the first windowed signal and the second windowed signal ([0054] FIG. 6 is a flowchart explaining a procedure for operation of the signal processing portion 24. First, the signal processing portion 24 performs FFT (Fast Fourier Transform) processing of the beat signals, and detects the frequency corresponding to the frequency difference between transmission waves and reception waves in the rising-frequency intervals and falling-frequency intervals of the transmission wave, that is, the beat frequency, as well as the phase difference .phi. between the pair of received waves W21 and W22 (S10).) the comparison including a comparison of a first phase of the first windowed signal and a second phase of the second windowed signal ([0054] Then, the signal processing portion 24 calculates the azimuth angle .theta. of the target from the phase difference .phi. of the received waves (S12).) and a first magnitude of the first windowed signal and second magnitude of the second windowed signal ([0058] FIG. 7A and FIG. 7B explain composite wave antenna patterns used in true/false judgments in the first configuration example. In FIG. 7A and FIG.
7B, the horizontal axis indicates the azimuth angle, and the vertical axis indicates the level of the received wave W21 or W22, or the level of the composite wave of the pair of received waves W21, W22.) and determining that the detection point corresponds to a true target responsive to the score exceeding a threshold ([0065] And, based on whether the level of the composite wave is equal to or greater than the threshold value T2, the signal processing portion 24 performs a true/false judgment of the azimuth angle .theta. (S84). At this time, if the level is equal to or greater than the threshold value T2, the detected azimuth angle .theta. can be judged to have been detected from waves received from a target which actually exists in the true/false judgment range, so that the detection result is judged to be true.).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoffe by including the signal phase comparison of Uesato to yield a predictable result of using the phase difference of the signals in addition to a magnitude threshold to prevent false detections due to phase wrapping as noted by Uesato ([0019]-[0020]).

Regarding claim 16, Yoffe discloses the method of claim 15, further comprising determining ([0274] For example, the environment information may be received at input 832, for example, from a camera, a rain sensor, a GPS system, e.g., including maps and/or an urban database, a sonar, e.g., to detect a very close range clutter, and/or any other additional or alternative sensor.) if the detection point corresponds to a true target based on a characteristic of the environment.
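The Uesato procedure the examiner relies on (estimate azimuth from the phase difference between a pair of received waves, then accept or reject the detection by comparing the level of their composite wave against a threshold) can be sketched as follows. The function name, the half-wavelength element spacing, and the threshold value are illustrative assumptions, not values from the reference.

```python
import numpy as np

def azimuth_from_phase(w21, w22, d_over_lambda=0.5, level_threshold=1.0):
    """Sketch of a phase-monopulse true/false judgment: azimuth from the
    phase difference phi between two channels, kept only if the composite
    wave level clears a threshold. Parameter values are assumptions."""
    # Average phase of channel w22 relative to channel w21.
    phi = np.angle(np.vdot(w21, w22))
    # Azimuth from phase difference: phi = 2*pi*(d/lambda)*sin(theta).
    theta = np.arcsin(phi / (2 * np.pi * d_over_lambda))
    # Composite-wave level used for the true/false judgment.
    level = np.abs(np.mean(w21 + w22))
    return theta, level >= level_threshold
```

The threshold step is what suppresses ghost angles from phase wrapping: a wrapped phase yields an azimuth where the composite antenna pattern has low gain, so the level test fails and the detection is judged false.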

Regarding claim 17, Yoffe discloses the method of claim 15, further comprising generating at least one of range data or Doppler data based on the return signal ([0162] In some demonstrative aspects, the result of the second FFT may provide, e.g., when aggregated over the antennas, a range/Doppler (R/D) map 505. The R/D map may have FFT peaks 506, for example, including peaks of FFT output values (in terms of absolute values) for certain range/speed combinations, e.g., for range/Doppler bins. For example, a range/Doppler bin may correspond to a range bin and a Doppler bin.), and determining if the detection point corresponds to a true target based on the at least one of the range data or the Doppler data ([0162] For example, radar processor 503 may consider a peak as potentially corresponding to an object, e.g., of the range and speed corresponding to the peak's range bin and speed bin.).

Regarding claim 20, Yoffe discloses an autonomous vehicle ([0212] In some demonstrative aspects, radar processor 834 may be configured to process radar data in a vehicle, e.g., an autonomous vehicle, for example, vehicle 100 (FIG. 1).), comprising: a mechanical system ([0131] In some demonstrative aspects, system controller 310 may be configured to control one or more controlled system components 311 of the system 301, e.g. a motor, a brake, steering, and the like, e.g. by one or more corresponding actuators.); a radar sensor system (Fig.
8), comprising: a transmit antenna ([0181] In some demonstrative aspects, radar frontend 804 may be implemented as part of a MIMO radar utilizing a MIMO radar antenna 881 including a plurality of Tx antennas 814 configured to transmit a plurality of Tx RF signals) configured to transmit a radar signal into an environment of the radar sensor system; and a receive antenna ([0181] and a plurality of Rx antennas 816 configured to receive a plurality of Rx RF signals) configured to receive a return signal from the environment of the radar sensor system responsive to the radar signal; and a computing system (Fig. 8, ref. no. 834 Radar Processor), comprising: a processor (Fig. 8, ref. no. 836 Processor); and memory (Fig. 8, ref. no. 838 Memory) that stores instructions that, when executed by the processor, cause the processor to perform acts comprising: generating a radar frame ([0348] In some demonstrative aspects, as shown in FIG. 9, AoA spectrum estimation processor 900 may be configured to process an input, for example, a Beam-Forming (BF) input 931 including radar data of a plurality of RD bins) having a detection point based on the return signal ([0349] In one example, BF input 931 may include radar data of a radar frame, which may be determined based on the radar Rx data 811); applying a first window (Fig. 9, #912 BF algo 1) to the return signal on a beamforming stage to form a first windowed signal ([0355] In some demonstrative aspects, as shown in FIG. 9, AoA spectrum estimation processor 900 may include a memory 938 to store algorithm-specific information corresponding to the plurality of AoA estimation algorithms 910.); applying a second window different from the first window ([0355] For example, memory 938 may maintain algorithm-specific information defining a plurality of sets of algorithm-specific metrics for the plurality of AoA estimation algorithms 910.) to the return signal on the beamforming stage to form a second windowed signal (Fig. 
9, #912 BF algo 2); determining if the detection point corresponds to a true target based on a comparison of the first windowed signal and the second windowed signal ([0329] In some demonstrative aspects, the one or more algorithm-specific metrics may include one or more target detection metrics corresponding to one or more types of targets supported for detection by the AoA spectrum estimation algorithm.); and outputting ([0130] In some demonstrative aspects, radar processor 309 may be configured to provide the determined radar information to a system controller 310 of device/system 301) an indication that the detection point corresponds to a true target ([0189] In some demonstrative aspects, radar processor 834 may be configured to generate radar information 813, for example, based on the radar signals communicated by MIMO radar antenna 881, e.g., as described below. For example, radar processor 104 (FIG. 1), radar processor 210 (FIG. 1), radar processor 309 (FIG. 3), radar processor 402 (FIG. 4), and/or radar processor 503 (FIG. 5), may include one or more elements of radar processor 834, and/or may perform one or more operations and/or functionalities of radar processor 834.). 
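For illustration, the dual-window consistency check at the heart of claims 1, 15 and 20 might look like the following sketch (not from the record; numpy, the specific window choices, FFT size, and scoring formula are all assumptions made for the example):

```python
import numpy as np

def dual_window_score(snapshot, n_fft=256):
    """Beamform one antenna snapshot twice, under two different windows,
    and score the phase/magnitude consistency of the resulting peaks.
    A return from a real target is stable under a window change; multipath
    and sidelobe artifacts typically are not."""
    w1, w2 = np.hamming(len(snapshot)), np.blackman(len(snapshot))
    s1 = np.fft.fft(snapshot * w1, n_fft)   # first windowed signal
    s2 = np.fft.fft(snapshot * w2, n_fft)   # second windowed signal
    k = int(np.argmax(np.abs(s1)))          # peak bin of the first spectrum
    # Compare the two phases, and the two coherent-gain-normalized magnitudes.
    phase_diff = abs(np.angle(s1[k] * np.conj(s2[k])))
    mag_ratio = (np.abs(s1[k]) / w1.sum()) / (np.abs(s2[k]) / w2.sum())
    # High score only when both the phases and the magnitudes agree.
    return 1.0 / (1.0 + phase_diff + abs(np.log(mag_ratio)))

def is_true_target(snapshot, threshold=0.8):
    """Declare a true target when the consistency score exceeds a threshold."""
    return bool(dual_window_score(snapshot) > threshold)
```

A clean single-source snapshot, e.g. `np.exp(1j * 2 * np.pi * 0.1 * np.arange(8))`, produces near-identical peaks under both windows and passes the threshold.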
Yoffe fails to teach determining a score based on a comparison of the first windowed signal and the second windowed signal, the comparison including a comparison of a first phase of the first windowed signal and a second phase of the second windowed signal and a comparison of a first magnitude of the first windowed signal and second magnitude of the second windowed signal and determining that the detection point corresponds to a true target responsive to the score exceeding a threshold. However, Uesato teaches a radar sensor system (Abstract: An electronic scanning radar device that detects an azimuth angle of a target based on a phase difference between a first pair of received waves) which determines a score based on a comparison of the first windowed signal and the second windowed signal ([0054] FIG. 6 is a flowchart explaining a procedure for operation of the signal processing portion 24. First, the signal processing portion 24 performs FFT (Fast Fourier Transform) processing of the beat signals, and detects the frequency corresponding to the frequency difference between transmission waves and reception waves in the rising-frequency intervals and falling-frequency intervals of the transmission wave, that is, the beat frequency, as well as the phase difference .phi. between the pair of received waves W21 and W22 (S10).) the comparison including a comparison of a first phase of the first windowed signal and a second phase of the second windowed signal ([0054] Then, the signal processing portion 24 calculates the azimuth angle .theta. of the target from the phase difference .phi. of the received waves (S12).) and a first magnitude of the first windowed signal and second magnitude of the second windowed signal ([0058] FIG. 7A and FIG. 7B explain composite wave antenna patterns used in true/false judgments in the first configuration example. In FIG. 7A and FIG. 
7B, the horizontal axis indicates the azimuth angle, and the vertical axis indicates the level of the received wave W21 or W22, or the level of the composite wave of the pair of received waves W21, W22.) and determining that the detection point corresponds to a true target responsive to the score exceeding a threshold ([0065] And, based on whether the level of the composite wave is equal to or greater than the threshold value T2, the signal processing portion 24 performs a true/false judgment of the azimuth angle .theta. (S84). At this time, if the level is equal to or greater than the threshold value T2, the detected azimuth angle .theta. can be judged to have been detected from waves received from a target which actually exists in the true/false judgment range, so that the detection result is judged to be true.). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoffe by including the signal phase comparison of Uesato to yield a predictable result of using the phase difference of the signals in addition to a magnitude threshold to prevent false detections due to phase wrapping as noted by Uesato ([0019]-[0020]). Regarding claim 21, Yoffe as modified by Uesato teaches the radar sensor system of claim 1, accordingly the rejection of claim 1 above is incorporated. Yoffe further teaches wherein the comparison includes a comparison of a first magnitude of the first windowed signal and second magnitude of the second windowed signal (). Claim(s) 3 is rejected under 35 U.S.C. 103 as being unpatentable over Yoffe as modified by Uesato as applied to claim 2 above, and further in view of Wang et al. (WO 2020176483). Regarding claim 3, Yoffe as modified by Uesato teaches the radar sensor system of claim 2, accordingly the rejection of claim 2 above is incorporated. 
Yoffe and Uesato fail to teach wherein the characteristic of the environment comprises a location of a reflective surface in map data of the environment of the radar sensor system. However, Wang teaches a radar sensor system for autonomous vehicles with false detection mitigation wherein a characteristic of the environment comprises a location of a reflective surface ([0029] In at least one example, the vehicle computing device(s) 172 can utilize the radar data 168 and the sensor data 170 captured by the sensor system(s) 164 in a reflection recognition component 174. For example, the reflection recognition component 174 can receive the radar data 168, including the object return 122, the first reflected return 138, and the second reflected return 140, and determine that the reflected returns 138, 140 are reflected returns, not returns associated with actual objects in the environment.) in map data of the environment of the radar sensor system ([0030] Additionally, or alternatively, map data available to the vehicle 102 (such as may be downloaded from time to time based on a location of the vehicle 102, or otherwise accessible to the vehicle) may comprise three-dimensional representations of an environment, such as, for example, a mesh of the local three-dimensional environment. In such an example, the reflection recognition component 174 may determine that first reflected return 138 is a non-physical return based on the knowledge of corresponding building 114). Yoffe, Uesato and Wang are all considered to be analogous to the claimed invention because they are in the same field of endeavor of radar sensor systems for vehicles with false detection mitigation technology. 
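Wang's map-based reflection recognition can be illustrated with a toy 2-D geometry (a sketch under assumed conventions; Wang's actual component operates on full 3-D mesh map data, and the wall position, coordinates, and tolerance below are invented for the example):

```python
def mirror_across_wall(point, wall_x):
    """Mirror a 2-D point across a vertical reflective surface at x = wall_x."""
    x, y = point
    return (2 * wall_x - x, y)

def is_reflection_ghost(detection, known_objects, wall_x, tol=0.5):
    """Flag a detection as a multipath ghost: it lies behind a mapped
    reflective surface and mirrors onto a known real object."""
    if detection[0] <= wall_x:        # in front of the wall: plausibly real
        return False
    mx, my = mirror_across_wall(detection, wall_x)
    return any(abs(mx - ox) <= tol and abs(my - oy) <= tol
               for ox, oy in known_objects)
```

With a mapped wall at x = 10 and a real object at (6, 0), a detection at (14, 0) mirrors exactly onto the real object and is flagged as a ghost.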
A person of ordinary skill in the art would have had the technological capability to incorporate the data fusion techniques of Wang with the radar sensor system of Yoffe as modified by Uesato to yield the predictable result of improved accuracy for object detection and classification ([0031] The map data may be helpful to identify static, fixed objects, e.g., the buildings 114, ground topologies, street signs, utility fixtures, or the like.). Claim(s) 4 is rejected under 35 U.S.C. 103 as being unpatentable over Yoffe as modified by Uesato as applied to claim 1 above, and further in view of Saxena et al. (US PG Pub 20220180146), hereinafter Saxena. Regarding claim 4, Yoffe as modified by Uesato teaches the radar sensor system of claim 1, accordingly the rejection of claim 1 above is incorporated. Yoffe further teaches wherein: the at least one transmit antenna is further configured to transmit a second radar signal ([0155] In some demonstrative aspects, the digital reception data values may be represented in the form of a data cube 504. For example, the data cube 504 may include digitized samples of the radio receive signal, which is based on a radio signal transmitted from a transmit antenna and received by M receive antennas. In some demonstrative aspects, for example, with respect to a MIMO implementation, there may be multiple transmit antennas, and the number of samples may be multiplied accordingly. [0156] In some demonstrative aspects, a layer of the data cube 504, for example, a horizontal layer of the data cube 504, may include samples of an antenna, e.g., a respective antenna of the M antennas.) into the environment of the radar sensor system; and the at least one receive antenna is further configured to receive a second return signal ([0157] In some demonstrative aspects, data cube 504 may include samples for K chirps. For example, as shown in FIG. 5, the samples of the chirps may be arranged in a so-called “slow time”-direction.) 
from the environment of the radar sensor system responsive to the second radar signal; the acts further comprising: generating the radar frame having a second detection point based on the second return signal ([0537] In one example, a snapshot of the radar frame may include a radar measurement, e.g., of the radar Rx data 811, having a coherency in one or more relevant dimensions. For example, the snapshot of the radar frame may include a radar frame measurement of a received radar frame of a sequence of coherently transmitted radar chirps, for example, a sequence of coherently transmitted Frequency-Modulated Continuous Wave (FMCW) chirps and/or any other type of chirps, e.g., with a coherent reference phase.). Yoffe as modified by Uesato fails to teach where forming the first windowed signal includes applying a first weight of the first window to the first return signal and applying a second weight of the first window to the second return signal; forming the second windowed signal includes applying a third weight of the second window to the first return signal and applying a fourth weight of the second window to the second return signal; and at least one of the third weight is different from the first weight or the fourth weight is different from the second weight. However, Saxena teaches a computer system (Fig. 4, multi-objective automated machine learning system 400) which is capable of performing signal processing ([0083] The MOAML system 400 system further includes one or more input devices 410 and one or more output devices 412 communicatively coupled to the communications bus 402.) on the radar frame (Fig. 5, Data 502) wherein forming (Fig. 5, objectives 508; [0086] Also, in some embodiments, domain-specific custom objectives may be used.... In general, a transparent box is an objective where the functional or analytical form of the objective function f(x) is known, or in some instances, does not exist. 
For example, if a function is defined requiring a 2-dimensional input vector similar to F((x1, x2))=[x1^2+sqrt(x2)], then the functional form of this function is known. [0088] In general, the objectives are expressed as F = {f1, f2, . . . fn} and the objective weighting factors are expressed as W = { w1, w2, . . . wn}, where n is an integer, a set S of Pareto-optimal solutions will be generated, where there are N Pareto-optimal solutions in the set S, where there is no relationship between the number of Pareto-optimal solutions (N) and the number of objective functions or number of weights (n).) the first windowed signal includes applying a first weight of the first window to the first return signal and applying a second weight of the first window to the second return signal; forming the second windowed signal includes applying a third weight of the second window to the first return signal and applying a fourth weight of the second window to the second return signal; and at least one of the third weight is different from the first weight or the fourth weight is different from the second weight ([0091] Accordingly, the weights 525 for each of the objectives 508 may be user-selected based on the user's determinations of the relative importance of each objective 508). Yoffe, Uesato and Saxena are all considered to be analogous to the claimed invention because they are in the same field of endeavor of methods and systems for identifying objects observed by vehicle radar sensor technology. 
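Saxena's dominance definition (¶[0076], as quoted in the rejection) translates directly into code. A minimal sketch, assuming lower-is-better objective values and solutions given as tuples of objective scores:

```python
def dominates(a, b):
    """a dominates b: a is at least as good in every objective (lower is
    better here) and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_set(solutions):
    """The Pareto-optimal set S: the solutions no other solution dominates."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]
```

For instance, among the candidate solutions (1, 2), (2, 1), (2, 2) and (3, 3), only the first two are non-dominated and survive as the Pareto-optimal set.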
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the signal processing system of Saxena into the radar sensing system of Yoffe as modified by Uesato to yield a predictable result of a more versatile and robust radar sensing system through leveraging a powerful and flexible means of target classification. As noted by Saxena, not all machine learning systems are suited to multi-objective optimization ([0075] Some known machine learning systems configured for single objective optimization may lead to a sub-optimal machine learning model due to the imbalanced nature of the input data or may yield poor values for other objectives due to focus on only a single objective.), whereas Pareto-optimal solutions are well suited ([0076] Some known mechanisms to perform multi-objective optimization include generating multiple optimization solutions and evaluating the solutions through analyzing the dominance thereof. Specifically, dominance is used to determine the quality of the solutions where a first solution is said to dominate a second solution if the first solution is better than or at least equal to the second solution in all objectives, and the first solution is strictly better than the second solution in at least one objective. Those solutions which are not dominated by any of the other solutions in light of all of the objectives are referred to as a Pareto-optimal solutions through Pareto optimization, i.e., Pareto-optimal solutions are non-dominated solutions and no other solution dominates them.). Claim(s) 13 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Yoffe as modified by Uesato as applied to claim 1 above, and further in view of Carlström (US PG Pub. 20240159889), hereinafter Carlström. 
Regarding claims 13 and 18, Yoffe as modified by Uesato teaches the radar sensor system of claim 1 and the method of claim 15, further comprising: determining a relative velocity of the detection point relative to the radar sensor system ([0067] In some demonstrative aspects, radar device 101 may be implemented as part of a vehicular system, for example, a system to be implemented and/or mounted in vehicle 100. [0068] In one example, radar device 101 may be implemented as part of an autonomous vehicle system, [0079] In some demonstrative aspects, the one or more parameters, attributes and/or information with respect to the object may include a range of the objects from the vehicle 100, an angle of the object with respect to the vehicle 100, a location of the object with respect to the vehicle 100, a relative speed of the object with respect to vehicle 100, and/or the like.); generating an ego motion estimation of the radar sensor system ([0275] In some demonstrative aspects, at least part of the environment information may be determined by radar processor 834, for example, based on analysis and/or processing of data, for example, data from previous radar frames, data from analysis of Rx data 811, e.g., number of the targets, Doppler of objects, ego speed estimation, and/or the like.). Yoffe as modified by Uesato fails to teach normalizing the relative velocity value of the detection point to the ego motion estimation to form an absolute velocity value of the detection point; and determining if the detection point corresponds to a true target based on the absolute velocity value of the detection point. 
However, Carlström teaches a method for identifying ‘ghost objects’ observed by vehicular radar systems which includes determining the ego motion of the vehicle ([0023] The ego-motion of the vehicle may be determined based on measurements by the radar sensor or may be determined based on auxiliary measurement provided by a motion sensor such as, for example, an on-board odometry sensor, a GPS-based speedometer, or the like.) and normalizing the relative velocity value of the detection point to the ego motion estimation to form an absolute velocity value of the detection point ([0044] In other words, the threshold may be an adaptive threshold that depends on the ego-motion of the radar sensor velocities of two or more of the tracked objects. – This is the threshold used to determine the third condition of the set of conditions mentioned in paragraphs [0008] and [0041]); and determining ([0008] One embodiment relates to a computer-implemented method for identifying a ghost object observed by a radar sensor used to track objects, the method comprising: receiving a reflected radar signal from a candidate object; and identifying the candidate object as a ghost object in case a set of conditions is met, otherwise identifying the candidate object as a real object.) if the detection point corresponds to a true target based on the absolute velocity value of the detection point ([0041] The set of conditions may further comprise a third condition which is met when a difference between the measured range rate of the candidate object and a predicted range rate of the candidate object is below a threshold.). Yoffe, Uesato and Carlström are all considered to be analogous to the claimed invention because they are in the same technological field of endeavor, specifically, methods and systems for identifying objects observed by vehicle radar sensor technology. 
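The ego-motion-corrected velocity test attributed to Carlström can be sketched in a few lines (an illustration only: it assumes straight-line ego motion, an azimuth measured from the direction of travel, a range rate that is negative when closing, and an arbitrary tolerance):

```python
import math

def absolute_radial_velocity(range_rate, azimuth_rad, ego_speed):
    """Normalize a detection's measured (relative) range rate to the ego
    motion: a static object dead ahead appears to close at ego_speed, so
    adding back ego_speed * cos(azimuth) recovers its own radial velocity."""
    return range_rate + ego_speed * math.cos(azimuth_rad)

def passes_velocity_test(range_rate, azimuth_rad, ego_speed, tol=0.5):
    """True when the ego-corrected velocity is consistent with a static
    real object; multipath ghosts often fail this check."""
    return abs(absolute_radial_velocity(range_rate, azimuth_rad, ego_speed)) < tol
```

For example, a detection straight ahead closing at 10 m/s while the ego vehicle drives at 10 m/s normalizes to an absolute velocity of zero, consistent with a real static object.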
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the radar sensor system and methods of Yoffe as modified by Uesato in view of Carlström to incorporate the ego-motion-corrected velocity test as taught by Carlström to gain the advantage of an additional test to check for multipath false signals that would increase the accuracy of object detection; and also since it has been held that if a technique has been used to improve one device, and a person of ordinary skill in the art would recognize that it would improve similar devices in the same way, using the technique is obvious unless its actual application is beyond his or her skill (MPEP 2143). For applicant’s benefit, portions of the cited reference(s) have been cited to aid in the review of the rejection(s). While every attempt has been made to be thorough and consistent within the rejection, it is noted that the PRIOR ART MUST BE CONSIDERED IN ITS ENTIRETY, INCLUDING DISCLOSURES THAT TEACH AWAY FROM THE CLAIMS. See MPEP 2141.02 VI. 
Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: CN 113009448 discloses a multi-path target detection method, device, and storage medium, wherein the method comprises: under the condition that at least two targets are identified in the point cloud of the radar frame, inverting a multi-path target of the first target according to the radar information of the first target and the real-time environment information corresponding to the radar frame; the first target is any one of the at least two targets; obtaining the matching degree of the second target and the inverted multi-path target; the second target is any target other than the first target among the at least two targets; when the matching degree is greater than a preset threshold value, detecting the second target as a multi-path target of the first target in the radar frame. The invention can detect the multi-path target and improve the tracking performance of the radar. EP 3588128 discloses a method for detection, including height and azimuth estimation, of objects in a scene by radar processing. Radar signals are emitted to the scene and radar signals reflected from the scene are received using at least one multi-channel radar sensor; the radar sensor comprises a two-dimensional array of sensor elements with spatial diversity in horizontal and vertical axes. Measurement signals of the at least one radar sensor are processed to detect objects in the scene and a height and azimuth estimation of one or several detected objects is performed using compressed sensing. 
The compressed sensing is based on a combination of a near-field multipath observation model, which includes ground-reflected path contributions of the emitted and reflected radar signals for a flat surface, with possibly unknown reflection coefficient, for pairs of transmitting and receiving elements, and a group-sparse estimation model for measurement signals of mutually incoherent measurements taken at different distances from the detected object(s) or from mutually incoherent radar sensors or sensor elements. US 9128174 discloses an electronic scan type radar apparatus configured to transmit an electric wave and calculate an angle of a target based on a phase difference of respective reception signals, thereby detecting a target position. An antenna unit transmits and receives the electric wave and is provided with two transmission antennae. A transmission unit alternately transmits an electric wave having a first beam pattern and an electric wave having a second beam pattern from the two transmission antennae. First and second reception units calculate arrival angles and reception levels of reflected waves calculated from respective reception signals which are obtained by receiving the reflected waves by the first and second beam patterns. A comparison unit compares the reception levels by combining the arrival angles of the reflected waves. A determination unit determines whether a target actually exists at the arrival angles in accordance with a comparison result. 
US 20150204972 discloses a radar sensor for a motor vehicle and method for angle estimation of radar targets based on an antenna diagram that indicates, for various configurations of radar targets, pertinent amplitudes and/or phase correlations between signals which are obtained for the relevant configuration in multiple evaluation channels of the radar sensor, wherein for a single real target, the occurrence of a number n of apparent targets, which are caused by reflection of the signal coming from the real target from elongated objects, is modeled mathematically; a correlation between the location angle of the real target and the location angles of the apparent targets is calculated; and to estimate the location angle of the real target, a multi-target estimate is performed in an n-dimensional search space and the search is limited to a sub-space that is determined by the calculated correlation. US 8111192 discloses a method for noise discrimination in signals from a plurality of sensors by enhancing the phase difference in the signals such that off-axis pick-up is suppressed while on-axis pick-up is enhanced. Alternatively, attenuation/expansion are applied to the signals in a phase difference dependent manner, consistent with suppression of off-axis pick-up and on-axis enhancement. Nulls between sensitivity lobes are widened, effectively narrowing the sensitivity lobes and improving directionality and noise discrimination. US 20060114146 discloses a multi-targeting method for locating short-range target objects in terms of distance and angle. 
The method includes the following steps: a) a characteristic signal is emitted by a transmitting antenna of a first sensor element; b) the reflected characteristic signal is received by at least two adjacent reception antennae of the first sensor element; c) the difference in transit time of the reflected characteristic signal to the two adjacent reception antenna of the first sensor element is measured in order to determine the distance between the target objects and the first sensor element; and d) the phase differences of the characteristic signal between the two adjacent reception antenna of the first sensor element are measured in order to determine the angles between the target objects and the first sensor element. In addition, a device for implementing the above-mentioned method. JP 2006053025 discloses a radar system for the target detection based on the Doppler frequency axis obtained by Fourier transform of the signal communicated between the pulse repetition interval PRI and the tracking process comprises: a weight operation for computing a plurality of complex weight different in inclination to the Doppler frequency axis of the phase pattern by offsetting the reference time of the PRI; a Fourier transform process for obtaining filter data by the filter bank formed by the Fourier transform formed by successively set up with the computed plurality of complex weight; a maximum value selection process for detecting the maximum value among the plurality of complex weight with respect to every filter constituting the filter bank; and a target detection process for detecting the target on the basis of the detection result in the maximum value process part. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). 
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN BS ABRAHAM whose telephone number is (571)272-4145. The examiner can normally be reached Monday - Friday 9:00 am - 5:00 pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jack Keith can be reached at (571)272-6878. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. 
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /JBSA/Examiner, Art Unit 3646 /JACK W KEITH/Supervisory Patent Examiner, Art Unit 3646

Prosecution Timeline

Jan 05, 2024
Application Filed
Oct 17, 2025
Non-Final Rejection — §103, §112
Jan 06, 2026
Interview Requested
Jan 22, 2026
Response Filed
Jan 26, 2026
Applicant Interview (Telephonic)
Jan 28, 2026
Examiner Interview Summary
Mar 23, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12584991
UWB-BASED IN-VEHICLE 3D LOCALIZATION OF MOBILE DEVICES
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner, based on the most recent grant.

Prosecution Projections

3-4
Expected OA Rounds
71%
Grant Probability
99%
With Interview (+40.0%)
2y 4m
Median Time to Grant
Moderate
PTA Risk
Based on 7 resolved cases by this examiner. Grant probability derived from career allow rate.
