Prosecution Insights
Last updated: April 19, 2026
Application No. 18/596,395

Radar-Based Input Controls

Non-Final OA — §103, §112
Filed
Mar 05, 2024
Examiner
RAYNAL, ASHLEY BROWN
Art Unit
3648
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Google LLC
OA Round
1 (Non-Final)
78%
Grant Probability
Favorable
1-2
OA Rounds
2y 9m
To Grant
99%
With Interview

Examiner Intelligence

Grants 78% — above average
78%
Career Allow Rate
28 granted / 36 resolved
+25.8% vs TC avg
Strong +23% interview lift
+22.7%
Interview Lift
among resolved cases with an interview
Typical timeline
2y 9m
Avg Prosecution
33 currently pending
Career history
69
Total Applications
across all art units

Statute-Specific Performance

§101
7.5%
-32.5% vs TC avg
§103
48.4%
+8.4% vs TC avg
§102
19.6%
-20.4% vs TC avg
§112
24.6%
-15.4% vs TC avg
Tech Center averages are estimates • Based on career data from 36 resolved cases
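The headline figures above are simple ratios over the examiner's resolved docket. A minimal sketch of the arithmetic; the Tech Center baselines here are backed out of the displayed deltas, not independently sourced:

```python
# Career allow rate and per-statute deltas, as shown in the cards above.
# The implied TC baselines are illustrative, derived from the "vs TC avg"
# figures, not independent data.

granted, resolved = 28, 36
career_allow_rate = granted / resolved          # ≈ 0.778, shown as 78%

# (examiner overcome/allow rate, delta vs Tech Center average) per statute
statute_rates = {
    "§101": (0.075, -0.325),
    "§103": (0.484, +0.084),
    "§102": (0.196, -0.204),
    "§112": (0.246, -0.154),
}

for statute, (rate, delta) in statute_rates.items():
    implied_tc_avg = rate - delta               # baseline implied by the card
    print(f"{statute}: examiner {rate:.1%}, implied TC avg {implied_tc_avg:.1%}")
```

Notably, every card implies the same 40% Tech Center baseline, which suggests a single TC-wide estimate is being compared against each statute.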

Office Action

§103 §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

The following is a non-final, first office action in response to the communication filed 03/05/2024. Claims 1-12 are currently pending and have been examined.

Priority

Applicant’s claim for the benefit of a prior-filed application under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, 365(c), or 386(c) is acknowledged. Claims 1-12 have support in PRO 63/039,013, and the instant application is entitled to the benefit of the provisional application, with an effective filing date of 02/29/2024.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 03/05/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the examiner.

Specification

Applicant is reminded of the proper language and format for an abstract of the disclosure. The language should be clear and concise and should not repeat information given in the title. It should avoid using phrases which can be implied, such as “The disclosure concerns,” “The disclosure defined by this invention,” “The disclosure describes,” etc.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-12 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding claim 1, line 2 recites “one or more radar transmission antennas”, and line 8 recites “the plurality of radar transmission antennas”. It is unclear whether the plurality of antennas of line 8 is intended to refer to the one or more antennas of line 2. For examination purposes, line 8 will be read as “the one or more radar transmission antennas”. Claims 2-12 are also rejected since the claims are dependent on a previously rejected claim.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-4 and 9-11 are rejected under 35 U.S.C. 103 as being unpatentable over Hsu et al. (US-12353638-B1; hereinafter Hsu) in view of Va et al.
(US-20240369684-A1; hereinafter Va).

Regarding claim 1, Hsu discloses [Note: what Hsu fails to disclose is shown in strike-through]: A system (see at least Fig. 1, apparatus 102) comprising: at least one transmit channel feeding one or more radar transmission antennas (see at least col. 14, lines 55-62; “The radar sensor 130 may comprise a plurality of antennas that radiate and collect electromagnetic signals. Antennas may be placed in various locations with a radar FOV 132 directing the gain of the antenna(s) in a particular direction. In some implementations, the radar sensor 130 and the antennas may be integrated into single device. In other implementations, the radar sensor 130 and the antennas may be separated, such as via a feedline.”); a plurality of radar reception antennas (see at least col. 14, lines 49-52; “The receive antenna index for each radar 130 is denoted by i, where i=0, 1, . . . , Nrx−1, with Nrx being the total number of receivers in each radar 130.”), one or more radar reception antennas of the plurality of radar reception antennas corresponding to an input location of a plurality of input locations (see at least Fig. 7, where inputs to the screen are detected by a plurality of radar detectors 130); and radar control circuitry (see at least col. 7, lines 51-53; “In one implementation the radar sensor 130 may comprise the BGT60TR13C 60 GHz radar sensor from Infineon Technologies AG.”) operatively coupled to the plurality of radar transmission antennas and the plurality of radar reception antennas (see at least col. 14, lines 59-62; “In some implementations, the radar sensor 130 and the antennas may be integrated into single device. In other implementations, the radar sensor 130 and the antennas may be separated, such as via a feedline.”), the radar control circuitry configured to: generate a transmission signal via the plurality of radar transmission antennas; receive one or more reflections of the transmission signal via at least one radar reception antenna of the plurality of radar reception antennas (see at least col. 14, lines 55-56; “The radar sensor 130 may comprise a plurality of antennas that radiate and collect electromagnetic signals.”), the one or more reflections of the transmission signal reflected from an object (see at least col. 7, lines 47-50; “A radar sensor 130 uses millimeter wavelength radio signals to acquire information about an object. The radar sensor 130 emits radio signals and detects the reflections of those radio signals.”); analyze, in response to the receipt, the one or more reflections of the transmission signal (see at least Fig. 4, flow diagram of analysis of radar signals starting with acquiring radar data at 402); determine, based on the analysis of the one or more reflections, whether the object comprises a hand of a user (see at least Fig. 2, lines 48-51; “By using the apparatus and techniques described in this disclosure, touch data indicative of a position of a user's hand may be accurately and cost effectively determined for large display devices.”) or an electronic device; determine, responsive to a determination that the one or more reflections are indicative of the hand of the user, an action performed by the user based at least in part on a spatial location of the action relative to a respective input location of the plurality of input locations (see at least col. 5, lines 14-17; “The touch data 148 may be used to operate the apparatus 102 or other devices. For example, the touch data 148 may be used to determine a user selection of a control that is presented on the display 110.”); and cause an action to be performed in response to the determination of the action, the action indicative of a user command to perform a function associated with the action (see again col. 5, lines 14-17).

However, Hsu does not explicitly teach that the measured touch data may be in the form of a gesture. Hsu discloses a system using radar for touch interaction with a non-touchscreen display. Va is directed to virtual touch interaction using radar for any display device. Va teaches: determine, responsive to a determination that the one or more reflections are indicative of the hand of the user (see at least [0110]; “The radar 540 can be configured to form sharp beams that can be used to constrain the detection region 542 to be a thin region near the screen 530. The detection region 542 is shown as a translucent trapezoid for ease of illustration. At high frequencies, such as mmWave and THz frequencies, a form-factor of a few centimeters can accommodate tens of antennas, which can support high angular resolution. The excellent ranging and high angular resolution of the radar 540 are useful for determining the precise location of the hand 510 in the sensing detection region 542.”), a gesture performed by the user based at least in part on a spatial location of the gesture relative to a respective input location of the plurality of input locations (see at least [0111] – [0112]; “Doppler processing can be used to detect a virtual click event or virtual swipe event to support virtual touch functionality… In some embodiments, a virtual click event occurs when the position of at least one button 522-524 is overlapped by a specific position where the hand 510 enters (and/or exits) the detection region 542.”); and cause an action to be performed in response to the determination of the gesture, the gesture indicative of a user command to perform a function associated with the gesture (see at least [0134]; “Some examples of those explicit touch events include click events, swipe/scroll events (which are the same motion-wise, but with different contexts), drag and drop, zooming, etc.”).

Both Hsu and Va use radar detectors to give touchscreen functionality to non-touchscreen devices. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the touch data used in Hsu to include recognition of gestures, as taught by Va. One of ordinary skill would be motivated to include recognition of touch-screen gestures in order to be able to recognize user directions to swipe, scroll, drag and zoom, as recognized by Va (see Va at least [0134]).

Regarding claim 2, Hsu in view of Va teaches the system of claim 1.
Va further teaches: wherein the gesture includes a movement performed by one or more digits of the hand of the user being moved relative to the one or more input locations of the plurality of input locations (see at least [0111] – [0112]; “Doppler processing can be used to detect a virtual click event or virtual swipe event to support virtual touch functionality… In some embodiments, a virtual click event occurs when the position of at least one button 522-524 is overlapped by a specific position where the hand 510 enters (and/or exits) the detection region 542.” Examiner maps the plurality of input locations to any area on the screen with touchscreen-like functionality.). It would have been obvious to combine Hsu and Va for the reasons given regarding claim 1.

Regarding claim 3, Hsu in view of Va teaches the system of claim 2. Va further teaches: wherein the radar control circuitry is further configured to: determine, based on the analysis of the one or more reflections, a movement of the object when (i) the one or more reflections indicate that the object is larger than the hand of the user or one or more fingers of the hand of the user or (ii) the one or more reflections indicate that the object is beyond a threshold distance from the plurality of radar reception antennas based on at least one characteristic of the reflections (see at least [0170]; “At block 1325, when the estimated normal angle is within the ambiguous region, the processor 120 checks for triggering events by determining whether a triggering event is detected. Examples of triggering events include the hand moving into or pulling away from the screen. At block 1330, the processor 120 determines whether the direction of the movement of the hand is into or away from the ROI. Such triggering events can be detected using the Doppler, or using Doppler and angle variations… In the case of the hand pulling away but still stopping within the ambiguous region, the signature of the smaller motion of hand pulling away may or may not be reliably detectable depending on various factors such as the error level of the angle estimation (i.e., the size of the ambiguous region) and the range of the target. For a larger range, even with the same angle difference, the length of the arc will be longer and could have a higher probability of being detected using the Doppler due to a larger change in the displacement. If a triggering event is detected, then at block 1330, determination that the hand motion is away from or into the ROI causes the processor 120 to update the tracked state to the not-touched state at block 1315 or the touched state at block 1320, respectively.” Examiner maps the threshold distance to the boundary of the ambiguous region of Va.). It would have been obvious to combine Hsu and Va for the reasons given regarding claim 1.

Regarding claim 4, Hsu in view of Va teaches the system of claim 1. Hsu further teaches: wherein the transmission signal comprises a radar signal between 57 GHz and 71 GHz (see at least col. 7, lines 51-53; “In one implementation the radar sensor 130 may comprise the BGT60TR13C 60 GHz radar sensor from Infineon Technologies AG.”).

Regarding claim 9, Hsu in view of Va teaches the system of claim 1. Hsu further teaches: wherein the one or more reflections of the transmission signal are represented in one or more electronic signals (see at least col. 3, line 64 – col. 4, line 6; “During operation, a plurality of radars 130(1)-(N) provide input data 140. In some implementations the radar 130 may provide as output an analog signal comprising an intermediate frequency (IF) signal resulting from a product of the emitted radar frequency and a returned signal. The IF signal may then be digitized using an analog to digital converter (ADC) that samples the IF signal and provides as output a digitized representation, such as a stream of digital data. In some implementations the apparatus 102 may utilize oversampling to improve range resolution.”), and the analysis of the one or more reflections of the transmission signal comprises analyzing the one or more electronic signals (see at least col. 4, lines 26-27; “A preprocessing module 142 accepts and processes the input data 140 to determine input frames 144.”), and wherein the radar control circuitry is further configured to: analyze the one or more electronic signals using at least one of: a Fast Fourier Transform (see at least col. 4, lines 27-31; “In one implementation the preprocessing module 142 may apply a fast Fourier transform (FFT) to a plurality of the input data 140. Operation of the preprocessing module 142 is discussed in more detail with regard to FIG. 4.”); or a Chirp-Z transform.

Regarding claim 10, Hsu in view of Va teaches the system of claim 1. Hsu further teaches: The system of claim 1, wherein the system comprises: a docking station configured to interoperate with an electronic device; a tablet computer; a smartphone; a computer; a television or other video display (see at least col. 2, lines 62-67; “The apparatus 102 comprises a display device (“display”) 110. The display 110 may comprise one or more of a light emitting diode (LED) display, organic LED display, liquid crystal display, quantum dot display, front projection display, rear projection display, and so forth. The display 110 may be operated to present images.”); a smartwatch; earbuds or headphones; virtual reality (VR) goggles; augmented reality (AR) glasses; an appliance; a smart speaker; or a control device for another system.

In regard to claim 11, the limitation(s) recited is not required to be part of the claimed invention.
Parent claim 10 teaches alternative limitations, i.e., “docking station” or “video display”. If a parent claim includes alternative limitations, and the reference teaches one of them, further limitations to the other alternative(s) in dependent claims are not required limitations. See Ex parte Werner, Appeal 2019-001448, Application No. 15/109,888, March 23, 2020, 15 pages. Here, Hsu teaches a video display, as detailed in the rejection of claim 10. Claim 11 is based on another alternative/other alternatives, i.e., a docking station.

Claims 5-6 are rejected under 35 U.S.C. 103 as being unpatentable over Hsu in view of Va, further in view of Kushnir et al. (US-20200241672-A1; hereinafter Kushnir).

Regarding claim 5, Hsu in view of Va teaches the system of claim 1. However, Hsu does not explicitly teach: wherein the plurality of radar reception antennas includes three reception antennas. Hsu discloses a system using radar for touch interaction with a non-touchscreen display. Kushnir is directed to a device for detecting a touch input to a surface comprising at least one radar transmitter component configured to transmit electromagnetic radiation in a radio frequency spectrum. Kushnir teaches: wherein the plurality of radar reception antennas includes three reception antennas (see at least Fig. 8, radar detector #1 comprises 3 antennas. See also [0091]; “FIG. 8 shows a schematic diagram of an object scanned by two radar detectors 802; 804 (each comprising three antennas)”).

Both Hsu and Kushnir teach using radar sensors to detect a hand position and give touchscreen-like functionality to a screen. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the radar sensors used in Hsu to include detectors with an array of three antennas, as taught by Kushnir. One of ordinary skill would be motivated to include the antenna arrangement of Kushnir in order to estimate the position coordinates of a detected object, as taught by Kushnir (see Kushnir at least [0091]; “FIG. 8 shows a schematic diagram of an object scanned by two radar detectors 802; 804 (each comprising three antennas), with electromagnetic radiation transmission patterns 812; 814 (user for scanning and object edges detection), and estimation of the object center coordinate 830 (e.g. the x-y-z position of the object touching the surface) based on the center 822; 824 of the electromagnetic radiation transmission patterns 812; 814.”).

Regarding claim 6, Hsu in view of Va and Kushnir teaches the system of claim 5. Kushnir further teaches: wherein the plurality of radar reception antennas includes the at least three reception antennas in a linear array (see at least Fig. 8, where radar detector #1 comprises 3 antennas in a linear array). It would have been obvious to combine Hsu and Kushnir for the reasons given regarding claim 5.

Claims 7 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Hsu in view of Va, further in view of Silverstein et al. (US-20170329449-A1; hereinafter Silverstein).

Regarding claim 7, Hsu in view of Va teaches the system of claim 1. However, Hsu does not explicitly teach: wherein the transmission antennas and the reception antennas include printed circuit board (PCB) antennas. Hsu discloses a system using radar for touch interaction with a non-touchscreen display. Silverstein is directed to devices using radar-based touch interfaces. Silverstein teaches: wherein the transmission antennas and the reception antennas include printed circuit board (PCB) antennas (see at least [0194] – [0198]; “Various types of internal antennas are optionally used with the various devices disclosed herein. Internal antennas are sometimes called embedded antennas. As used herein, an internal antenna includes any antenna that lies within the device casing… A board antenna is sometimes also called a printed circuit board (PCB) antenna or a PCB trace antenna. In some implementations, a board antenna is mounted on circuit board and is coupled to communications module. Board antennas are generally affected by the circuit board's substrate properties, such as the dielectric constant and dissipation factor.”).

Both Hsu and Silverstein use radar sensors to detect a touch input. Hsu cites a specific integrated circuit as an example of the radar sensor used (see at least col. 7, lines 51-53; “In one implementation the radar sensor 130 may comprise the BGT60TR13C 60 GHz radar sensor from Infineon Technologies AG.”). Silverstein teaches that several types of internal antennas may be used in the radar sensor, including sheet metal antennas, board antennas and chip antennas (see at least [0194] – [0198]). A person of ordinary skill in the art before the effective filing date of the claimed invention would have recognized that the PCB antenna of Silverstein could have been substituted for the antenna of the integrated circuit of Hsu because Silverstein teaches that both types of antennas are able to perform touch sensing. Furthermore, a person of ordinary skill in the art would have been able to carry out the substitution. Finally, the substitution achieves the predictable result of allowing a radar to transmit and receive electromagnetic waves in order to perform touch sensing. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the PCB antenna of Silverstein for the antenna of the integrated circuit of Hsu according to known methods to yield the predictable result of providing a radar antenna adapted to performing touch sensing.

Regarding claim 12, Hsu in view of Va teaches the system of claim 1.
However, Hsu does not explicitly teach: wherein the function associated with the gesture includes one or more of decreasing or increasing playback volume, pausing or resuming playback of content, or skipping back or ahead in a media stream. Silverstein teaches: wherein the function associated with the gesture includes one or more of decreasing or increasing playback volume (see at least [0250] – [0251]; “In accordance with a determination that the object is in contact with the casing, the computing system identifies (1330) an input command based on at least one of: a location of the object, and a movement of the object (e.g., the object's velocity). In some implementations, the location of the object and the movement of the object are determined based on radar signals from the radar transceiver. In some implementations, the system identifies a particular input gesture based on the location and movement of the object across the casing surface. The computing system adjusts (1332) operation based on the input command. For example, the system detects a contact at a location on the casing that corresponds to a virtual on/off button and, in response to the contact, the system toggles a particular feature on or off. As another example, the system detects a swipe-up gesture across the surface of the casing, and in response to the gesture, the system increases a particular parameter (e.g., volume, luminosity, temperature, etc.).”), pausing or resuming playback of content, or skipping back or ahead in a media stream.

Hsu, Va and Silverstein use radar sensors to detect a touch input. Hsu teaches tracking touch position over time (see col. 4, lines 49-56), Va teaches detecting a swiping motion (see [0134]), and Silverstein teaches adjusting the volume based on a detected swiping motion (see [0251]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention that an application of detecting a moving position or a swipe, as taught by Hsu and Va, would be to adjust a volume level, as taught by Silverstein.

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Hsu in view of Va, further in view of Silverstein and Kushnir.

Regarding claim 8, Hsu in view of Va and Silverstein teaches the system of claim 7 and PCB antennas. However, these references do not explicitly teach: wherein the transmission antennas and the reception antennas include flame retardant 4 (FR4) PCB antennas. Kushnir teaches: wherein the transmission antennas and the reception antennas include flame retardant 4 (FR4) mounting (see at least [0098]; “For a 4 TX and 4 RX radar (comprising four 60 GHz LNAs (Low Noise Amplifier), four RX phase shifters, four 60 GHz PAs (Power Amplifier), four TX phase shifters, comb+routing, a synthesizer, direct current power, a six-port (radar module), four radar phase detectors and analog-to-digital-converters and miscellaneous other circuits), an RFIC (Radio Frequency Integrated Circuit) size of 3.2 mm2 may be required. A small 60 GHz RFEM (Radio frequency Front End Module) may be done on FR4 (fiberglass reinforced epoxy laminated) HDI (High Density Interconnect) substrate with a size of ˜15 mm2.”).

Silverstein teaches that several types of internal antennas may be used in the radar sensor, including sheet metal antennas, board antennas and chip antennas (see at least [0194] – [0198]), but Silverstein does not teach a specific substrate. Kushnir teaches that radar components may be mounted on an FR4 substrate. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the PCB antennas used in Silverstein to be mounted on an FR4 substrate, as taught by Kushnir, because the substrate is shown by Kushnir to be suitable for radar applications.
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Ashley B. Raynal whose telephone number is (703)756-4546. The examiner can normally be reached Monday - Friday, 8 AM - 4 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Vladimir Magloire, can be reached at (571) 270-5144. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ASHLEY BROWN RAYNAL/
Examiner, Art Unit 3648

/VLADIMIR MAGLOIRE/
Supervisory Patent Examiner, Art Unit 3648
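The claim 9 rejection turns on Hsu's cited FFT preprocessing of the digitized IF signal. For context, a minimal sketch of that range-FFT step for a single FMCW chirp; all parameter values here are illustrative, not taken from Hsu, Va, or the application:

```python
import numpy as np

# Illustrative FMCW parameters (assumptions, not from the references)
c = 3e8            # speed of light, m/s
B = 4e9            # chirp bandwidth: 4 GHz (a subset of the 57-71 GHz band)
T = 100e-6         # chirp duration, s
S = B / T          # chirp slope, Hz/s
fs = 2e6           # IF sample rate, Hz
N = int(fs * T)    # samples per chirp (200)

R_true = 0.5                   # a hand 0.5 m from the antenna
f_beat = 2 * R_true * S / c    # beat frequency of the reflected chirp

t = np.arange(N) / fs
if_signal = np.cos(2 * np.pi * f_beat * t)   # idealized, noiseless IF signal

# Range FFT: each frequency bin maps to a range via R = f * c / (2 * S)
spectrum = np.abs(np.fft.rfft(if_signal))
peak_bin = int(np.argmax(spectrum))
freq_resolution = fs / N
R_est = peak_bin * freq_resolution * c / (2 * S)

print(f"range resolution: {c / (2 * B) * 100:.1f} cm")  # ≈3.8 cm
print(f"estimated range:  {R_est * 100:.1f} cm")        # ≈48.8 cm (true: 50 cm)
```

The estimate is off by less than one frequency bin, which is the expected quantization for a single-chirp range FFT; Hsu's cited oversampling remark addresses exactly this resolution limit.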

Prosecution Timeline

Mar 05, 2024
Application Filed
Feb 09, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601844
Satellite Signal Spoofing Detection System
2y 5m to grant Granted Apr 14, 2026
Patent 12578427
SYSTEMS AND METHODS FOR GENERATING INDEPENDENT TRANSMIT AND RECEIVE CALIBRATION MATRICES FOR MIMO RADAR SYSTEMS
2y 5m to grant Granted Mar 17, 2026
Patent 12567909
COHERENT RECEIVING DEVICE AND ANEMOMETRY LIDAR SYSTEM
2y 5m to grant Granted Mar 03, 2026
Patent 12560703
GENERATING POINT CLOUDS BASED UPON RADAR TENSORS
2y 5m to grant Granted Feb 24, 2026
Patent 12554013
AUTOMATIC OBSTACLE AVOIDANCE METHOD, ELECTRONIC DEVICE, AND UNMANNED AERIAL VEHICLE
2y 5m to grant Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
78%
Grant Probability
99%
With Interview (+22.7%)
2y 9m
Median Time to Grant
Low
PTA Risk
Based on 36 resolved cases by this examiner. Grant probability derived from career allow rate.
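The with-interview figure appears to follow from the base rate plus the interview lift, capped below 100%. A sketch of that arithmetic; the additive-with-cap model is an assumption inferred from the displayed numbers, not a documented formula:

```python
# Projection arithmetic implied by the cards above (assumed model:
# additive interview lift, capped at a displayed maximum of 99%).

base_grant_prob = 0.78   # career allow rate (28 / 36 resolved)
interview_lift = 0.227   # lift among resolved cases with an interview
cap = 0.99               # displayed probabilities never exceed 99%

with_interview = min(base_grant_prob + interview_lift, cap)
print(f"with interview: {with_interview:.0%}")   # 99%
```

Note that 78% + 22.7% exceeds 100%, so the 99% shown is the cap binding, not a raw sum; the uncapped figure should be read as "near-certain given an interview," not a literal probability.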
