Prosecution Insights
Last updated: April 19, 2026
Application No. 18/607,178

NON-GESTURE REJECTIONS USING RADAR

Final Rejection — §103
Filed: Mar 15, 2024
Examiner: ABDIN, SHAHEDA A
Art Unit: 2627
Tech Center: 2600 — Communications
Assignee: Samsung Electronics Co., Ltd.
OA Round: 2 (Final)
Grant Probability: 79% (Favorable)
OA Rounds: 3-4
To Grant: 2y 10m
With Interview: 98%

Examiner Intelligence

Career Allow Rate: 79% — above average (561 granted / 712 resolved; +16.8% vs TC avg)
Interview Lift: +19.0% (strong), measured on resolved cases with interview
Avg Prosecution: 2y 10m typical timeline; 21 applications currently pending
Career History: 733 total applications across all art units

Statute-Specific Performance

§101: 2.0% (-38.0% vs TC avg)
§103: 72.2% (+32.2% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 6.0% (-34.0% vs TC avg)
Deltas are measured against a Tech Center average estimate • Based on career data from 712 resolved cases
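Each per-statute figure above is reported together with a delta against the Tech Center average, so the implied TC baseline can be recovered by subtraction. A minimal sketch of that arithmetic (the dictionary layout and variable names are illustrative, not part of the dashboard):

```python
# Examiner overcome rate per statute and reported delta vs. Tech Center average.
# Figures copied from the table above; the baseline itself is derived, not reported.
stats = {
    "§101": (2.0, -38.0),
    "§103": (72.2, +32.2),
    "§102": (11.1, -28.9),
    "§112": (6.0, -34.0),
}

for statute, (rate, delta) in stats.items():
    tc_avg = round(rate - delta, 1)  # baseline implied by the reported delta
    print(f"{statute}: examiner {rate}% vs TC avg {tc_avg}%")
```

Every implied baseline works out to the same 40.0%, consistent with a single Tech Center average estimate underlying all four deltas.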

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 4-6, 10-12, 14-16, and 20 are rejected under 35 U.S.C. 103 as being obvious over Qiu (US 2020/0356178) in view of Wang (US 2012/0280900).

Regarding claims 1 and 11: Qiu discloses an electronic device (Figs. 1-2) comprising: a transceiver (270) configured to transmit and receive radar signals ([0057], [0065]); and a processor (processor corresponding to the transceiver) operatively coupled to the transceiver (see abstract), the processor configured to: extract a plurality of feature vectors (i.e., gesture information, see Figs. 4B-4C) from a plurality of radar frames (radar signals; signature gesture) corresponding to the radar signals ([0035], [0042], [0050-0055], [0065], abstract) (a processor operably connected to a transceiver; the transceiver is configured to transmit and receive signals for measuring range and speed; the processor is configured to track movement of an object relative to the electronic device within a region of interest based on reflections of the signals received by the transceiver to identify range and speed measurements of the object, and is further configured to identify a gesture based in part on the features from the reflected signals; additionally, the processor is configured to perform an action indicated by the gesture); identify an activity (activity corresponding to signature gesture) based on the plurality of feature vectors (i.e., gesture direction, see Figs. 4B-4C); and determine whether the identified activity corresponds with a non-gesture (i.e., non-signature gesture) (see [0035-0036]: after identifying a signature gesture, the electronic device performs an action, such as an operation with respect to an application; alternatively, after identifying a signature gesture, the electronic device notifies the user to perform a non-signature gesture, and upon identifying the non-signature gesture, the electronic device performs an action, such as an operation with respect to an application).

Note that Qiu discloses a non-gesture (i.e., non-signature gesture) ([0035-0036]). However, Qiu does not specifically disclose: identify a gesture that corresponds with the activity subsequent to a determination that the activity failed to correspond with a non-gesture; and perform an action corresponding with the identified gesture. Wang discloses identifying a gesture (i.e., radial movement for gaming) that corresponds with the activity (gaming) subsequent to a determination that the activity failed to correspond with a non-gesture (i.e., volume increase or decrease of the device), and performing an action corresponding with the identified gesture (radial movement) (see Fig. 5) (when the optical sensor fails to sense the activities, such as gaming activities or volume increase or decrease of the device, the radar sensor identifies the gesture corresponding to the activities: camera 105a has limited capability for accurately determining whether an object is moving radially, i.e., towards or away from the terminal 100, while data received from the radar sensor 105b can provide an accurate indication of radial movement (see [0062-0066], [0069-0070])). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qiu with the teaching of Wang, thereby achieving greater accuracy in identifying a particular input.

Regarding claims 2 and 12: Qiu in view of Wang discloses wherein to determine whether the identified activity corresponds with a non-gesture (i.e., touch object), the processor is further configured to: determine whether the identified activity is an early detection (provide information to adjust weighting factors for the gesture) (see Qiu [0105-0107] and Wang [0069-0070]); and if the identified activity is not an early detection: estimate a gesture start (enabling gesture) and a gesture end (disabling gesture) (the gesture recognition system may be configured to identify, from the received image and radio sensing signals, both a translational and a radial movement and/or radial distance for an object with respect to the apparatus, and to determine therefrom the one or more predetermined gestures for controlling the first user interface function); preprocess the plurality of feature vectors (see Wang [0015], which discloses that the gesture recognition system may be configured to identify, from the received image signal, a motion vector associated with the foreground object's change of position between subsequent image frames and to derive therefrom the translational movement); and perform an activity gating operation based on the preprocessed plurality of feature vectors (i.e., control the transmission and reception of radar pulses, allowing the signal to pass through the selected time) ([0046-0047]). Same motivation as applied to claim 1.

Regarding claims 4 and 14: Qiu in view of Wang discloses wherein to preprocess the plurality of feature vectors (see Qiu [0093]), the processor is further configured to, for each feature vector of the plurality of feature vectors: if a value of the feature vector indicates a failure (i.e., no recognized vector) to detect activity (see Fig. 5H, [0092-0095]), change the value indicating a failure to detect activity to a value of a closest neighbor feature vector ([0127]); filter the feature vector with a median filter ([0127]); and if the value of the feature vector is an abnormal value (invalid), change the abnormal value to a value of a closest neighbor having a normal value ([0127-0129]).

Regarding claims 5 and 15: Qiu discloses wherein to perform the activity gating operation, the processor is further configured to determine whether the identified activity corresponds with a non-gesture based on at least one of: a gesture length (gesture duration, [0149]); a region of interest (ROI) ([0101]); a gesture start and gesture end ([0100-0101]); a gesture motion size ([0179]); and a gesture slope signature.
Regarding claims 6 and 16: Qiu discloses wherein to perform the activity gating operation, the processor is further configured to: determine that the identified activity corresponds with a gesture ([0037]); identify a gesture type corresponding with the identified activity ([0036-0038]); and perform a gesture gating operation based on the gesture type ([0037]).

Regarding claims 10 and 20: Qiu discloses wherein the processor is further configured to perform a gesture gating operation based on the identified gesture ([0037]), wherein the action corresponding with the identified gesture is performed based on a result of the gesture gating operation ([0036-0038]).

3. Claims 3 and 13 are rejected under 35 U.S.C. 103 as being obvious over Qiu (US 2020/0356178) in view of Wang (US 2012/0280900), and further in view of Regani (US 2021/0232235 A1).

Regarding claims 3 and 13: Qiu in view of Wang discloses wherein to determine whether the identified activity is an early detection, the processor is further configured to detect a gesture end (see Wang [0069-0070]). Qiu discloses determining angular features for a time-angle diagram (TAD) and a time-elevation diagram (TED) corresponding with the identified activity (see Qiu [0094-0095], Fig. 5G). However, Qiu in view of Wang does not specifically disclose: determine a dispersion metric based on the angular features; and determine whether the dispersion metric falls within a dispersion range, wherein if the dispersion metric falls within the dispersion range, the identified activity is an early detection. Regani (US 2021/0232235 A1) discloses determining angular features for a time-angle diagram (TAD) and a time-elevation diagram (TED) corresponding with the identified activity (i.e., gesture) (see Figs. 5-6, [0275-0276]); determining a dispersion metric based on the angular features (different ranges and azimuth angles from the radar; also see [0190]); and determining whether the dispersion metric falls within a dispersion range, wherein if the dispersion metric falls within the dispersion range, the identified activity is an early detection (i.e., previously detected target) (see [0285]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qiu with the teachings of Wang and Regani, thereby providing highly efficient data transmission in the input device.

Allowable Subject Matter

Claims 7-9 and 17-19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. Regarding claims 7 and 17: the closest art of record, singly or in combination, fails to teach or suggest the limitations "identify a segment of the input window with consecutive non-zero distance forward from the search pivot; trim the segment to exclude outside region of interest (ROI) frames from both ends of the segment; update the gesture start and the gesture end based on the trimmed segment; update the search pivot according to a trimmed segment start; and determine whether the search pivot has reached the input window start" (see Fig. 2 and [0138-0139]).

Response to Applicant's Argument

5. Applicant's arguments filed on 12/10/2025 have been considered but are not persuasive. Applicant argues that Qiu in view of Wang fails to disclose the subject matter recited in claim 1. More specifically, the Applicant argues that Qiu discloses an activity detector 540 that identifies a non-gesture portion and a gesture portion of extracted features such as range, velocity, and angle (see, e.g., Qiu, paragraph [0100]).
Therefore, the Applicant contends, determining whether an identified activity corresponds with a non-gesture as recited in claim 1 is distinct from determining whether an identified activity corresponds with a non-signature gesture as allegedly disclosed by Qiu, and Wang does not provide any disclosure that remedies the noted deficiencies of Qiu. The Applicant further argues that the Office Action concedes that Qiu does not disclose "if the activity fails to correspond with a non-gesture: identify a gesture that corresponds with the activity; and perform an action corresponding with the identified gesture," that Wang fails to disclose or suggest checking for the condition of claim 1 "if the activity fails to correspond with a non-gesture," and that Wang therefore also fails to disclose or suggest identifying a gesture that corresponds with the activity.

In response, the Examiner disagrees with the Applicant's point of view. Wang discloses identifying a gesture (i.e., radial movement for gaming) that corresponds with the activity (gaming) subsequent to a determination that the activity failed to correspond with a non-gesture (i.e., volume increase or decrease of the device), and performing an action corresponding with the identified gesture (radial movement) (see Fig. 5) (when the optical sensor fails to sense the activities, such as gaming activities or volume increase or decrease of the device, the radar sensor identifies the gesture corresponding to the activities: camera 105a has limited capability for accurately determining whether an object is moving radially, i.e., towards or away from the terminal 100, while data received from the radar sensor 105b can provide an accurate indication of radial movement (see [0062-0066], [0069-0070])). Therefore, combining the references of Qiu and Wang would have been obvious, and the combination meets the limitations recited in claims 1 and 11.

Inquiry

6. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Shaheda Abdin, whose telephone number is (571) 270-1673. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, LunYi Lao, can be reached at (571) 272-7671. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SHAHEDA A ABDIN/
Primary Examiner, Art Unit 2627

Prosecution Timeline

Mar 15, 2024
Application Filed
Sep 06, 2025
Non-Final Rejection — §103
Dec 10, 2025
Response Filed
Mar 21, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603056 — DATA DRIVING INTEGRATED CIRCUIT, DISPLAY APPARATUS, AND PIXEL COMPENSATION METHOD
2y 5m to grant • Granted Apr 14, 2026
Patent 12598879 — ORGANIC LIGHT EMITTING DIODE DISPLAY DEVICE AND METHOD OF DRIVING THE SAME
2y 5m to grant • Granted Apr 07, 2026
Patent 12581801 — DISPLAY SUBSTRATE AND DISPLAY APPARATUS
2y 5m to grant • Granted Mar 17, 2026
Patent 12579959 — DISPLAY DEVICE AND CONTROL METHOD THEREFOR
2y 5m to grant • Granted Mar 17, 2026
Patent 12573345 — GATE DRIVING PANEL CIRCUIT, DISPLAY PANEL AND DISPLAY DEVICE
2y 5m to grant • Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 79%
With Interview (+19.0%): 98%
Median Time to Grant: 2y 10m
PTA Risk: Moderate
Based on 712 resolved cases by this examiner. Grant probability derived from career allow rate.
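The headline projection figures follow directly from the career data quoted above. A minimal sketch of the arithmetic, assuming the dashboard simply rounds the raw allow rate and adds the reported interview lift (variable names are mine):

```python
# Career allow rate from the examiner's resolved docket (561 of 712 granted).
granted, resolved = 561, 712
allow_rate = 100 * granted / resolved  # ~78.8%, displayed as 79%

# Interview lift reported as +19.0 percentage points on top of the base rate.
interview_lift = 19.0
with_interview = round(allow_rate) + interview_lift

print(f"grant probability: {round(allow_rate)}%")   # 79%
print(f"with interview:    {with_interview:.0f}%")  # 98%
```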
