Prosecution Insights
Last updated: April 19, 2026
Application No. 17/975,863

MULTI-FRAME PROCESSING FOR FINE MOTION DETECTION, LOCALIZATION, AND/OR TRACKING

Status: Final Rejection (§103)
Filed: Oct 28, 2022
Examiner: EDRADA, ISABELLA AMEYALI
Art Unit: 3648
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Texas Instruments Incorporated
OA Round: 2 (Final)
Grant Probability: 50% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 5m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 50% (grants 50% of resolved cases; 1 granted / 2 resolved; -2.0% vs TC avg)
Interview Lift: +100.0% (strong lift in resolved cases with interview vs. without)
Avg Prosecution: 3y 5m (typical timeline)
Total Applications: 48 across all art units (46 currently pending)
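The headline figures above are simple ratios. As a minimal sketch of the arithmetic (assuming, as the +100.0% lift implies, that the one granted case was resolved with an interview and the other resolved case was not):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate: share of resolved cases that were granted."""
    return granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Percentage-point difference in allow rate between resolved
    cases with an interview and those without."""
    return (rate_with - rate_without) * 100

# 1 granted out of 2 resolved cases -> 50% career allow rate.
career = allow_rate(1, 2)

# Assumed split: the granted case had an interview (100% allow rate
# with interview), the other did not (0% without), yielding the
# reported +100-point lift.
lift = interview_lift(1.0, 0.0)
```

With only two resolved cases, both figures rest on a very small sample, which is why the page hedges its projections below.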

Statute-Specific Performance

§101: 8.4% (-31.6% vs TC avg)
§103: 50.8% (+10.8% vs TC avg)
§102: 22.5% (-17.5% vs TC avg)
§112: 12.6% (-27.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 2 resolved cases.
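The "vs TC avg" deltas let you back out the implied Tech Center baseline for each statute (examiner rate minus delta). A quick sketch using the figures in the table above:

```python
# Examiner allow rate (%) and reported delta vs the Tech Center
# average, per statute, as shown in the table above.
stats = {
    "101": (8.4, -31.6),
    "103": (50.8, +10.8),
    "102": (22.5, -17.5),
    "112": (12.6, -27.4),
}

# Implied TC average = examiner rate minus the reported delta.
implied_tc_avg = {
    statute: round(rate - delta, 1)
    for statute, (rate, delta) in stats.items()
}
# Every statute backs out to the same ~40% baseline estimate,
# consistent with a single Tech Center-wide average being used.
```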

Office Action (§103)
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The Amendment filed 09/08/2025 has been entered. Claims 1-19 are pending in the application, where claim 20 has been withdrawn.

Response to Arguments

Applicant's arguments filed 09/08/2025 have been fully considered but they are not persuasive.

Regarding Applicant's arguments for the USC § 103 rejection of claims 1 and 11, Applicant argues on pg. 7 of the Remarks, "However, Va does not teach determining different locations of the same object as its motion changes from moving at an estimated velocity to stationary with fine motion present, in which a single frame of reflected chirps is used to determine the location of the object when it is determined to be moving at the estimated velocity and a plurality of frames of reflected chirps is used to determine the location of the object when it is determined to be stationary with fine motion present."

Examiner respectfully disagrees. In Fig. 5C, Va provides an example of detecting the same object, a hand, with various frame processing as the object moves to different locations with varying speed. Va elaborates further on pg. 12, paragraph 0125, "An illustrative example of this issue is shown in the diagram 580 of FIG. 5C. In this example, the target (the hand) is moving away from the device during the measurements." In Fig. 6, Va discloses a method for determining a location of an object using single frames or a plurality of frames. On pg. 15, paragraph 0162, Va further discloses that this method can be repeated: "For example, while shown as a series of steps, various steps in FIGS. 6-12 could overlap, occur in parallel, or occur any number of times." It is reasonable to believe that this method could be repeated on the same object at different times, following the radar monitoring of the object as the object moves at different speeds and different locations. For at least these reasons, Examiner is unpersuaded and maintains previous rejections corresponding to the USC § 103 rejection. Therefore, the Examiner asserts that Va et al. (US 20220413120 A1) and Skeoch et al. (US 11703583 B1) disclose each and every limitation of independent claims 1 and 11 based on the broadest reasonable interpretation of claims 1 and 11.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-7, 10-16, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Va et al. (US 20220413120 A1) in view of Skeoch et al. (US 11703583 B1).

Regarding claim 1, Va discloses [Note: what Va fails to clearly disclose is strike-through] A device (see Fig. 3A, electronic device 300) comprising: a radar sensor (see pg. 6, paragraph 0064, "the transmitter 304 and the receiver 306 can be included within the radar transceiver 270 of FIG. 2") processing circuitry (see Fig. 3A, processor 302) configured to: determine, at a first time, that a first object is moving at an estimated velocity that is greater than a threshold level (see Figs. 5A and 5B; pg. 12, paragraph 0126, Figures 5A and 5B are "one example for detecting a moving object"; pg. 6, paragraph 0065, "The processor 302 can identify the information associated with the target object 308, such as the speed the target object 308 is moving"; see pg. 6, paragraph 0060, "the radar transceiver 270 can transmit and receive signals for measuring range and speed of an object"; pg. 13, paragraph 0132, "The amplitude level corresponds to an amount of movement of a detected object."; pg. 13, paragraph 0134, "in step 706, the electronic device 200 compares the amplitude of the detected object to a threshold"); responsive to determining at the first time that the first object is moving at the estimated velocity, determine a first location of the first object (see Fig. 7, steps 706 and 708; Figs. 5A and 5B; pg. 12, paragraph 0126, Figures 5A and 5B are "one example for detecting a moving object and estimating its location"; pg. 6, paragraph 0065, "The processor 302 can identify the information associated with the target object 308, such as … the distance the target object 308 is from the electronic device 300") using a single frame (see pg. 1, paragraph 0006, "The processor is also configured to detect the object using a single radar frame"); determine, at a second time after the first time, that the first object is stationary with fine motion present (see Fig. 6; pg. 15, paragraph 0162, the process of Fig. 6 can be repeated, "For example, while shown as a series of steps, various steps in FIGS. 6-12 could overlap, occur in parallel, or occur any number of times"; pg. 11, paragraph 0117, radar is able to detect micro-movements; pg. 4, paragraph 0041, "when the object is moving slow or remaining stationary (except for micro-movements), embodiments of the present disclosure describe using a longer frame (or multiple frames) for the spatial covariance matrix"); responsive to determining at the second time the first object is stationary with fine motion present, determine a second location of the first object using a plurality of frames of the reflected chirps (see Fig. 7, steps 706, 710, and 712); determine that a second object is stationary (see pg. 16, paragraph 0167, the radar signals can determine if the object is stationary or moving); and responsive to determining that the second object is stationary, determine a third location of the second object (see pg. 6, paragraph 0065, "The processor 302 can identify the information associated with the target object 308, such as … the distance the target object 308 is from the electronic device 300") using a plurality of frames (see pg. 16, paragraph 0166, "the electronic device 200 detects an object using… multiple radar frames"; pg. 4, paragraph 0041, "when the object is moving slow or remaining stationary (except for micro-movements), embodiments of the present disclosure describe using a longer frame (or multiple frames) for the spatial covariance matrix").

Skeoch discloses a radar sensor configured to receive reflected chirps (see Fig. 2; col. 3, lines 54-63, "the radar sensor 202 samples the monitored area 201 with a transmit signal, which includes the chirps… a chirp that is sent out reflects off surfaces and returns to the radar sensor 202 as reflection signals")… … determine a… location… using reflected chirps (see col. 3, lines 48-51, the radar sensor can determine the distance of the object with respect to the sensor).

It would have been obvious to someone with ordinary skill in the art prior to the effective filing date of the claimed invention to incorporate the features as disclosed by Skeoch into the invention of Va. Both Va and Skeoch are considered analogous arts to the claimed invention as they both disclose both moving and stationary object detection radar systems and methods. Va discloses the radar detection and distinction of moving and stationary objects, as well as their locations, using single frame and multi-frame detection and processing; however, Va fails to disclose the reflected chirps component of the radar device.
This feature is disclosed by Skeoch where the radar device is able to transmit chirps and receive reflected chirps to determine aspects of a target. The combination of Va and Skeoch would be obvious with a reasonable expectation of success in order to improve object detection accuracy by providing more data to process (see Skeoch col. 3, lines 64-66) and because chirp radar signals can be used at short distances, the higher pulse rate can enable real-time tracking, and the chirps may provide higher spatial resolution.

Regarding claim 2, Va further discloses The device of claim 1, wherein the estimated velocity is a first estimated velocity; wherein to determine, at the first time, that the first object is moving, the processing circuitry is configured to determine that the first estimated velocity of the first object (see pg. 6, paragraph 0060, "the radar transceiver 270 can transmit and receive signals for measuring range and speed of an object"; pg. 13, paragraph 0132, "The amplitude level corresponds to an amount of movement of a detected object.") is greater than a threshold level (see pg. 13, paragraph 0134, "in step 706, the electronic device 200 compares the amplitude of the detected object to a threshold"), and wherein to determine that the second object is stationary, the processing circuitry is configured to determine that a second estimated velocity of the second object (see pg. 6, paragraph 0060, "the radar transceiver 270 can transmit and receive signals for measuring range and speed of an object"; pg. 12, paragraph 0124, "the low amplitude likely means that object has little movement") is greater than the threshold level (see pg. 13, paragraph 0134, "in step 706, the electronic device 200 compares the amplitude of the detected object to a threshold").
Regarding claim 3, Va further discloses The device of claim 1, wherein to determine the third location, the processing circuitry is configured to process a subset of the reflected chirps across the plurality of frames (see Fig. 5A, element 510, obtain radar measurements; pg. 10, paragraph 0108, "the processing could be performed once per N radar frames"; Fig. 5B, element 522, obtain 1 frame of radar measurements; pg. 10, paragraph 0113, "The step 522 can obtain the radar measurements from step 510 of FIG. 5A").

Regarding claim 4, Va further discloses The device of claim 1, wherein to determine the third location, the processing circuitry is configured to process a respective chirp from each frame of the plurality of frames (see Fig. 3B, element 344, individual pulses [or chirps] within a plurality of frames; Fig. 3C, element 358a, processing frame within the plurality of frames; Fig. 5B, element 524, processing done to identify range-amplitude map of each pulse).

Regarding claim 5, Va further discloses The device of claim 1, wherein the processing circuitry is configured to run in a time-division mode to interleave the processing of the single frame and the processing of the plurality of frames (see Fig. 3C, single frame and multiple frames can be processed; pg. 11, paragraph 0119, "With single frame processing, the observation of the radar signal within the processing frame is equal to the frame TX interval"; pg. 11, paragraph 0123, "The difference between the single-frame and the multi-frame processing… is in the number of pulses").

Regarding claim 6, Va further discloses The device of claim 1, wherein the plurality of frames is a first plurality of frames, and wherein the processing circuitry is configured to: refrain from processing a second plurality of frames of the reflected chirps for at least five frames after processing the first plurality of frames (see Fig. 3C; pg. 8, paragraph 0084, the length of processing of plurality of frames can depend on the frame spacing and frame transmission interval, which are adjustable; pg. 11, paragraph 0120, frame spacing can change the processing duration); and process the second plurality of frames to determine the third location at least five frames after processing the first plurality of frames (see Fig. 3C; pg. 8, paragraph 0084, the length of processing of plurality of frames can depend on the frame spacing and frame transmission interval, which are adjustable; pg. 11, paragraph 0120, frame spacing can change the processing duration).

Regarding claim 7, Va further discloses The device of claim 6, wherein the processing circuitry is further configured to operate in a single-frame processing mode after every frame to detect the first object (see Fig. 5B, element 520 is single frame processing).

Regarding claim 10, Va further discloses The device of claim 1, wherein the processing circuitry is configured to: detect a first number of objects using the single frame (see Fig. 13, element 1304; pg. 7, paragraph 0075, there may be multiple targets distinguishable by delays); detect a second number of objects using the plurality of frames (see Fig. 13, element 1304; pg. 7, paragraph 0075, there may be multiple targets distinguishable by delays); determine a difference between the first and second numbers exceeds a threshold level (see pg. 11, paragraph 0116, "In step 532, the electronic device 200 performs the object detection [and]… detects the peaks in the range profile and compares the value at the peak with a detection threshold… This threshold could be selected to balance misdetection and false alarm rate"); and responsive to determining that the difference exceeds the threshold level, increase a confidence level for setting a new track (see Fig. 11, depending on object detection the system will determine K frames to use for detection).
Regarding claims 11-16, the same cited sections and rationale for claims 1-6 are applied. The only difference between claims 1-6 and claims 11-16 is that claims 1-6 refer to a device while claims 11-16 refer to a method. The examiner considers Va pg. 1, paragraph 0007, "the method also includes detecting the object using a single radar frame or multiple radar frames from the radar signals" to show that the radar device performs the radar method of claims 11-16.

Regarding claim 19, the same cited sections and rationale for claim 10 are applied.

Claims 8, 9, 17, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Va et al. (US 20220413120 A1) in view of Skeoch et al. (US 11703583 B1) and further in view of Hevdeli et al. (US 20230196896 A1).

Regarding claim 8, Va discloses [Note: what Va fails to clearly disclose is strike-through] The device of claim 1, wherein the processing circuitry is configured to: identify, in the plurality of frames, a set of points (see pg. 10, paragraph 0114, the RA [range-amplitude] map captures a set of values) having a first velocity (see pg. 13, paragraph 0132, "The amplitude level corresponds to an amount of movement of a detected object.") exceeding a threshold level (see pg. 13, paragraph 0134, "in step 706, the electronic device 200 compares the amplitude of the detected object to a threshold");

Hevdeli discloses remove the set of points from a point cloud (see pg. 5, paragraph 0071, the processing unit can remove points according to a threshold in order to obtain a point cloud representing desired points); and determine the second location based on the point cloud (see pg. 5, paragraph 0073, a location of an object can be calculated from a point cloud).

It would have been obvious to someone with ordinary skill in the art prior to the effective filing date of the claimed invention to incorporate the features as disclosed by Hevdeli into the invention of Va.
Both Va and Hevdeli are considered analogous arts to the claimed invention as they both disclose radar devices and methods for detecting human movement in a room. Va discloses identifying a set of points, their velocity, and comparing it to a threshold level; however, Va and Skeoch fail to disclose a point cloud. This feature is disclosed by Hevdeli where the reflected measurements create a point cloud. The combination of Va and Hevdeli would be obvious with a reasonable expectation of success in order to improve accuracy of object detection and classification among many measurement reflection points (see Hevdeli pg. 1, paragraph 0007).

Regarding claim 9, Hevdeli further discloses The device of claim 8, wherein the set of points is a first set of points, and wherein the processing circuitry is configured to: identify, in the plurality of frames (see pg. 10, paragraph 0142, the processing unit can classify based on multiple frames), a second set of points having a second velocity not exceeding the threshold level (see pgs. 4-5, paragraph 0070, the processing unit can detect velocity of points in the point cloud and identify if they are below a threshold level); identify, in the single frame (see pg. 10, paragraph 0142, the processing unit can classify based on a single frame), a third set of points (see pg. 4, paragraph 0069, each reflected wave measurement may include a set of points); and add the second and third sets of points to the point cloud (see pg. 4, paragraph 0069, the set of points from each reflected wave makes up the point cloud).

It would have been obvious to someone with ordinary skill in the art prior to the effective filing date of the claimed invention to incorporate the features as disclosed by Hevdeli into the invention of Va. Va fails to disclose adding sets of points to a point cloud. This feature is disclosed by Hevdeli where the reflected measurements create sets that make up a point cloud.
The combination of Va and Hevdeli would be obvious with a reasonable expectation of success in order to improve accuracy of object detection and classification among many measurement reflection points, for example by creating reference parameters based on point cloud data and using that for comparison with other radar detections (see Hevdeli pg. 5, paragraph 0072).

Regarding claim 17, the same cited section and rationale as claim 8 are applied. Regarding claim 18, the same cited section and rationale as claim 9 are applied.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ISABELLA AMEYALI EDRADA whose telephone number is (571) 272-4859. The examiner can normally be reached Mon - Fri 9am-5pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, William Kelleher, can be reached at (571) 272-7753. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ISABELLA AMEYALI EDRADA/
Examiner, Art Unit 3648

/William Kelleher/
Supervisory Patent Examiner, Art Unit 3648
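For readers outside the radar art, the frame-selection logic that the rejection maps from claim 1 onto Va can be sketched roughly as follows. This is a hedged illustration only: the function name, threshold value, and mode labels are hypothetical stand-ins, not anything taken from Va, Skeoch, or the application itself.

```python
def processing_mode(velocity: float, threshold: float) -> str:
    """Pick single- vs multi-frame radar processing along the lines
    claim 1 recites: a single frame of reflected chirps locates an
    object moving above the velocity threshold; a plurality of frames
    locates an object that is stationary (including stationary with
    only fine motion, i.e. micro-movements, present)."""
    if velocity > threshold:
        return "single-frame"
    return "multi-frame"

# First time: object moving at an estimated velocity above threshold.
mode_t1 = processing_mode(velocity=1.2, threshold=0.5)
# Second time: the same object now stationary with fine motion only.
mode_t2 = processing_mode(velocity=0.0, threshold=0.5)
```

The dispute between Applicant and Examiner is not over this selection step in isolation, but over whether Va applies both branches to the *same* object at different times as its motion changes.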

Prosecution Timeline

Oct 28, 2022: Application Filed
Jun 04, 2025: Non-Final Rejection (§103)
Sep 08, 2025: Response Filed
Dec 05, 2025: Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596175: A NON-RESOLVED TARGET DETECTION SYSTEM AND METHODS
Granted Apr 07, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on the 1 most recent grant.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 50%
With Interview: 99% (+100.0%)
Median Time to Grant: 3y 5m
PTA Risk: Moderate

Based on 2 resolved cases by this examiner. Grant probability derived from career allow rate.
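As the note says, the headline probability is just the career allow rate, with the interview lift layered on top. A minimal sketch of one plausible derivation (the exact formula the dashboard uses is an assumption; the 99% cap is chosen here only to mirror the displayed figure):

```python
def grant_probability(granted: int, resolved: int) -> float:
    """Base grant probability: the examiner's career allow rate."""
    return granted / resolved

def with_interview(base: float, lift_pct: float, cap: float = 0.99) -> float:
    """Apply the reported interview lift, read here as a relative
    +100% boost to the base rate, capped at 99%. This reading and
    the cap are assumptions, not the dashboard's documented method."""
    return min(base * (1 + lift_pct / 100), cap)

base = grant_probability(1, 2)        # the 50% shown above
boosted = with_interview(base, 100)   # the 99% "With Interview" figure
```

With only two resolved cases, both numbers should be treated as weak priors rather than predictions.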
