Prosecution Insights
Last updated: April 19, 2026
Application No. 18/575,695

OBJECT TRACKING ON THE BASIS OF A MOVEMENT MODEL

Non-Final OA: §101, §102, §103
Filed: Aug 16, 2024
Examiner: WILLIS, BRANDON Z.
Art Unit: 3665
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: VALEO SCHALTER UND SENSOREN GMBH
OA Round: 1 (Non-Final)
Grant Probability: 69% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 69% (above average; 140 granted / 203 resolved; +17.0% vs TC avg)
Interview Lift: +38.3% among resolved cases with interview
Typical Timeline: 2y 8m avg prosecution; 23 applications currently pending
Career History: 226 total applications across all art units

Statute-Specific Performance

§101: 11.3% (-28.7% vs TC avg)
§102: 27.3% (-12.7% vs TC avg)
§103: 48.3% (+8.3% vs TC avg)
§112: 9.1% (-30.9% vs TC avg)
Tech Center averages are estimates • Based on career data from 203 resolved cases
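The headline figures in the examiner sections are simple ratios of the career counts reported on this page. A minimal sketch of the derivation (the counts 140, 203, and the +17.0% delta come from the page; the variable names are ours):

```python
# Career counts reported for this examiner on the dashboard above.
granted = 140
resolved = 203

# Career allow rate: share of resolved cases that were granted.
allow_rate = granted / resolved                  # ~0.6897
print(f"Career allow rate: {allow_rate:.1%}")    # Career allow rate: 69.0%

# The Tech Center comparison is allow rate minus the TC average;
# a +17.0% delta therefore implies a TC average near 52%.
tc_delta = 0.170
tc_average = allow_rate - tc_delta
print(f"Implied TC average: {tc_average:.1%}")   # Implied TC average: 52.0%
```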

Office Action

§101 §102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 12/29/2023 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Drawings

The drawings are objected to because the unlabeled rectangular boxes shown in the drawings should be provided with descriptive text labels. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

In addition to Replacement Sheets containing the corrected drawing figure(s), applicant is required to submit a marked-up copy of each Replacement Sheet including annotations indicating the changes made to the previous version. The marked-up copy must be clearly labeled as “Annotated Sheets” and must be presented in the amendment or remarks section that explains the change(s) to the drawings. See 37 CFR 1.121(d)(1). Failure to timely submit the proposed drawing and marked-up copy will result in the abandonment of the application.

Specification

The disclosure is objected to because of the following informalities: On page 13, line 17, “step S4” should read “step S5”. Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 15 is directed to a computer program product, i.e., “software per se”. “Software per se”, when claimed without any structural limitations, does not have a physical or tangible form. Therefore, it does not fall within one of the four categories of patent eligible subject matter and is ineligible under 35 USC 101. See MPEP 2106.03. If support is found within the specification, Applicant is advised to amend the claim(s) to recite “A non-transitory computer readable medium comprising a computer program comprising machine readable instructions that, when executed by a processor, performs: [the claimed functions]”, or equivalent language. See MPEP 2106.03(I). A claim directed toward a non-transitory computer readable medium would comprise an article of manufacture and thus fall within one of the four categories of patent eligible subject matter. Therefore, claim 15 is rejected under 35 USC §101 as being directed toward ineligible subject matter.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1 and 6-15 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Becker et al. (U.S. Publication No. 2018/0322650; hereinafter Becker).

Regarding claim 1, Becker teaches an object tracking method, the method comprising estimating, by at least one computing unit, a first state of an object to be tracked based on a predefined movement model for the object to be tracked (Becker: Par. 31; i.e., the vehicle filter can employ a motion model that models the location of the tracked vehicle using a vehicle bounding shape and an observation model that infers the location of an observation bounding shape from sensor observations), wherein the first state comprises a first direction of movement of a point to be tracked (Becker: Par. 39; i.e., the dominant side for the observation bounding shape can be the side that includes the dominant vertex and has an orientation that is closest to an expected orientation (e.g., an expected heading of the tracked vehicle); the expected heading corresponds to a first direction of movement of the vertex); generating, by an environmental sensor system, environmental sensor data which represent the object to be tracked (Becker: Par. 63; i.e., the perception system 103 can receive sensor data from the one or more sensors; Par. 63; i.e., the sensor data can include information that describes the location of objects within the surrounding environment of the autonomous vehicle); determining, by the at least one computing unit, a geometric orientation of the object to be tracked based on the environmental sensor data (Becker: Par. 69; i.e., the perception system 103 can determine, for each object, state data that describes a current state of such object. As examples, the state data for each object can describe an estimate of the object's: … current orientation); shifting, by the at least one computing unit, the point to be tracked depending on a first deviation of the geometric orientation from the first direction of movement; and determining, by the at least one computing unit, a second state of the object to be tracked depending on the first state and the shifted point (Becker: Par. 37; i.e., the vehicle filter can assign a weight to each of the one or more predicted shape locations predicted by the motion model, where the weight assigned to each predicted shape location corresponds to an amount of agreement between such predicted shape location and the shape location. In some implementations, a Gaussian representation of the estimated shape location or other estimated state parameters can be computed from the weights; Par. 40; i.e., the locations of the dominant vertex and/or side for each predicted shape location predicted by the motion model can be respectively compared to the locations of the dominant vertex and/or side of the observation bounding shape… predicted locations of the vehicle bounding shape can be updated or otherwise corrected based on the locations of the dominant vertex).

Regarding claim 6, Becker teaches the method according to claim 1. Becker further teaches wherein the second state contains a second direction of movement of the shifted point (Becker: Par. 91; i.e., the object updater 304 can update the tracking system 208's best estimate of the current state of such tracked object; Par. 86; i.e., the state of each object can include the object's … current heading), and a second deviation of the geometric orientation from the second direction of movement is smaller than the first deviation (Becker: Par. 98; i.e., the respective weight assigned to each predicted shape location can be indicative of an amount of agreement between such predicted shape location and the shape location; the predicted shape location 754 can receive a higher weight than the predicted shape locations 752 and 756; the higher weight indicates a smaller deviation).

Regarding claim 7, Becker teaches the method according to claim 6. Becker further teaches wherein the second direction of movement is equal to the geometric orientation (Becker: Fig. 7; i.e., as displayed in Figure 7, the heading 764 is equal to the orientation of the object).

Regarding claim 8, Becker teaches the method according to claim 1. Becker further teaches generating a point cloud based on the environmental sensor data or providing the point cloud in the environmental sensor data; identifying a part of the point cloud representing the object to be tracked; and determining a bounding figure, wherein the bounding figure comprises the part of the point cloud, and wherein the bounding figure has a predefined shape (Becker: Par. 35; i.e., the observation model can fit the observation bounding shape around a plurality of observed points (e.g., points detected by a Light Detection and Ranging (LIDAR) system), thereby providing a shape location for the observation bounding shape… the observation bounding shape can be a bounding polygon (e.g., rectangle), bounding polyhedron (e.g., cube)), wherein the geometric orientation of the object to be tracked corresponds to a spatial orientation of the bounding figure (Becker: Fig.
7; i.e., as displayed in Figure 7, the orientation of the bounding figure corresponds to the orientation of the tracked object).

Regarding claim 9, Becker teaches the method according to claim 1. Becker further teaches generating a camera image based on the environmental sensor data or providing the camera image in the environmental sensor data; and determining a bounding figure, wherein the bounding figure comprises a representation of the object to be tracked in the camera image, and wherein the bounding figure has a predefined shape (Becker: Par. 35; i.e., the observation model can fit the observation bounding shape around a plurality of observed points, thereby providing a shape location for the observation bounding shape… the observation bounding shape can be a bounding polygon (e.g., rectangle), bounding polyhedron (e.g., cube); Par. 66; i.e., for one or more cameras, various processing techniques can be performed to identify the location of a number of points that correspond to objects that are depicted in imagery captured by the one or more cameras), wherein the geometric orientation of the object to be tracked corresponds to a spatial orientation of the bounding figure (Becker: Fig. 7; i.e., as displayed in Figure 7, the orientation of the bounding figure corresponds to the orientation of the tracked object).

Regarding claim 10, Becker teaches the method according to claim 8. Becker further teaches wherein the predefined shape of the bounding figure corresponds to a rectangle and the spatial orientation of the bounding figure is parallel to one side of the rectangle, or the predefined shape of the bounding figure corresponds to a cuboid and the spatial orientation of the bounding figure is parallel to an edge of the cuboid (Becker: Par.
35; i.e., the observation bounding shape can be a bounding polygon (e.g., rectangle), bounding polyhedron (e.g., cube); as displayed in Figure 7, the orientation of the bounding shape is parallel to one side of the rectangle).

Regarding claim 11, Becker teaches the method according to claim 1. Becker further teaches wherein a method based on a Kalman filter is used to determine the second state (Becker: Par. 31; i.e., the vehicle filter can be or include an unscented Kalman filter).

Regarding claim 12, Becker teaches the method according to claim 1. Becker further teaches a method for at least partially automatically guiding an ego vehicle, the method comprising: carrying out, by the ego vehicle, the object tracking method as claimed in claim 1; and generating, by a control unit of the ego vehicle, at least one control signal for at least partially automatically guiding the ego vehicle (Becker: Par. 32; i.e., an autonomous vehicle can include a tracking system that tracks the locations of proximate vehicles or other objects based on sensor observations; Par. 74; i.e., the motion planning system 105 can provide the selected motion plan to a vehicle controller 106 that controls one or more vehicle controls 107 (e.g., actuators or other devices that control gas flow, steering, braking, etc.) to execute the selected motion plan), wherein generating the at least one control signal depends on the second state of the object to be tracked (Becker: Par. 72; i.e., The motion planning system 105 can determine a motion plan for the autonomous vehicle 10 based at least in part on the predicted one or more future locations for the object and/or the state data for the object provided by the perception system).

Regarding claim 13, Becker teaches an electronic vehicle guidance system for an ego vehicle (Becker: Par. 32; i.e., an autonomous vehicle can include a tracking system that tracks the locations of proximate vehicles or other objects based on sensor observations), the electronic vehicle guidance system comprising: at least one computing unit which is configured to estimate a first state of an object to be tracked based on a predefined movement model for the object to be tracked (Becker: Par. 31; i.e., the vehicle filter can employ a motion model that models the location of the tracked vehicle using a vehicle bounding shape and an observation model that infers the location of an observation bounding shape from sensor observations), wherein the first state comprises a first direction of movement of a point to be tracked (Becker: Par. 39; i.e., the dominant side for the observation bounding shape can be the side that includes the dominant vertex and has an orientation that is closest to an expected orientation (e.g., an expected heading of the tracked vehicle); the expected heading corresponds to a first direction of movement of the vertex); and an environmental sensor system which is configured to generate environmental sensor data representing the object to be tracked (Becker: Par. 63; i.e., the perception system 103 can receive sensor data from the one or more sensors; Par. 63; i.e., the sensor data can include information that describes the location of objects within the surrounding environment of the autonomous vehicle); wherein the at least one computing unit is configured to carry out a method comprising: determining a geometric orientation of the object to be tracked based on the environmental sensor data (Becker: Par. 69; i.e., the perception system 103 can determine, for each object, state data that describes a current state of such object. As examples, the state data for each object can describe an estimate of the object's: … current orientation); shifting the point to be tracked depending on a first deviation of the geometric orientation from the first direction of movement; and determining a second state of the object to be tracked depending on the first state and the shifted point (Becker: Par. 37; i.e., the vehicle filter can assign a weight to each of the one or more predicted shape locations predicted by the motion model, where the weight assigned to each predicted shape location corresponds to an amount of agreement between such predicted shape location and the shape location. In some implementations, a Gaussian representation of the estimated shape location or other estimated state parameters can be computed from the weights; Par. 40; i.e., the locations of the dominant vertex and/or side for each predicted shape location predicted by the motion model can be respectively compared to the locations of the dominant vertex and/or side of the observation bounding shape… predicted locations of the vehicle bounding shape can be updated or otherwise corrected based on the locations of the dominant vertex).

Regarding claim 14, Becker teaches the system according to claim 13. Becker further teaches wherein the environmental sensor system contains a camera and/or a lidar system and/or a radar system (Becker: Par. 63; i.e., the one or more sensors 101 can include a Light Detection and Ranging (LIDAR) system, a Radio Detection and Ranging (RADAR) system, one or more cameras (e.g., visible spectrum cameras, infrared cameras, etc.)).

Regarding claim 15, Becker teaches the method according to claim 1. Becker further teaches a computer program product containing instructions which, when executed by an electronic vehicle guidance system for an ego vehicle, cause the electronic vehicle guidance system to carry out the method as claimed in claim 1 (Becker: Par.
75; i.e., each of the perception system 103, the prediction system 104, the motion planning system 105, and the vehicle controller 106 includes one or more sets of computer-executable instructions that are stored in a tangible computer-readable storage medium), wherein the electronic vehicle guidance system comprises: the at least one computing unit; and the environmental sensor system (Becker: Par. 57; i.e., The vehicle computing system 102 includes one or more processors; Par. 62; i.e., the vehicle computing system 102 can include a perception system).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 2 and 4 are rejected under 35 U.S.C. 103 as being unpatentable over Becker and further in view of Hoang et al. (U.S. Patent No. 12110042; hereinafter Hoang).
Regarding claim 2, Becker teaches the method according to claim 1, but does not explicitly teach determining a current radius of movement of the point to be tracked based on the first state; and shifting the point to be tracked depending on the current radius of movement. However, in the same field of endeavor, Hoang teaches determining a current radius of movement of the point to be tracked based on the first state (Hoang: Col. 21, lines 27-29; i.e., a pursuit algorithm determines the curvature of an arc that will connect a first position (e.g., of a moving actor) to a second goal position); and shifting the point to be tracked depending on the current radius of movement (Hoang: Col. 16, lines 15-19; i.e., FIG. 3 is an illustration of a moving actor 300 having a reference point 302 (e.g., an autonomous platform, a vehicle, a pedestrian, any moving object, etc.) that is traveling along path 304, with projected future-time positions 310, 320, and 330; the point to be tracked is shifted along the traveling path based on the radius of movement). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Becker to have further incorporated determining a current radius of movement of the point to be tracked based on the first state; and shifting the point to be tracked depending on the current radius of movement, as taught by Hoang. Doing so would allow for improved autonomous vehicle performance (Hoang: Col. 2, lines 16-20; i.e., the improved trajectory generation techniques introduced in the present disclosure can provide for autonomous platforms (e.g., autonomous vehicles) to use available data to more capably understand, anticipate, and interact with their environment(s)).

Regarding claim 4, Becker in view of Hoang teaches the method according to claim 2. Hoang further teaches wherein the point to be tracked is shifted along a circular arc having a radius equal to the current radius of movement (Hoang: Col. 16, lines 15-19; i.e., FIG. 3 is an illustration of a moving actor 300 having a reference point 302 (e.g., an autonomous platform, a vehicle, a pedestrian, any moving object, etc.) that is traveling along path 304, with projected future-time positions 310, 320, and 330; the point to be tracked is shifted along the traveling path based on the radius of movement).

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Becker in view of Hoang and further in view of Smith et al. (U.S. Publication No. 2019/0317219; hereinafter Smith).

Regarding claim 3, Becker in view of Hoang teaches the method according to claim 2, but does not explicitly teach wherein the first state contains a translational velocity of the point to be tracked and an angular velocity of the point to be tracked. However, in the same field of endeavor, Smith teaches wherein the first state contains a translational velocity of the point to be tracked and an angular velocity of the point to be tracked (Smith: Par. 52; i.e., phase coherent LIDAR data can indicate, for each of a plurality of points in an environment … a velocity for the point; Par. 56; i.e., when the point is moving, a Doppler frequency shift will be superimposed to the IF waveform, which can be used to determine radial velocity of the point). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Becker and Hoang to have further incorporated wherein the first state contains a translational velocity of the point to be tracked and an angular velocity of the point to be tracked, as taught by Smith. Doing so would allow for improved accuracy (Smith: Par. 60; i.e., This enables determination of ranges and velocities with a degree of accuracy that, absent increased angular velocity techniques described herein, would otherwise require a greater quantity of beams and sensing events in a sensing cycle).

While Becker in view of Hoang and Smith do not explicitly teach the current radius of movement being determined as a ratio of the translational velocity to the angular velocity, the equation for translational velocity is well known in the art as turning radius multiplied by the angular velocity, which is rearranged to determine the turning radius as a ratio of the translational velocity to the angular velocity.

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Becker in view of Hoang and further in view of Schiffman et al. (U.S. Patent No. 10816344; hereinafter Schiffman).

Regarding claim 5, Becker in view of Hoang teaches the method according to claim 4, but does not explicitly teach wherein the first deviation is determined as a first angular difference between the geometric orientation and the first direction of movement. However, in the same field of endeavor, Schiffman teaches wherein the first deviation is determined as a first angular difference between the geometric orientation and the first direction of movement (Schiffman: Col. 6, lines 14-16; i.e., the processor 50 determines a difference between the refined pointing angle determined at 92 and the initial pointing angle determined). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Becker and Hoang to have further incorporated wherein the first deviation is determined as a first angular difference between the geometric orientation and the first direction of movement, as taught by Schiffman. Doing so would allow the system to determine if the angles have converged (Schiffman: Col.
6, lines 16-18; i.e., the processor 50 determines whether the difference between the initial pointing angle and the refined pointing angle indicate convergence).

While Becker in view of Hoang and Schiffman does not explicitly teach shifting the point being carried out by a circular arc section having a length L = D*R, where D denotes the first angular difference and R denotes the current radius of movement, the equation for the length of an arc section is well known in the art as the angular difference multiplied by the radius.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Additional prior art deemed pertinent in the art of object tracking includes Kellner (U.S. Publication No. 2023/0094836) and Cennamo et al. (U.S. Publication No. 2022/0120858).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRANDON Z WILLIS whose telephone number is (571)272-5427. The examiner can normally be reached weekdays 8:00-5:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Erin D. Bishop, can be reached at (571) 270-3713. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BRANDON Z WILLIS/
Examiner, Art Unit 3665
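The kinematic relations the examiner invokes against claims 3 and 5 above (translational velocity v = R·ω, so R = v/ω, and arc length L = D·R) can be sketched as a small point-shift routine. This is an illustrative sketch of the claimed geometry only, not the applicant's or any cited reference's implementation; the function and variable names are hypothetical, and it assumes a 2-D state with a nonzero angular velocity (the straight-line case, ω = 0, would need separate handling):

```python
import math

def shift_point_along_arc(x, y, heading, v, omega, geometric_orientation):
    """Shift a tracked point along a circular arc.

    The turning radius follows from v = R * omega, i.e. R = v / omega,
    and the arc length from L = D * R, where D is the angular difference
    between the measured geometric orientation and the current direction
    of movement. Assumes omega != 0.
    """
    R = v / omega                         # current radius of movement
    D = geometric_orientation - heading   # first angular difference (rad)
    L = D * R                             # arc-section length, L = D * R

    # Rotate the point about the arc's center by the angle D.
    cx = x - R * math.sin(heading)        # center of the circular arc
    cy = y + R * math.cos(heading)
    new_heading = heading + D
    nx = cx + R * math.sin(new_heading)
    ny = cy - R * math.cos(new_heading)
    return nx, ny, new_heading, L

# Example: 10 m/s at 0.5 rad/s gives a 20 m radius; a 0.1 rad
# orientation deviation then corresponds to a 2 m arc section.
print(shift_point_along_arc(0.0, 0.0, 0.0, 10.0, 0.5, 0.1))
```

In this sketch the shifted point's new heading equals the measured geometric orientation, which matches the claim 6/7 language that the second deviation is smaller than (here, reduced to zero relative to) the first.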

Prosecution Timeline

Aug 16, 2024
Application Filed
Dec 30, 2025
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602931: IDENTIFICATION OF UNKNOWN TRAFFIC OBJECTS (2y 5m to grant; granted Apr 14, 2026)
Patent 12589767: SYSTEMS AND METHODS FOR GENERATING A DRIVING TRAJECTORY (2y 5m to grant; granted Mar 31, 2026)
Patent 12545299: DYNAMICALLY WEIGHTING TRAINING DATA USING KINEMATIC COMPARISON (2y 5m to grant; granted Feb 10, 2026)
Patent 12534072: TRANSPORT DANGEROUS SITUATION CONSENSUS (2y 5m to grant; granted Jan 27, 2026)
Patent 12528483: METHOD, ELECTRONIC DEVICE AND MEDIUM FOR TARGET STATE ESTIMATION (2y 5m to grant; granted Jan 20, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 69%
Grant Probability with Interview: 99% (+38.3%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 203 resolved cases by this examiner. Grant probability derived from career allow rate.
