DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/08/2026 has been entered.
Status of Claims
Claims 1-20 are pending. Claims 1, 4, 9, 13, and 15 have been amended in the reply filed 11/25/2025.
Response to Amendment/Arguments
Applicant's arguments filed on 11/25/2025 have been fully considered as discussed below.
Regarding the rejection under 35 U.S.C. 112(b), applicant's arguments, see page 10 of Remarks, have been fully considered but they are not persuasive in view of the amendments. The claims were amended to recite new limitations, e.g., “first status”, “second status”, and “third status”. However, the amendments do not resolve the rejections under 35 U.S.C. 112(b) set forth in the previous Office Action mailed 10/15/2025. Moreover, the amendments also raise new issues, as discussed below. Therefore, the rejections are maintained and discussed below.
Regarding the rejection under 35 U.S.C. 112(a), applicant's arguments, see page 9 of Remarks, have been fully considered but they are not persuasive in view of the amendments. The claims were amended to recite new limitations, e.g., “first status”, “second status”, and “third status”. However, the amendments do not resolve the rejections under 35 U.S.C. 112(a) set forth in the previous Office Action mailed 10/15/2025. Moreover, the amendments also raise new issues, as discussed below. Therefore, the rejections are maintained and discussed below.
Regarding the rejections under 35 U.S.C. 102, applicant's arguments, see page 9 of Remarks, have been fully considered but are moot in view of the new grounds of rejection provided below, in light of newly found prior art, which was necessitated by Applicant's amendments, which changed the scope of the claims.
Claim Rejections - 35 USC § 112(a)
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1-20 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Regarding claim 1, the Applicant claims, “assigning a first parameter to the object, the first parameter indicating an object status of the object as a first status” and “determining that the first parameter is assigned to the object”; however, the Applicant fails to teach in such full, clear, concise, and exact terms as to enable one skilled in the art how the respective limitation is implemented (i.e., performed, executed, etc.), and therefore claim 1 is rejected under this section. Specifically, Applicant's disclosure is silent regarding the term “parameter”. The instant specification states, “[t]he method illustrated in FIG. 2 operates on only two statuses (in other words: with one of two parameters), on object level, and the parameters are “mature” and “coasted”. Once the object is created, in the method of FIG. 2, its status is “mature”. It can be changed to “coasted” in case when there is no measurement data (in other words: when measurement data 206 may not be associated with the object)” (published paragraph [0065]) and “[i]t will be understood that “status” is different from “state”. “Status” refers to the parameter assigned to an object (for example “young”, “mature”, or “coasted”). “State” refers to the state of the object, for example the linear velocity, angular velocity, heading angle, height, speed, and/or acceleration” (published paragraph [0070]). However, there is no disclosure defining the “young”/“mature”/“coasted” parameters in terms of an established technical meaning to the skilled person. Moreover, the disclosure is also silent regarding the claimed “assigning a first parameter to the object, the first parameter indicating an object status of the object as a first status”. The instant specification states, “FIG. 4 shows a flow diagram 400 illustrating a method for tracking an object according to various embodiments. At 402, upon first detection of the object, a first parameter may be assigned to the object” (published par. [0071]).
However, there is no disclosure regarding what method/steps/conditions are used to assign the first parameter to the object or to relate the object to the status “young”. Assigning a parameter/status amounts to merely labeling the object with the “young” parameter/status, as stated in published paragraphs [0065], [0070], and [0071]. In addition, the disclosure is silent regarding the claimed “determining that the first parameter is assigned to the object”. The specification states, “If it is determined in step 210 that the measurement data 206 can be associated with the object, then it is determined whether the object is “young” in step 302 (in other words, it is determined whether the status of the object is “young”; in other words: it is determined whether the parameter “young” is assigned to the object)” (published paragraph [0052]). Therefore, it is not positively disclosed how the determination that the object has the “young” or first parameter/status is made based on whether the measurement data is associated with the object. Both of the steps “assigning a first parameter to the object, the first parameter indicating an object status of the object as a first status” and “determining that the first parameter is assigned to the object” are directed to the core of the invention, yet the claim merely states a result to be achieved without providing the technical features that achieve it. Therefore, the claim is rejected under 35 U.S.C. 112(a). Accordingly, appropriate correction and/or clarification are earnestly solicited.
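For purposes of illustration only, the labeling scheme that published paragraphs [0065] and [0070]-[0071] appear to describe could be sketched as follows; the type names, fields, and checks below are hypothetical assumptions, since the disclosure itself provides no such implementation:

```python
from dataclasses import dataclass
from enum import Enum


class Status(Enum):
    # Status labels named in published paragraphs [0065] and [0070]
    YOUNG = "young"
    MATURE = "mature"
    COASTED = "coasted"


@dataclass
class TrackedObject:
    state: tuple                   # e.g., velocity, heading, etc. (par. [0070])
    status: Status = Status.YOUNG  # the "first parameter", set on first detection


def on_first_detection(state):
    # "upon first detection of the object, a first parameter may be
    # assigned to the object" (published par. [0071])
    return TrackedObject(state=state)


def has_first_parameter(obj):
    # On this reading, the claimed "determining that the first parameter
    # is assigned to the object" reduces to checking the label.
    return obj.status is Status.YOUNG
```

As the sketch shows, on this reading "assigning" and "determining" the first parameter amount to setting and testing a label, consistent with the deficiency noted above.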
Regarding claim 1, the Applicant further claims, “determining whether a second parameter is to be assigned to the object, the second parameter indicating the object status of the object as a second status and no longer the first status, the object status of the object being assigned based on (a) a time since the object status of the first status is assigned to the object, or (b) an availability of computational resources for tracking the object”; however, the Applicant fails to teach in such full, clear, concise, and exact terms as to enable one skilled in the art how the respective limitation is implemented (i.e., performed, executed, etc.), and therefore claim 1 is rejected under this section. Applicant's disclosure states that the step of “determining whether a second parameter is to be assigned to the object, the object status of the object being assigned based on (a) a time since the object status of the first status is assigned to the object, or (b) an availability of computational resources for tracking the object” is performed after the re-initializing step (Fig. 3 and published paragraphs [0054], [0058]-[0063]). It is unclear whether claim 1 recites two different embodiments, as the specification shows in Fig. 3 and Fig. 4. Therefore, the claim is rejected under 35 U.S.C. 112(a). Accordingly, appropriate correction and/or clarification are earnestly solicited.
Regarding claim 1, the Applicant further claims, “responsive to receiving measurement data in an iteration after the first detection of the object: determining whether the measurement data can be associated with the object; responsive to determining the measurement data can be associated with the object as a subsequent detection of the object”; however, the Applicant fails to teach in such full, clear, concise, and exact terms as to enable one skilled in the art how the respective limitation is implemented (i.e., performed, executed, etc.), and therefore claim 1 is rejected under this section. Applicant's disclosure states, “The updated state 216 may be taken as previous state 202 for a subsequent iteration of the update method illustrated in FIG. 2” (Fig. 3, published paragraph [0049]). However, the recited claim is confusing because, according to step 216 and published paragraph [0048], the object is assigned the second/“mature” parameter, which becomes the previous state 202 in Fig. 3. It is unclear whether the claimed “responsive to receiving measurement data in an iteration after the first detection of the object” is related to “responsive to receiving a first detection of the object” in line 3 of claim 1 or to a different iteration after the first detection of the object. It is also unclear what exactly the steps/algorithm are that relate to the claimed “responsive to receiving measurement data in an iteration after the first detection of the object”. Regarding the claimed “determining whether the measurement data can be associated with the object; responsive to determining the measurement data can be associated with the object as a subsequent detection of the object”, applicant's disclosure states, “If the object cannot be associated with the measurement data 206, then this may be for example due to the object being temporarily obstructed (i.e. not visible in the measurement data 206), and the object may be treated as coasting in step 214, in order to obtain the updated state 218 assuming a continuing (for example smooth) advance in the state (for example without taking into account the measurement data 206)” (published paragraph [0047]). However, the disclosure does not provide any steps/algorithm for determining whether the measurement data can be associated with the object as a subsequent detection of the object, such as how to determine that the object is not visible in the measurement data. The step of “determining whether the measurement data can be associated with the object” is directed to the core of the invention, yet the claim merely states a result to be achieved without providing the technical features that achieve it. Therefore, the claim is rejected under 35 U.S.C. 112(a). Accordingly, appropriate correction and/or clarification are earnestly solicited.
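For purposes of illustration only, association and coasting logic of the kind published paragraph [0047] alludes to could be implemented, for example, as sketched below; the distance-gating rule, the gate value, and the constant-velocity state layout are all hypothetical assumptions, none of which is disclosed:

```python
import math


def can_associate(predicted_pos, measured_pos, gate=2.0):
    # Hypothetical gating test: the disclosure names no association rule,
    # so a simple Euclidean distance gate stands in for one here.
    dx = predicted_pos[0] - measured_pos[0]
    dy = predicted_pos[1] - measured_pos[1]
    return math.hypot(dx, dy) <= gate


def coast(state, dt):
    # Constant-velocity advance of a state (x, y, vx, vy): one possible
    # reading of "assuming a continuing (for example smooth) advance in
    # the state" without the measurement data (published par. [0047]).
    x, y, vx, vy = state
    return (x + vx * dt, y + vy * dt, vx, vy)
```

The gate value and motion model above are merely placeholders; the disclosure provides neither, which is the deficiency noted in this rejection.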
Regarding claim 1, the Applicant further claims, “by a vehicle, based on the state of the object, at least partially-autonomously operating the vehicle”; however, the Applicant fails to teach in such full, clear, concise, and exact terms as to enable one skilled in the art how the respective limitation is implemented (i.e., performed, executed, etc.), and therefore claim 1 is rejected under this section. Specifically, Applicant's disclosure is silent regarding the claimed “by a vehicle, based on the state of the object, at least partially-autonomously operating the vehicle”. The heart of this invention is tracking and updating the status of an object. The Applicant fails to teach, in such full, clear, concise, and exact terms, the steps/algorithm for at least partially autonomously operating the vehicle based on the state of the object. Therefore, the claim is rejected under 35 U.S.C. 112(a). Accordingly, appropriate correction and/or clarification are earnestly solicited.
Regarding claims 13 and 15, these claims recite limitations similar to those of claim 1. Therefore, claims 13 and 15 are also rejected under 35 U.S.C. 112(a) for the same reasons discussed above.
Regarding claim 9, the Applicant further claims, “wherein determining whether the second parameter is to be assigned to the object further comprises: responsive to determining the second condition is fulfilled, determining whether assignment of the second parameter is possible, the determination of whether the assignment of possible being made based on whether there is a threshold of measurement data to assign the second parameter”; however, the Applicant fails to teach in such full, clear, concise, and exact terms as to enable one skilled in the art how the respective limitation is implemented (i.e., performed, executed, etc.), and therefore claim 9 is rejected under this section. Specifically, Applicant's disclosure is silent regarding the claimed “the determination of whether the assignment of possible being made based on whether there is a threshold of measurement data to assign the second parameter”. The specification states, “According to another aspect, the computer implemented method further comprises the following step carried out by the computer hardware components: responsive to determining the second condition is fulfilled, determining whether assignment of the second parameter is possible. This determination may be a further check to determine whether there is enough good measurements to transit object to mature state in order to avoid creating low quality objects (for example to check if the object was de-aliased)” (published paragraph [0023]). The disclosure is silent regarding what steps/algorithm determine the threshold of measurement data and, further, how the threshold of measurement data relates to “the determination of whether the assignment of possible being made based on whether there is a threshold of measurement data to assign the second parameter”.
The step of “determining whether the second parameter is to be assigned to the object” is directed to the core of the invention, yet the claim merely states a result to be achieved without providing the technical features that achieve it. Therefore, the claim is rejected under 35 U.S.C. 112(a). Accordingly, appropriate correction and/or clarification are earnestly solicited.
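For purposes of illustration only, the “further check to determine whether there is enough good measurements” of published paragraph [0023] could be read, for example, as a simple count-against-threshold test; the function name and the threshold value below are hypothetical assumptions, neither of which is disclosed:

```python
def assignment_possible(good_measurement_count, threshold=5):
    # Hypothetical reading of the "further check to determine whether
    # there is enough good measurements" (published par. [0023]); the
    # threshold value itself is nowhere disclosed.
    return good_measurement_count >= threshold
```

Even under this minimal reading, the disclosure supplies neither the threshold nor the criterion for what counts as a “good” measurement.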
The dependent claims are also rejected based on their dependence from a previously rejected base claim. Accordingly, appropriate correction and/or clarification are earnestly solicited.
Claim Rejections - 35 USC § 112(b)
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claim 1, the Applicant claims, “assigning a first parameter to the object, the first parameter indicating an object status of the object as a first status” and “determining that the first parameter is assigned to the object”; however, based on the currently provided claim language, it is unclear what the metes and bounds of the term “parameter” are and, further, how the term “parameter” is applied to indicate an object status when the object is detected. Specifically, Applicant's disclosure is silent regarding the term “parameter”. The instant specification states, “[t]he method illustrated in FIG. 2 operates on only two statuses (in other words: with one of two parameters), on object level, and the parameters are “mature” and “coasted”. Once the object is created, in the method of FIG. 2, its status is “mature”. It can be changed to “coasted” in case when there is no measurement data (in other words: when measurement data 206 may not be associated with the object)” (published paragraph [0065]) and “[i]t will be understood that “status” is different from “state”. “Status” refers to the parameter assigned to an object (for example “young”, “mature”, or “coasted”). “State” refers to the state of the object, for example the linear velocity, angular velocity, heading angle, height, speed, and/or acceleration” (published paragraph [0070]). However, there is no disclosure defining the “young”/“mature”/“coasted” parameters in terms of an established technical meaning to the skilled person. Apparently, assigning a parameter/status amounts to merely labeling the object with the “young” parameter/status, as stated in published paragraphs [0065], [0070], and [0071]. In addition, the disclosure is silent regarding the claimed “determining that the first parameter is assigned to the object”.
The specification states, “If it is determined in step 210 that the measurement data 206 can be associated with the object, then it is determined whether the object is “young” in step 302 (in other words, it is determined whether the status of the object is “young”; in other words: it is determined whether the parameter “young” is assigned to the object)” (published paragraph [0052]). Therefore, it is not positively disclosed how the determination that the object has the “young” or first parameter/status is made based on whether the measurement data is associated with the object. Both of the steps “assigning a first parameter to the object, the first parameter indicating an object status of the object as a first status” and “determining that the first parameter is assigned to the object” are directed to the core of the invention, yet the claim merely states a result to be achieved without providing the technical features that achieve it. Therefore, this renders the claim indefinite because the claim is incomplete for omitting essential steps. Accordingly, appropriate correction and/or clarification are earnestly solicited.
Regarding claim 1, the Applicant further claims, “responsive to receiving measurement data in an iteration after the first detection of the object”; however, based on the currently provided claim language, it is unclear what “an iteration” is referring to and whether “an iteration after the first detection of the object” is referring to “a first detection of the object” in line 3 of claim 1. If it is referring to “a first detection of the object” in line 3 of claim 1, the claimed “determining that the first parameter is assigned to the object” would be confusing because the first parameter is already assigned to the object when the object is detected. If it is not referring to “a first detection of the object” in line 3 of claim 1, it is unclear whether the claim recites two different embodiments. Applicant's disclosure states, “The updated state 216 may be taken as previous state 202 for a subsequent iteration of the update method illustrated in FIG. 2” (Fig. 3, published paragraph [0049]). However, the recited claim is confusing because, according to step 216 and published paragraph [0048], the object is assigned the second/“mature” parameter, which becomes the previous state 202 in Fig. 3. Therefore, this renders the claim indefinite. Accordingly, appropriate correction and/or clarification are earnestly solicited.
Regarding claim 1, the Applicant further claims, “re-initializing the state of the object in response to determining that the first parameter is assigned to the object; updating the state of the object in response to determining the second parameter is assigned to the object”; however, based on the currently provided claim language, it is unclear whether the claim recites two different embodiments. According to Fig. 4, the step of determining whether a second parameter is to be assigned to the object is performed after the re-initializing step. However, claim 1 recites the step of determining whether a second parameter is to be assigned to the object before the re-initializing step. Therefore, this renders the claim indefinite. Accordingly, appropriate correction and/or clarification are earnestly solicited.
Regarding claims 13 and 15, these claims recite limitations similar to those of claim 1. Therefore, claims 13 and 15 are rejected under 35 U.S.C. 112(b) for the same reasons discussed above.
Regarding claim 4, the Applicant claims, “determining whether a first condition is fulfilled based on a variance of the state of the object”; however, based on the currently provided claim language, it is unclear what the metes and bounds of the claimed “variance of the state of the object” are and, further, how the claimed “variance of the state of the object” is applied to determine whether a first condition is fulfilled. The specification states, “The first criterion (“soft” criterion) may be provided to speed up object initialization. Each time when the object is re-initialized (or just initialized), additional checks are performed to verify if current parameters of object are good enough to leave status “young”. For example, the first criterion may be related to acceptable estimated variance of the state (or each component of the state) of the object. If the variance is low enough, then the tracker may carry out the transition from “young” to “mature” in step 216” (published paragraph [0059]). However, it is unclear how to determine that a variance of the state of the object is acceptable or low enough such that the first condition is fulfilled. Therefore, this renders the claim indefinite. Accordingly, appropriate correction and/or clarification are earnestly solicited.
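For purposes of illustration only, the “soft” criterion of published paragraph [0059] could be read, for example, as a per-component variance bound; the bound value and the all-components comparison rule below are hypothetical assumptions, since the disclosure fixes neither:

```python
def first_condition_fulfilled(state_variances, max_variance=0.25):
    # One possible reading of the "soft" criterion of published par.
    # [0059]: every state-component variance must fall below a bound;
    # the disclosure fixes neither the bound nor the comparison rule.
    return all(v <= max_variance for v in state_variances)
```

The indefiniteness noted above stems precisely from the absence of any disclosed bound or rule of this kind.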
Claims 5-11 are also rejected based on their dependence from a previously rejected base claim. Accordingly, appropriate correction and/or clarification are earnestly solicited.
Regarding claim 18, the claim recites limitations similar to those of claim 4. Therefore, claim 18 is rejected under 35 U.S.C. 112(b) for the same reasons as previously stated.
Claims 19 and 20 are also rejected based on their dependence from a previously rejected base claim. Accordingly, appropriate correction and/or clarification are earnestly solicited.
Regarding claim 9, the Applicant further claims, “responsive to determining the second condition is fulfilled, determining whether assignment of the second parameter is possible, the determination of whether the assignment of possible being made based on whether there is a threshold of measurement data to assign the second parameter”; however, based on the currently provided claim language, it is unclear what the metes and bounds of the claimed “a threshold of measurement data” are and, further, how the determination of “whether there is a threshold of measurement data” relates to “assign the second parameter”. Specifically, Applicant's disclosure is silent regarding the claimed “the determination of whether the assignment of possible being made based on whether there is a threshold of measurement data to assign the second parameter”. The specification states, “According to another aspect, the computer implemented method further comprises the following step carried out by the computer hardware components: responsive to determining the second condition is fulfilled, determining whether assignment of the second parameter is possible. This determination may be a further check to determine whether there is enough good measurements to transit object to mature state in order to avoid creating low quality objects (for example to check if the object was de-aliased)” (published paragraph [0023]). The disclosure is silent regarding a definition of the threshold of measurement data and, further, how the threshold of measurement data relates to “the determination of whether the assignment of possible being made based on whether there is a threshold of measurement data to assign the second parameter”. The step of “determining whether the second parameter is to be assigned to the object” is directed to the core of the invention, yet the claim merely states a result to be achieved without providing the technical features that achieve it. Therefore, this renders the claim indefinite.
Accordingly, appropriate correction and/or clarification are earnestly solicited.
Claims 10 and 11 are also rejected based on their dependence from a previously rejected base claim. Accordingly, appropriate correction and/or clarification are earnestly solicited.
The remaining dependent claims are also rejected based on their dependence from a previously rejected base claim. Accordingly, appropriate correction and/or clarification are earnestly solicited.
The Examiner notes that the claims have been addressed below in view of the prior art, as best understood by the Examiner, in light of the 35 U.S.C. 112 rejections provided herein. Accordingly, appropriate correction and/or clarification are earnestly solicited.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Das et al. (US 20210181758 A1, hereinafter “Das”).
Regarding claim 1, Das discloses a method, comprising:
tracking, by computer hardware components, an object (Das, see at least Fig. 1, par. [0091], tracking object 138 by sensor(s) 104 of vehicle 102) by at least:
responsive to receiving a first detection of the object (Das, see at least Fig. 4, par. [0024], “At operation 402, example process 400 may comprise receiving a first object detection associated with a first sensor type and a second object detection associated with a second sensor type, according to any of the techniques discussed herein”):
assigning a first parameter to the object, the first parameter indicating an object status of the object as a first status (Das, see at least Fig. 4, par [0093], “At operation 416, example process 400 may comprise receiving a track associated with an object in an environment, according to any of the techniques discussed herein”);
predicting a state of the object (Das, see at least Fig. 4, par [0095, 0097], “At operation 424, example process 400 may comprise receiving, as output from the ML model, an estimated object detection 426, according to any of the techniques discussed herein”);
determining whether a second parameter is to be assigned to the object, the second parameter indicating the object status of the object as a second status and no longer the first status (Das, see at least Fig. 4, par [0095, 0097], “At operation 424, example process 400 may comprise receiving, as output from the ML model, an estimated object detection 426, according to any of the techniques discussed herein”), the object status of the object being assigned based on (a) a time since the object status of the first status is assigned to the object (Das, see at least Fig. 4, par [0108], “Updating a track may comprise associating one of the estimated object detections with the track, indicating that the track is associated with an object that is likely partially or fully occluded (e.g., an occluded status flag may be altered), and/or retiring the track. Retiring the track may comprise indicating that an object associated with the track has been occluded for at least a threshold amount of time, the object is likely no longer within a field of view, and/or deleting the track”);
responsive to receiving measurement data in an iteration after the first detection of the object (Das, see at least Fig. 4, par. [0091], step 402, responsive to receiving measurement data from the object detection 404/410):
determining whether the measurement data can be associated with the object (Das, see at least par. [0035], determine whether the measurement data can be associated with the object based on occlusion status, e.g., whether the object is currently/previously occluded partially or totally from one or more sensors);
responsive to determining the measurement data can be associated with the object as a subsequent detection of the object (Das, see at least Fig. 3, par. [0085], “As used herein an object detection may be a portion of one or more environment representations that indicate the existence of an object, such as an ROI, a positive occupancy indication, an object classification, etc.”):
determining that the first parameter is assigned to the object (Das, see at least Fig. 4, par [0093], “At operation 416, example process 400 may comprise receiving a track associated with an object in an environment, according to any of the techniques discussed herein”);
re-initializing the state of the object in response to determining that the first parameter is assigned to the object (Das, see at least Fig. 4, par [0095], “At operation 422, example process 400 may comprise inputting the first object detection, the second object detection, and/or the track into an ML model, according to any of the techniques discussed herein”);
updating the state of the object in response to determining the second parameter is assigned to the object (Das, see at least Fig. 4, par [0108], “At operation 432, example process 400 may comprise determining an updated (432) […] associated with the object based at least in part on the estimated object detection, according to any of the techniques discussed herein […] Updating a track may comprise associating one of the estimated object detections with the track, indicating that the track is associated with an object that is likely partially or fully occluded (e.g., an occluded status flag may be altered), and/or retiring the track. Retiring the track may comprise indicating that an object associated with the track has been occluded for at least a threshold amount of time, the object is likely no longer within a field of view, and/or deleting the track”);
responsive to determining the measurement data cannot be associated with the object (Das, see at least Fig. 4, par [0108], “Once all the tracks have been updated any remaining estimated object detections that have not been associated with a track may be passed to an alternate tracking component and/or new track(s) may be generated in association therewith”):
assigning a third parameter to the object, the third parameter indicating the object status of the object as a third status (Das, see at least Fig. 4, par [0108], “Once all the tracks have been updated any remaining estimated object detections that have not been associated with a track may be passed to an alternate tracking component and/or new track(s) may be generated in association therewith”); and
predicting an update of the state of the object assuming a continuous behavior of the object from a previous state of the object in the first detection or a previous iteration before the iteration (Das, see at least Fig. 5, par. [0117], “At operation 512, example process 500 may comprise generating a new track associated with the estimated object detection and/or providing the first object detection, the second object detection, and/or the estimated object detection to an alternate tracking component, according to any of the techniques discussed herein. Generating a new track indicates that the estimated object detection is associated with an object that was not previously detected by any of the perception pipelines and/or was not within a field of view of the sensors associated therewith. In an additional or alternate example, the raw object detections from the one or more pipelines and/or the track 418 may be provided as input to an alternate tracking component. In some examples, the alternate tracking component may be configured to determine a new track or update a former track based at least in part on comparing object detections from multiple perception pipelines”); and
by a vehicle, based on the state of the object, at least partially-autonomously operating the vehicle (Das, see at least Fig. 1, par. [0010, 0026], the controller 116 is configured to operate the vehicle based on predicting motion/behavior of the object and to determine a trajectory and/or path for controlling the autonomous vehicle 102);
wherein tracking the object with the first parameter assigned is computationally more expensive than tracking the object with the second parameter assigned (Das, see at least par. [0089], “In some examples, the contingent tracking component may be a tracking component configured to receive the raw environment representations from the pipeline(s) and determine a track therefrom. The contingent tracking component may, in some cases, require more compute and/or memory since the contingent tracking component uses more data between the different pipelines to determine whether an object detection is a false positive and/or whether to associate an object detection with a former track or a new track”).
Regarding claim 2, Das teaches all the limitations of claim 1 as discussed above. Das further teaches wherein the state of the object includes at least one of a linear velocity, an angular velocity, a heading angle, a height, a speed, or an acceleration of the object (Das, see at least Fig. 1, par. [0035], “In some examples, the track 136 may associate one or more previous object detections and/or may indicate data related thereto, such as a velocity, acceleration, heading, object classification, unique identifier, occlusion status (e.g., whether the object is currently/previously occluded partially or totally from one or more sensors)”).
Regarding claim 3, Das teaches all the limitations of claim 1 as discussed above. Das further teaches wherein re-initializing the state of the object comprises re-initializing the state of the object based on measurements associated with the object (Das, see at least Fig. 4, par [0095], “At operation 422, example process 400 may comprise inputting the first object detection, the second object detection, and/or the track into an ML model, according to any of the techniques discussed herein”; par. [0096], “Note that the object detections may be provided to the ML model as part of an environment representation. For example, environment representation 406 comprises multiple object detections and unillustrated data, such as object velocities, estimated heights, and more, as discussed above. In some examples, the environment representations may be aggregated and provided to the ML model as input. In some examples, an object detection may be isolated from the rest of the environment representation and provided as input. For example, the environment representations may be in a common reference frame or may be converted to a common reference frame during the aggregation. The pipeline(s) may be configured to output positive object detections along with their coordinates in the common reference frame. For example, these positive object detections may be portions of the environment representation associated with a likelihood that meets or exceeds a threshold confidence. Each and any of the object detection components discussed above may be associated with a regressed confidence score. For example, an object classification may be associated with a confidence score, an ROI may be determined based at least in part on confidence scores associated with different pixels via a non-maximum suppression technique, occupancy may be determined based at least in part on a likelihood associated with each pixel and determined by an ML model of a respective pipeline, and so on”).
Regarding claim 4, Das teaches all the limitations of claim 1 as discussed above. Das further teaches wherein determining whether the second parameter is to be assigned to the object comprises: determining whether a first condition is fulfilled based on a variance of the state of the object (Das, see at least Figs. 4, 5, par. [0020, 0109-0110], determining whether a degree of association between the estimated object detection and the projected region of interest satisfies a threshold degree of association, such as an object classification, predicted velocity, predicted orientation, predicted height, predicted position, and/or predicted embedding).
Regarding claim 5, Das teaches all the limitations of claims 1 and 4 as discussed above. Das further teaches wherein determining whether the second parameter is to be assigned to the object further comprises: responsive to determining the first condition is fulfilled, assigning the second parameter to the object (Das, see at least Fig. 5, par. [0114, 0115], “At operation 508, example process 500 may comprise determining whether the degree of association satisfies a threshold degree of association, according to any of the techniques discussed herein. A degree of association that satisfies the threshold degree of association indicates that the object associated with estimated object detection 426 is/is likely the same object as indicated by track 418. If the degree of association satisfies the threshold degree of association, example process 500 may continue to operation 510”; par. [0116], “At operation 510, example process 500 may comprise associating the estimated object detection 426 with the track 418 as an updated track 430, according to any of the techniques discussed herein”).
Regarding claim 6, Das teaches all the limitations of claims 1 and 4 as discussed above. Das further teaches wherein determining whether the second parameter is to be assigned to the object further comprises: responsive to determining the first condition is not fulfilled, determining whether a second condition is fulfilled (Das, see at least par. [0021], “However, if the degree of association does not meet the threshold, the techniques may comprise testing the estimated object detection with any other projected ROIs (e.g., other ROIs that overlap the estimated object detection or are within a threshold distance)”).
Regarding claims 7 and 8, the claims recite alternative limitations, of which limitation (b) is one alternative. Since Das anticipates the other claimed alternative limitation (a), and the claims require only one of the alternatives, claims 7 and 8 are also anticipated by the prior art.
Regarding claim 9, Das teaches all the limitations of claims 1, 4, and 6 as discussed above. Das further teaches wherein determining whether the second parameter is to be assigned to the object further comprises: responsive to determining the second condition is fulfilled, determining whether assignment of the second parameter is possible, the determination of whether the assignment is possible being made based on whether there is a threshold of measurement data to assign the second parameter (Das, see at least par. [0021, 0115], “However, if the degree of association does not meet the threshold, the techniques may comprise testing the estimated object detection with any other projected ROIs (e.g., other ROIs that overlap the estimated object detection or are within a threshold distance) and/or generating a new track in association with the estimated object detection if no projected ROI matches the estimated object detection”).
Regarding claim 10, Das teaches all the limitations of claims 1, 4, 6, and 9 as discussed above. Das further teaches wherein determining whether the second parameter is to be assigned to the object further comprises: responsive to determining assignment of the second parameter is not possible, ignoring the object (Das, see at least par. [0108], “Retiring the track may comprise indicating that an object associated with the track has been occluded for at least a threshold amount of time, the object is likely no longer within a field of view, and/or deleting the track”).
Regarding claim 11, Das teaches all the limitations of claims 1, 4, 6, and 9 as discussed above. Das further teaches wherein various steps of the tracking of the object are carried out iteratively (Das, see at least Figs. 4, 5).
Regarding claim 12, Das teaches all the limitations of claim 1 as discussed above. Das further teaches wherein predicting the state of the object and updating the state of the object comprise using a Kalman filter to determine the state of the object (Das, see at least par. [0028-0030], “In some examples, the perception component 110 may comprise a pipeline of hardware and/or software, which may include one or more GPU(s), ML model(s), Kalman filter(s), and/or the like”).
Regarding claim 13, Das discloses a system (Das, see at least Figs. 1, 2, par. [0038], system 200 may include a vehicle 202, which may represent the vehicle 102 in FIG. 1), comprising:
a computer system having computer components (Das, see at least Figs. 1, 2, par. [0039], “The vehicle 202 may include a vehicle computing device(s) 204, sensor(s) 206, emitter(s) 208, network interface(s) 210, and/or drive component(s) 212. Vehicle computing device(s) 204 may represent computing device(s) 106 and sensor(s) 206 may represent sensor(s) 104. The system 200 may additionally or alternatively comprise computing device(s) 214”) configured to:
track an object (Das, see at least Fig. 1, par. [0091], tracking object 138 by sensor(s) 104 of vehicle 102) by at least:
responsive to receiving a first detection of the object (Das, see at least Fig. 4, par. [0024], “At operation 402, example process 400 may comprise receiving a first object detection associated with a first sensor type and a second object detection associated with a second sensor type, according to any of the techniques discussed herein”):
assign a first parameter to the object, the first parameter indicating an object status of the object as a first status (Das, see at least Fig. 4, par [0093], “At operation 416, example process 400 may comprise receiving a track associated with an object in an environment, according to any of the techniques discussed herein”);
predict a state of the object (Das, see at least Fig. 4, par [0095, 0097], “At operation 424, example process 400 may comprise receiving, as output from the ML model, an estimated object detection 426, according to any of the techniques discussed herein”);
determine whether a second parameter is to be assigned to the object, the second parameter indicating the object status of the object as a second status and no longer the first status (Das, see at least Fig. 4, par [0095, 0097], “At operation 424, example process 400 may comprise receiving, as output from the ML model, an estimated object detection 426, according to any of the techniques discussed herein”), the object status of the object being assigned based on (a) a time since the object status of “young” is assigned to the object (Das, see at least Fig. 4, par [0108], “Updating a track may comprise associating one of the estimated object detections with the track, indicating that the track is associated with an object that is likely partially or fully occluded (e.g., an occluded status flag may be altered), and/or retiring the track. Retiring the track may comprise indicating that an object associated with the track has been occluded for at least a threshold amount of time, the object is likely no longer within a field of view, and/or deleting the track”);
responsive to receiving measurement data in an iteration after the first detection of the object (Das, see at least Fig. 4, par. [0091], step 402, responsive to receiving measurement data from the object detection 404/410):
determine whether the measurement data can be associated with the object (Das, see at least par. [0035], determine whether the measurement data can be associated with the object based on occlusion status, e.g., whether the object is currently/previously occluded partially or totally from one or more sensors);
responsive to determining the measurement data can be associated with the object as a subsequent detection of the object (Das, see at least Fig. 3, par. [0085], “As used herein an object detection may be a portion of one or more environment representations that indicate the existence of an object, such as an ROI, a positive occupancy indication, an object classification, etc.”):
determine that the first parameter is assigned to the object (Das, see at least Fig. 4, par [0093], “At operation 416, example process 400 may comprise receiving a track associated with an object in an environment, according to any of the techniques discussed herein”);
re-initialize the state of the object in response to determining that the first parameter is assigned to the object (Das, see at least Fig. 4, par [0095], “At operation 422, example process 400 may comprise inputting the first object detection, the second object detection, and/or the track into an ML model, according to any of the techniques discussed herein”);
update the state of the object in response to determining the second parameter is assigned to the object (Das, see at least Fig. 4, par [0108], “At operation 432, example process 400 may comprise determining an updated (432) […] associated with the object based at least in part on the estimated object detection, according to any of the techniques discussed herein […] Updating a track may comprise associating one of the estimated object detections with the track, indicating that the track is associated with an object that is likely partially or fully occluded (e.g., an occluded status flag may be altered), and/or retiring the track. Retiring the track may comprise indicating that an object associated with the track has been occluded for at least a threshold amount of time, the object is likely no longer within a field of view, and/or deleting the track”);
responsive to determining the measurement data is not associated with the object as a subsequent detection of the object (Das, see at least Fig. 4, par [0108], “Once all the tracks have been updated any remaining estimated object detections that have not been associated with a track may be passed to an alternate tracking component and/or new track(s) may be generated in association therewith”):
assign a third parameter to the object, the third parameter indicating the object status of the object as a third status (Das, see at least Fig. 4, par [0108], “Once all the tracks have been updated any remaining estimated object detections that have not been associated with a track may be passed to an alternate tracking component and/or new track(s) may be generated in association therewith”);
predict an update of the state of the object assuming a continuous behavior of the object from a previous state of the object (Das, see at least Fig. 5, par. [0117], “At operation 512, example process 500 may comprise generating a new track associated with the estimated object detection and/or providing the first object detection, the second object detection, and/or the estimated object detection to an alternate tracking component, according to any of the techniques discussed herein. Generating a new track indicates that the estimated object detection is associated with an object that was not previously detected by any of the perception pipelines and/or was not within a field of view of the sensors associated therewith. In an additional or alternate example, the raw object detections from the one or more pipelines and/or the track 418 may be provided as input to an alternate tracking component. In some examples, the alternate tracking component may be configured to determine a new track or update a former track based at least in part on comparing object detections from multiple perception pipelines”); and
control an autonomous or partially-autonomous operation of a vehicle based on the state of the object (Das, see at least Fig. 1, par. [0010, 0026], the controller 116 is configured to operate the vehicle based on predicting motion/behavior of the object and to determine a trajectory and/or path for controlling the autonomous vehicle 102);
wherein tracking the object with the first parameter assigned is computationally more expensive than tracking the object with the second parameter assigned (Das, see at least par. [0089], “In some examples, the contingent tracking component may be a tracking component configured to receive the raw environment representations from the pipeline(s) and determine a track therefrom. The contingent tracking component may, in some cases, require more compute and/or memory since the contingent tracking component uses more data between the different pipelines to determine whether an object detection is a false positive and/or whether to associate an object detection with a former track or a new track”).
Regarding claim 14, Das teaches all the limitations of claim 13 as discussed above. Das further teaches the system comprising a vehicle that comprises at least a portion of the computer components of the computer system (Das, see at least Figs. 1, 2, par. [0039], “The vehicle 202 may include a vehicle computing device(s) 204, sensor(s) 206, emitter(s) 208, network interface(s) 210, and/or drive component(s) 212. Vehicle computing device(s) 204 may represent computing device(s) 106 and sensor(s) 206 may represent sensor(s) 104. The system 200 may additionally or alternatively comprise computing device(s) 214”).
Regarding claim 15, Das discloses A non-transitory computer readable medium comprising instructions, that, when executed, configure computer components of a system (Das, see at least Figs. 1, 2, par. [0046-0047], “The processor(s) 218 and/or 222 may be any suitable processor capable of executing instructions to process data and perform operations as described herein […] Memory 220 and/or 224 may be examples of non-transitory computer-readable media. The memory 220 and/or 224 may store an operating system and one or more software applications, instructions, programs, and/or data to implement the methods described herein and the functions attributed to the various systems”) to:
track an object (Das, see at least Fig. 1, par. [0091], tracking object 138 by sensor(s) 104 of vehicle 102) by at least:
responsive to receiving a first detection of the object (Das, see at least Fig. 4, par. [0024], “At operation 402, example process 400 may comprise receiving a first object detection associated with a first sensor type and a second object detection associated with a second sensor type, according to any of the techniques discussed herein”):
assign a first parameter to the object, the first parameter indicating an object status of the object as a first status (Das, see at least Fig. 4, par [0093], “At operation 416, example process 400 may comprise receiving a track associated with an object in an environment, according to any of the techniques discussed herein”);
predict a state of the object (Das, see at least Fig. 4, par [0095, 0097], “At operation 424, example process 400 may comprise receiving, as output from the ML model, an estimated object detection 426, according to any of the techniques discussed herein”);
determine whether a second parameter is to be assigned to the object, the second parameter indicating the object status of the object as a second status and no longer the first status (Das, see at least Fig. 4, par [0095, 0097], “At operation 424, example process 400 may comprise receiving, as output from the ML model, an estimated object detection 426, according to any of the techniques discussed herein”), the object status of the object being assigned based on a time since the object status of “young” is assigned to the object, or an availability of computational resources for tracking the object (Das, see at least Fig. 4, par [0108], “Updating a track may comprise associating one of the estimated object detections with the track, indicating that the track is associated with an object that is likely partially or fully occluded (e.g., an occluded status flag may be altered), and/or retiring the track. Retiring the track may comprise indicating that an object associated with the track has been occluded for at least a threshold amount of time, the object is likely no longer within a field of view, and/or deleting the track”);
responsive to receiving measurement data in an iteration (Das, see at least Fig. 4, par. [0091], step 402, responsive to receiving measurement data from the object detection 404/410):
determine whether the measurement data can be associated with the object (Das, see at least par. [0035], determine whether the measurement data can be associated with the object based on occlusion status, e.g., whether the object is currently/previously occluded partially or totally from one or more sensors);
responsive to determining the measurement data can be associated with the object as a subsequent detection of the object (Das, see at least Fig. 3, par. [0085], “As used herein an object detection may be a portion of one or more environment representations that indicate the existence of an object, such as an ROI, a positive occupancy indication, an object classification, etc.”):
determine that the first parameter is assigned to the object (Das, see at least Fig. 4, par [0093], “At operation 416, example process 400 may comprise receiving a track associated with an object in an environment, according to any of the techniques discussed herein”);
re-initialize the state of the object in response to determining that the first parameter is assigned to the object (Das, see at least Fig. 4, par [0095], “At operation 422, example process 400 may comprise inputting the first object detection, the second object detection, and/or the track into an ML model, according to any of the techniques discussed herein”);
update the state of the object in response to determining the second parameter is assigned to the object (Das, see at least Fig. 4, par [0108], “At operation 432, example process 400 may comprise determining an updated (432) […] associated with the object based at least in part on the estimated object detection, according to any of the techniques discussed herein […] Updating a track may comprise associating one of the estimated object detections with the track, indicating that the track is associated with an object that is likely partially or fully occluded (e.g., an occluded status flag may be altered), and/or retiring the track. Retiring the track may comprise indicating that an object associated with the track has been occluded for at least a threshold amount of time, the object is likely no longer within a field of view, and/or deleting the track”);
responsive to determining the measurement data cannot be associated with the object as a subsequent detection of the object (Das, see at least Fig. 4, par [0108], “Once all the tracks have been updated any remaining estimated object detections that have not been associated with a track may be passed to an alternate tracking component and/or new track(s) may be generated in association therewith”):
assign a third parameter to the object, the third parameter indicating the object status of the object as a third status (Das, see at least Fig. 4, par [0108], “Once all the tracks have been updated any remaining estimated object detections that have not been associated with a track may be passed to an alternate tracking component and/or new track(s) may be generated in association therewith”);
predict an update of the state of the object assuming a continuous behavior of the object from a previous state of the object (Das, see at least Fig. 5, par. [0117], “At operation 512, example process 500 may comprise generating a new track associated with the estimated object detection and/or providing the first object detection, the second object detection, and/or the estimated object detection to an alternate tracking component, according to any of the techniques discussed herein. Generating a new track indicates that the estimated object detection is associated with an object that was not previously detected by any of the perception pipelines and/or was not within a field of view of the sensors associated therewith. In an additional or alternate example, the raw object detections from the one or more pipelines and/or the track 418 may be provided as input to an alternate tracking component. In some examples, the alternate tracking component may be configured to determine a new track or update a former track based at least in part on comparing object detections from multiple perception pipelines”); and
at least partially-autonomously operating a vehicle based on the state of the object (Das, see at least Fig. 1, par. [0010, 0026], the controller 116 is configured to operate the vehicle based on predicting motion/behavior of the object and to determine a trajectory and/or path for controlling the autonomous vehicle 102);
wherein tracking the object with the first parameter assigned is computationally more expensive than tracking the object with the second parameter assigned (Das, see at least par. [0089], “In some examples, the contingent tracking component may be a tracking component configured to receive the raw environment representations from the pipeline(s) and determine a track therefrom. The contingent tracking component may, in some cases, require more compute and/or memory since the contingent tracking component uses more data between the different pipelines to determine whether an object detection is a false positive and/or whether to associate an object detection with a former track or a new track”).
Regarding claim 16, Das teaches all the limitations of claim 15 as discussed above. Das further teaches wherein the state of the object includes at least one of a linear velocity, an angular velocity, a heading angle, a height, a speed, or an acceleration of the object (Das, see at least Fig. 1, par. [0035], “In some examples, the track 136 may associate one or more previous object detections and/or may indicate data related thereto, such as a velocity, acceleration, heading, object classification, unique identifier, occlusion status (e.g., whether the object is currently/previously occluded partially or totally from one or more sensors)”).
Regarding claim 17, Das teaches all the limitations of claim 15 as discussed above. Das further teaches wherein the instructions, when executed, further configure the computer components to re-initialize the state of the object based on measurements associated with the object (Das, see at least Fig. 4, par [0095], “At operation 422, example process 400 may comprise inputting the first object detection, the second object detection, and/or the track into an ML model, according to any of the techniques discussed herein”; par. [0096], “Note that the object detections may be provided to the ML model as part of an environment representation. For example, environment representation 406 comprises multiple object detections and unillustrated data, such as object velocities, estimated heights, and more, as discussed above. In some examples, the environment representations may be aggregated and provided to the ML model as input. In some examples, an object detection may be isolated from the rest of the environment representation and provided as input. For example, the environment representations may be in a common reference frame or may be converted to a common reference frame during the aggregation. The pipeline(s) may be configured to output positive object detections along with their coordinates in the common reference frame. For example, these positive object detections may be portions of the environment representation associated with a likelihood that meets or exceeds a threshold confidence. Each and any of the object detection components discussed above may be associated with a regressed confidence score. For example, an object classification may be associated with a confidence score, an ROI may be determined based at least in part on confidence scores associated with different pixels via a non-maximum suppression technique, occupancy may be determined based at least in part on a likelihood associated with each pixel and determined by an ML model of a respective pipeline, and so on”).
Regarding claim 18, Das teaches all the limitations of claim 15 as discussed above. Das further teaches wherein the instructions, when executed, further configure the computer components to determine whether the second parameter is to be assigned to the object by determining whether a first condition is fulfilled based on a variance of the state of the object (Das, see at least Figs. 4, 5, par. [0020, 0109-0110], determining whether a degree of association between the estimated object detection and the projected region of interest satisfies a threshold degree of association, such as an object classification, predicted velocity, predicted orientation, predicted height, predicted position, and/or predicted embedding).
Regarding claim 19, Das teaches all the limitations of claims 15 and 18 as discussed above. Das further teaches wherein the instructions, when executed, further configure the computer components to determine whether the second parameter is to be assigned to the object by: responsive to determining the first condition is fulfilled, assigning the second parameter to the object (Das, see at least Fig. 5, par. [0114, 0115], “At operation 508, example process 500 may comprise determining whether the degree of association satisfies a threshold degree of association, according to any of the techniques discussed herein. A degree of association that satisfies the threshold degree of association indicates that the object associated with estimated object detection 426 is/is likely the same object as indicated by track 418. If the degree of association satisfies the threshold degree of association, example process 500 may continue to operation 510”; par. [0116], “At operation 510, example process 500 may comprise associating the estimated object detection 426 with the track 418 as an updated track 430, according to any of the techniques discussed herein”).
Regarding claim 20, Das teaches all the limitations of claims 15 and 18 as discussed above. Das further teaches wherein the instructions, when executed, further configure the computer components to determine whether the second parameter is to be assigned to the object by: responsive to determining the first condition is not fulfilled, determining whether a second condition is fulfilled (Das, see at least par. [0021], “However, if the degree of association does not meet the threshold, the techniques may comprise testing the estimated object detection with any other projected ROIs (e.g., other ROIs that overlap the estimated object detection or are within a threshold distance)”).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Janssens (US 20230014601 A1) teaches systems and methods for tracking objects through a traffic control system, including a plurality of sensors configured to capture data associated with a traffic location and a logic device configured to detect one or more objects in the captured data, determine an object location within the captured data, transform each object location to world coordinates associated with one of the plurality of sensors, and track each object location using the world coordinates using prediction and occlusion-based processes.
Kiiski (US 12128887 B1) teaches a vehicle computing system may implement techniques to determine relevance of objects detected in an environment to a vehicle operating in the environment.
Oami et al. (US 12380699 B2) teaches an object tracking apparatus, method and computer-readable medium for detecting an object from output information of sensors, tracking the object on a basis of a plurality of detection results, generating tracking information of the object represented in a common coordinate system, outputting the tracking information, and detecting the object on a basis of the tracking information.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRANG DANG whose telephone number is (703)756-1049. The examiner can normally be reached Monday-Friday 8:00-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached at (571)272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TRANG DANG/ Examiner, Art Unit 3656 /KHOI H TRAN/Supervisory Patent Examiner, Art Unit 3656