Prosecution Insights
Last updated: April 19, 2026
Application No. 18/823,155

VEHICLE CONTROL DEVICE AND VEHICLE CONTROL METHOD

Status: Non-Final Office Action (§103)
Filed: Sep 03, 2024
Examiner: AFRIN, NAZIA
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Kia Corporation
OA Round: 1 (Non-Final)

Grant Probability: 40% (Moderate)
Predicted OA Rounds: 1-2
Predicted Time to Grant: 3y 2m
Grant Probability with Interview: 57%

Examiner Intelligence

Career Allow Rate: 40% (4 granted / 10 resolved; -12.0% vs TC avg)
Interview Lift: +16.7% (allowance rate of resolved cases with an interview vs. without)
Avg Prosecution: 3y 2m (typical timeline)
Career History: 73 total applications across all art units; 63 currently pending

Statute-Specific Performance

§101: 11.8% (-28.2% vs TC avg)
§103: 60.7% (+20.7% vs TC avg)
§102: 21.1% (-18.9% vs TC avg)
§112: 6.4% (-33.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 10 resolved cases.
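The dashboard figures above are simple ratios and deltas. As a minimal sketch of that arithmetic (the analytics methodology is not disclosed, so the function names are illustrative and the Tech Center average is only the value implied by the stated -12.0% delta):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def delta_vs_avg(rate: float, tc_avg: float) -> float:
    """Signed difference between an examiner's rate and the TC average."""
    return rate - tc_avg

career = allow_rate(4, 10)                 # 4 granted / 10 resolved -> 40.0
tc_avg = 52.0                              # implied: 40% rate with a -12.0% delta
print(f"Career Allow Rate: {career:.0f}%")
print(f"vs TC avg: {delta_vs_avg(career, tc_avg):+.1f}%")
print(f"With interview: {career + 16.7:.0f}%")   # 40% + 16.7% lift, rounds to 57%
```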

Office Action

§103

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 5-9, 11-13, and 15-19 are rejected under 35 U.S.C. 103 as being unpatentable over US 20220179076 A1 to Noh et al. (hereinafter "Noh") in view of KR 102477510 B1 to Kim et al. (hereinafter "Kim") and CN 109740274 A to Cao et al. (hereinafter "Cao").

Regarding claim 1, Noh teaches a vehicle control apparatus comprising: a Light Detection and Ranging (LiDAR) (see Noh at least para [0063]: the object-tracking apparatus 100 shown in FIG. 1 may include a light detection and ranging (LiDAR) sensor 110, a preprocessing unit 120, a clustering unit 130, and a shape analysis unit 140); a memory configured to store map information; and a processor operably connected to the LiDAR and the memory (see Noh para [0060]: each of these units may separately embody or be included with a processor and a memory), wherein the processor is configured to: filter datasets including a road edge portion associated with a position of a vehicle from the map information according to a first predetermined condition related to at least one of a distance, or an angle, or any combination thereof (see Noh at least para [0065]: the preprocessing unit 120 may convert the LiDAR data into data suitable for the reference coordinate system in consideration of the positional angle at which the LiDAR sensor 110 is mounted to the host vehicle; it may also perform filtering to remove points having low intensity or reflectance using intensity or confidence information of the LiDAR data, and may remove data reflected from the host vehicle, since there is a region that is shielded by the body of the host vehicle depending on the mounting position and the field of view of the LiDAR sensor 110); obtain a plurality of first partial line segments by dividing a line segment included in at least one of the datasets and corresponding to the road edge portion based on a predetermined length, in the map information (see Noh claim 1 (a2): assigning a shape flag to the mth layer using at least one of a first line segment connecting the first end point and the break point); and identify a plurality of second partial line segments from a plurality of line segments formed by contour points corresponding to the road edge portion through the LiDAR according to a second predetermined condition related to at least one of a distance, an angle, or a height, or any combination thereof (see Noh, second line segment, a predetermined priority, at least para [0130]-[0131]).

However, Noh does not expressly disclose or otherwise teach: identify pairs of the plurality of second partial line segments and the plurality of first partial line segments that are respectively closest to the plurality of second partial line segments in a vehicle coordinate system represented using the vehicle as a center thereof; output a position of the vehicle in a second coordinate system different from a first coordinate system by use of a calibration amount related to at least one of a lateral movement of the vehicle based on the first coordinate system, or an amount of change in heading of the vehicle, or any combination thereof based on applying a predetermined algorithm to each of the pairs; and control the vehicle based on the position of the vehicle in the second coordinate system.
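The claim 1 limitations above describe two geometric operations: splitting a map road-edge polyline into partial segments of a predetermined length, and pairing each LiDAR-derived segment with its closest map segment in the vehicle frame. A minimal sketch of that kind of processing follows; it is illustrative only (not the applicant's or Noh's actual implementation), and all names are invented for the example:

```python
import math

def split_polyline(points, seg_len):
    """Divide a road-edge polyline into partial segments of roughly
    seg_len by accumulating distance along consecutive vertices."""
    segments, start, acc = [], points[0], 0.0
    for a, b in zip(points, points[1:]):
        acc += math.dist(a, b)
        if acc >= seg_len:
            segments.append((start, b))
            start, acc = b, 0.0
    if acc > 0:                       # keep any leftover partial segment
        segments.append((start, points[-1]))
    return segments

def midpoint(seg):
    (x1, y1), (x2, y2) = seg
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def closest_pairs(lidar_segs, map_segs):
    """Pair each LiDAR segment with the nearest map segment, using
    midpoint distance in the vehicle-centered coordinate frame."""
    return [
        (ls, min(map_segs, key=lambda ms: math.dist(midpoint(ls), midpoint(ms))))
        for ls in lidar_segs
    ]
```

In practice the pairing would also apply the claimed distance/angle/height thresholds before accepting a pair; the sketch shows only the nearest-segment association.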
Nevertheless, in a related field of invention, Kim, in the same field of endeavor, teaches: identify pairs of the plurality of second partial line segments and the plurality of first partial line segments that are respectively closest to the plurality of second partial line segments in a vehicle coordinate system represented using the vehicle as a center thereof (see Kim para [0065]: the calibration device (200) can generate correction data (i.e., calibration data) for correcting the positions of captured markers (CMK) in the image generated by the camera (110) to the corrected reference marker positions; for example, it can generate calibration data that positions captured markers (CMK) within a predetermined range from the calibrated reference marker positions, or calibration data that converts the positions of captured markers (CMK) to the positions closest to the calibrated reference marker positions).

It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to combine Noh's method of analyzing the shape of an object using a LiDAR sensor with Kim's calibrated/corrected data close to the reference marker position (line segment) in order to provide an accurate method for aligning images captured by cameras of a vehicle (see Kim para [0004]).

However, Noh does not expressly disclose or otherwise teach: output a position of the vehicle in a second coordinate system different from a first coordinate system by use of a calibration amount related to at least one of a lateral movement of the vehicle based on the first coordinate system, or an amount of change in heading of the vehicle, or any combination thereof based on applying a predetermined algorithm to each of the pairs; and control the vehicle based on the position of the vehicle in the second coordinate system. Nevertheless, in a related field of invention, Cao, in the same field of endeavor, teaches these limitations (see Cao at least para [0019]: calculate the coordinate transformation parameters from the local coordinate system to the national geodetic coordinate system based on the first and second geodetic coordinates; at least para [0059]: the calibration model building submodule is used to build a calibration model of the first road data based on the first road data and the second plane coordinates; see "control elements"; para [0035]: based on the second field coordinates and the coordinates of the conjugate point, calculate the coordinate deviation of the road markings at the preset field point; para [0036]: based on the coordinate deviation, a correction model is established in the national geodetic coordinate system).
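The "calibration amount" limitation mapped to Cao above combines a lateral offset and a heading change estimated from the segment pairs, and claim 10 later names ICP as one form of the "predetermined algorithm." For orientation only, a single ICP-style alignment step over matched segment midpoints (closed-form 2D rigid registration, then applying the recovered heading change and translation to the vehicle position) could be sketched as below; this is an assumption-laden illustration, not the claimed method:

```python
import math

def align_2d(src, dst):
    """One ICP-style step: closed-form 2D rigid transform (dtheta, tx, ty)
    that best maps matched source points onto destination points."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    s_cross = s_dot = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy          # source point, centered
        bx, by = dx - cdx, dy - cdy          # destination point, centered
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    dtheta = math.atan2(s_cross, s_dot)      # heading-change calibration amount
    c, s = math.cos(dtheta), math.sin(dtheta)
    tx = cdx - (c * csx - s * csy)           # translation calibration amounts
    ty = cdy - (s * csx + c * csy)
    return dtheta, tx, ty

def apply_calibration(pos, dtheta, tx, ty):
    """Output the vehicle position in the second coordinate system."""
    x, y = pos
    c, s = math.cos(dtheta), math.sin(dtheta)
    return (c * x - s * y + tx, s * x + c * y + ty)
```

A full ICP would re-associate pairs and repeat this step until convergence; the closed form shown is the standard least-squares solution for matched 2D point sets.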
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to combine Noh's method of analyzing the shape of an object using a LiDAR sensor with Cao's calibration model based on first and second coordinates in order to generate a high-precision road map (see Cao para [0008]).

Regarding claim 2, Noh, Kim and Cao remain applied as in claim 1. Noh teaches wherein the processor is further configured to identify at least one of the plurality of first partial line segments, the plurality of second partial line segments, or the pairs, or any combination thereof, in a plane formed by a first axis and a second axis, among the first axis, the second axis, and a third axis (see Noh figure 5).

Regarding claim 3, Noh, Kim and Cao remain applied as in claim 1. Noh teaches wherein the processor is further configured to: identify the plurality of second partial line segments in each of layers separated by a third axis, among a first axis, a second axis, and the third axis; and identify first sub-pairs included in each of the layers among the pairs, wherein the first sub-pairs include the plurality of first partial line segments that are respectively closest to the plurality of second partial line segments included in each of the layers (see Noh at least para [0083]: the LiDAR points related to one target object may be divided into M layers, i.e. the first to Mth layers, in a vertical direction (e.g. the z-axis direction)).

Regarding claim 5, Noh, Kim and Cao remain applied as in claim 1. Noh teaches wherein the processor is further configured to select a portion of the plurality of second partial line segments in a plurality of layers separated by a third axis, among a first axis, a second axis, and the third axis, based on a type of construction of the map information (see Noh para [0170]: when two target objects 740 and 742, which are separated from each other in a first frame t as shown in FIG. 22(a), are clustered into one object in a second frame t+1 subsequent to the first frame t as shown in FIG. 22(b)).

Regarding claim 6, Noh, Kim and Cao remain applied as in claim 1. Noh teaches wherein the processor is further configured to: identify a portion of the plurality of second partial line segments identified in a first reference number of layers located at a top of the plurality of layers based on a type of construction of the map information being a top line construction type; and select a portion of the plurality of second partial line segments that are closest to a construction height of the map information, among the portion of the plurality of second partial line segments identified in the first reference number of layers (see Noh at least para [0163]: in the comparative example, the break point BP located farthest from a baseline 800 is searched for; thereafter, the heading direction of an object is determined on the basis of the longer one of the first line segment L1 and the second line segment L2; thereafter, among the four corner points of a bounding box 802, the corner point 804 located closest to the break point BP is determined to be a corner point having L-shape characteristics).

Regarding claim 7, Noh, Kim and Cao remain applied as in claim 1. Noh teaches wherein the top line construction type is a construction type in which the map information is generated based on a portion of the road edge portion identified at a highest height with respect to the third axis (see Noh at least para [0168]: according to an embodiment of the present disclosure, the distribution of the LiDAR points (e.g. the inner LiDAR points and the outer LiDAR points described above) located near the first and second line segments L1 and L2 is numerically calculated, and the shape flag is assigned to each of the M layers resulting from the division in the height direction (i.e. the z-axis direction) of the target vehicle).

Regarding claim 8, Noh, Kim and Cao remain applied as in claim 1. Noh teaches wherein the processor is further configured to identify a portion of the plurality of second partial line segments identified in a second reference number of layers located at a bottom of the plurality of layers based on a type of construction of the map information being a bottom line construction type (see Noh at least para [0083] and para [0163], as quoted above).

Regarding claim 9, Noh, Kim and Cao remain applied as in claim 1. Noh teaches wherein the bottom line construction type is a construction type in which the map information is generated based on a portion of the road edge portion identified at a lowest height with respect to the third axis (see Noh figure 19, first line segment at the x-axis, road edge).

Regarding claim 11, Noh teaches a vehicle control method including: filtering, by a processor, datasets including a road edge portion associated with a position of a vehicle from map information, according to a first predetermined condition related to at least one of a distance, or an angle, or any combination thereof (see Noh at least para [0065], as quoted for claim 1); obtaining, by the processor, a plurality of first partial line segments by dividing a line segment included in at least one of the datasets and corresponding to the road edge portion based on a predetermined length, in the map information (see Noh claim 1 (a2)); and identifying, by the processor, a plurality of second partial line segments from a plurality of line segments formed by contour points corresponding to the road edge portion through a Light Detection and Ranging (LiDAR) according to a second predetermined condition related to at least one of a distance, an angle, or a height, or any combination thereof (see Noh at least para [0063] and para [0060], as quoted for claim 1).

However, Noh does not expressly disclose or otherwise teach identifying, by the processor, pairs of the plurality of second partial line segments and the plurality of first partial line segments that are respectively closest to the plurality of second partial line segments in a vehicle coordinate system represented using the vehicle as a center thereof. Nevertheless, in a related field of invention, Kim, in the same field of endeavor, teaches this limitation (see Kim para [0065], as quoted for claim 1).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to combine Noh's method of analyzing the shape of an object using a LiDAR sensor with Kim's calibrated/corrected data close to the reference marker position (line segment) in order to provide an accurate method for aligning images captured by cameras of a vehicle (see Kim para [0004]).

However, Noh does not expressly disclose or otherwise teach: outputting, by the processor, a position of the vehicle in a second coordinate system different from a first coordinate system by use of a calibration amount related to at least one of a lateral movement of the vehicle based on the first coordinate system, or an amount of change in heading of the vehicle, or any combination thereof based on applying a predetermined algorithm to each of the pairs; and controlling, by the processor, the vehicle based on the position of the vehicle in the second coordinate system. Nevertheless, in a related field of invention, Cao, in the same field of endeavor, teaches these limitations (see Cao at least para [0019], para [0059], para [0035], and para [0036], as quoted for claim 1). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to combine Noh's method with Cao's calibration model based on first and second coordinates in order to generate a high-precision road map (see Cao para [0008]).

Regarding claim 12, Noh, Kim and Cao remain applied as in claim 11. Noh teaches identifying, by the processor, at least one of the plurality of first partial line segments, the plurality of second partial line segments, or the pairs, or any combination thereof, in a plane formed by a first axis and a second axis, among the first axis, the second axis, and a third axis (see Noh figure 5).

Regarding claim 13, Noh, Kim and Cao remain applied as in claim 11.
Noh teaches: identifying, by the processor, the plurality of second partial line segments in each of layers separated by a third axis, among a first axis, a second axis, and the third axis; and identifying, by the processor, first sub-pairs included in each of the layers among the pairs, wherein the first sub-pairs include the plurality of first partial line segments that are respectively closest to the plurality of second partial line segments included in each of the layers (see Noh at least para [0083], as quoted for claim 3).

Regarding claim 15, Noh, Kim and Cao remain applied as in claim 11. Noh teaches selecting, by the processor, a portion of the plurality of second partial line segments in a plurality of layers separated by a third axis, among a first axis, a second axis, and the third axis, based on a type of construction of the map information (see Noh para [0170], as quoted for claim 5).

Regarding claim 16, Noh, Kim and Cao remain applied as in claim 11. Noh teaches identifying, by the processor, a portion of the plurality of second partial line segments identified in a first reference number of layers located at a top of the plurality of layers based on the type of construction of the map information being a top line construction type; and selecting, by the processor, a portion of the plurality of second partial line segments that are closest to a construction height of the map information, among the portion of the plurality of second partial line segments identified in the first reference number of layers (see Noh at least para [0163], as quoted for claim 6).

Regarding claim 17, Noh, Kim and Cao remain applied as in claim 11. Noh teaches the top line construction type is a construction type in which the map information is generated based on a portion of the road edge portion identified at a highest height with respect to the third axis (see Noh at least para [0168], as quoted for claim 7).

Regarding claim 18, Noh, Kim and Cao remain applied as in claim 11. Noh teaches identifying, by the processor, a portion of the plurality of second partial line segments identified in a second reference number of layers located at a bottom of the plurality of layers based on a type of construction of the map information being a bottom line construction type (see Noh at least para [0083] and para [0163]).

Regarding claim 19, Noh, Kim and Cao remain applied as in claim 11. Noh teaches the bottom line construction type is a construction type in which the map information is generated based on a portion of the road edge portion identified at a lowest height with respect to the third axis (see Noh figure 19, first line segment at the x-axis, road edge).

Claims 4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over US 20220179076 A1 to Noh et al. (hereinafter "Noh") in view of KR 102477510 B1 to Kim et al. (hereinafter "Kim"), CN 109740274 A to Cao et al. (hereinafter "Cao") and JP 2018205450 A to Hibino et al. (hereinafter "Hibino").

Regarding claim 4, Noh, Kim and Cao remain applied as in claim 1. However, Noh does not expressly disclose or otherwise teach wherein the processor is further configured to: identify first identifiers assigned to the plurality of first partial line segments respectively; identify second identifiers assigned to the plurality of second partial line segments respectively; identify second sub-pairs in which a distance between the plurality of first partial line segments and the plurality of second partial line segments is less than a predetermined distance, among the pairs; and sequentially arrange at least one of the first identifiers or the second identifiers included in the identified second sub-pairs, or any combination thereof.
Nevertheless, in a related field of invention, Hibino, in the same field of endeavor, teaches: identify first identifiers assigned to the plurality of first partial line segments respectively; identify second identifiers assigned to the plurality of second partial line segments respectively; identify second sub-pairs in which a distance between the plurality of first partial line segments and the plurality of second partial line segments is less than a predetermined distance, among the pairs; and sequentially arrange at least one of the first identifiers or the second identifiers included in the identified second sub-pairs, or any combination thereof (see Hibino, first and second identifiers, at least para [0110]: if the intersections E1 and E2 match, or the distance between the intersections E1 and E2 is less than a predetermined distance (for example, 50 centimeters), the sameness determining unit 107 determines that the two are from the same user). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to combine Noh's method of analyzing the shape of an object using a LiDAR sensor with Hibino's first and second identifiers of line segments, and the criterion of a line segment distance less than a predetermined distance, in order to allow a return to the normal state simply by detecting that the user has come within a certain distance of the image forming apparatus (see Hibino para [0008]).

Regarding claim 14, Noh, Kim and Cao remain applied as in claim 11. However, Noh does not expressly disclose or otherwise teach the corresponding method steps of: identifying first identifiers assigned to the plurality of first partial line segments respectively; identifying second identifiers assigned to the plurality of second partial line segments respectively; identifying second sub-pairs in which a distance between the plurality of first partial line segments and the plurality of second partial line segments is less than a predetermined distance, among the pairs; and sequentially arranging at least one of the first identifiers or the second identifiers included in the identified second sub-pairs, or any combination thereof. Nevertheless, in a related field of invention, Hibino teaches these limitations (see Hibino at least para [0110], as quoted for claim 4).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to combine Noh's method of analyzing the shape of an object using a LiDAR sensor with Hibino's first and second identifiers of line segments, and the criterion of a line segment distance less than a predetermined distance, in order to allow a return to the normal state simply by detecting that the user has come within a certain distance of the image forming apparatus (see Hibino para [0008]).

Claims 10 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over US 20220179076 A1 to Noh et al. (hereinafter "Noh") in view of KR 102477510 B1 to Kim et al. (hereinafter "Kim"), CN 109740274 A to Cao et al. (hereinafter "Cao") and US 20220146676 A1 to Armstrong-Crews et al. (hereinafter "Armstrong-Crews").

Regarding claim 10, Noh, Kim and Cao remain applied as in claim 1. However, Noh does not expressly disclose or otherwise teach wherein the predetermined algorithm includes at least one of an iterative closest point (ICP) algorithm, or a simultaneous localization and mapping (SLAM) algorithm, or any combination thereof. Nevertheless, in a related field of invention, Armstrong-Crews, in the same field of endeavor, teaches this limitation (see Armstrong-Crews para [0076]: at block 524, method 500 can continue with the computing device mapping the first set of points to the second set of points; for example, mapping the first set of points to the second set of points can include using the iterative closest point algorithm; see "predetermined metric"; para [0019]: various algorithms can be used for finding the optimal geometric transform, such as iterative closest point (ICP) algorithms that are capable of identifying the optimal transform using a series of steps (iterations) of a gradually improving convergence; the conventional ICP (or other mapping) algorithms are based on point mapping in coordinate space using angular (or linear lateral) coordinates and longitudinal (or radial) range (distance) values and are subject to a number of shortcomings, and in particular, selecting smaller or larger time increments Δt has respective shortcomings).

It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to combine Noh's method of analyzing the shape of an object using a LiDAR sensor with Armstrong-Crews's iterative closest point (ICP) algorithm and predetermined metric in order to improve autonomous driving systems and components using velocity sensing data to assist in detection, identification, and tracking of objects encountered in autonomous driving environments (see Armstrong-Crews para [0001]).

Regarding claim 20, Noh, Kim and Cao remain applied as in claim 11. However, Noh does not expressly disclose or otherwise teach wherein the predetermined algorithm includes at least one of an iterative closest point (ICP) algorithm, or a simultaneous localization and mapping (SLAM) algorithm, or any combination thereof. Nevertheless, in a related field of invention, Armstrong-Crews teaches this limitation (see Armstrong-Crews para [0076] and para [0019], as quoted for claim 10). It would have been obvious to one having ordinary skill in the art, for the same reasons given for claim 10, to combine Noh's method with Armstrong-Crews's ICP algorithm and predetermined metric (see Armstrong-Crews para [0001]).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NAZIA AFRIN, whose telephone number is (703) 756-1175. The examiner can normally be reached Monday-Friday, 7:30-6. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott A Browne can be reached at 5712700151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /NAZIA AFRIN/ Examiner, Art Unit 3666 /SCOTT A BROWNE/ Supervisory Patent Examiner, Art Unit 3666
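For context on the cited teaching: the ICP procedure Armstrong-Crews's paragraph [0019] describes (matching two point sets, then refining a rigid transform over gradually converging iterations) can be sketched in a few lines. The sketch below is illustrative only; it is not from the application or any cited reference, and the names `best_rigid_transform` and `icp` are invented for this example. It assumes NumPy and uses the standard SVD (Kabsch) solution for the per-iteration rigid fit.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def icp(src, dst, iters=50, tol=1e-8):
    """Align src to dst: match nearest neighbors, solve a rigid fit, repeat."""
    cur, prev_err = src.copy(), np.inf
    for _ in range(iters):
        # brute-force nearest-neighbor correspondences (k-d tree in practice)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        nn = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, nn)
        cur = cur @ R.T + t
        err = np.sqrt(d2.min(axis=1)).mean()   # error before this update
        if abs(prev_err - err) < tol:          # "gradually improving convergence"
            break
        prev_err = err
    return cur
```

Real LiDAR pipelines replace the brute-force matching with a spatial index and add outlier rejection; the brute-force version keeps the sketch short while showing the iterate-match-solve structure the reference relies on.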

Prosecution Timeline

Sep 03, 2024
Application Filed
Jan 20, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600603
CRANE, CRANE CHARACTERISTIC CHANGE DETERMINATION DEVICE, AND CRANE CHARACTERISTIC CHANGE DETERMINATION SYSTEM
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12585271
ACTIVE GEOFENCING SYSTEM AND METHOD FOR SEAMLESS AIRCRAFT OPERATIONS IN ALLOWABLE AIRSPACE REGIONS
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12560927
NAVIGATION METHOD AND ROBOT THEREOF
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 3 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
40%
Grant Probability
57%
With Interview (+16.7%)
3y 2m
Median Time to Grant
Low
PTA Risk
Based on 10 resolved cases by this examiner. Grant probability derived from career allow rate.
