Prosecution Insights
Last updated: April 19, 2026
Application No. 18/259,229

LIDAR SYSTEM WITH AUTOMATIC YAW CORRECTION

Non-Final OA: §102, §103

Filed: Jun 23, 2023
Examiner: HODGES, SUSAN E
Art Unit: 2425
Tech Center: 2400 — Computer Networks
Assignee: Innoviz Technologies Ltd.
OA Round: 1 (Non-Final)

Grant Probability: 67% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
Grant Probability With Interview: 81%

Examiner Intelligence

Career Allow Rate: 67% (above average; 250 granted / 375 resolved; +8.7% vs TC avg)
Interview Lift: +14.4% (moderate; resolved cases with vs. without interview)
Avg Prosecution: 2y 4m typical timeline (31 applications currently pending)
Career History: 406 total applications, across all art units
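The headline figures above are simple ratios of the career counts. A quick sanity check (plain arithmetic; no vendor formula is assumed):

```python
# Career allow rate from the raw counts shown above.
granted, resolved = 250, 375
allow_rate = 100.0 * granted / resolved
print(round(allow_rate, 1))  # -> 66.7 (displayed as 67%)

# Interview lift: with-interview grant probability (81%) minus the baseline rate.
with_interview = 81.0
print(round(with_interview - allow_rate, 1))  # -> 14.3 (the card's +14.4% presumably uses unrounded inputs)
```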

Statute-Specific Performance

§101: 6.0% (-34.0% vs TC avg)
§103: 48.7% (+8.7% vs TC avg)
§102: 20.9% (-19.1% vs TC avg)
§112: 22.6% (-17.4% vs TC avg)

Tech Center averages are estimates; based on career data from 375 resolved cases.
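The per-statute deltas are internally consistent: subtracting each delta from the examiner's rate recovers the same implied Tech Center average for every statute. A sketch of that check:

```python
# (examiner rate %, delta vs TC avg %) per statute, as listed above.
rates = {
    "§101": (6.0, -34.0),
    "§103": (48.7, 8.7),
    "§102": (20.9, -19.1),
    "§112": (22.6, -17.4),
}
# Implied TC average = examiner rate minus delta; all four statutes agree.
tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in rates.items()}
print(tc_avg)  # every statute implies the same TC average estimate of 40.0%
```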

Office Action

Grounds of rejection: §102, §103
DETAILED ACTION

This Office action is in response to the application filed on June 23, 2023. Claims 38-53, 56-59, and 62-65 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Acknowledgment is made of applicant's claim for priority based on U.S. provisional applications 63/132,287 and 63/132,301, both filed on December 30, 2020.

Information Disclosure Statement

The information disclosure statement (IDS) was submitted on June 23, 2023. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the Examiner.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 38, 39, 46, 56, 63 and 64 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Gassend (US 2019/0018416 A1), referred to as Gassend hereinafter.

Regarding Claim 38, Gassend discloses a LIDAR system for a host vehicle (Fig. 1, Fig. 3, Par. [0058] Sensor system 304 may include a number of sensors configured to sense information about an environment in which the vehicle 300 (i.e. 
host vehicle) is located, including a laser rangefinder and/or LIDAR unit 332) comprising: a laser emission unit configured to generate at least one laser beam (Fig. 1, Par. [0033] Transmitter 106 may be configured to transmit light (or other signal) toward an environment of LIDAR device 100 that include one or more light sources that emit (i.e. generate) one or more light beams and/or pulses having wavelengths within a wavelength range. The wavelength range could, for example, be in the ultraviolet, visible, and/or infrared portions of the electromagnetic spectrum. In some examples, the wavelength range can be a narrow wavelength range, such as provided by lasers (i.e. laser beam)); a scanning unit configured to project the at least one laser beam toward a field of view of the LIDAR system (Par. [0006] a system comprises means (i.e. scanning unit) for scanning a field-of-view defined by a pointing direction (i.e. project) of a light detection and ranging (LIDAR) device that mounts to a vehicle. Par. [0098] Sensor 610 may be similar to any of LIDARs 100, 200, 332, 410, or any other device that emits a signal and detects reflections of the emitted signal to scan a field-of-view (FOV) defined by a pointing direction of the device) ; and at least one processor (Fig. 1, Par. [0031] controller 104 may include one or more processors) programmed to: determine at least one indicator of a current yaw orientation of the LIDAR system (Fig. 7, Par. [0127] the sensor data received at block 704 may include data from one or more devices or sensors that indicate measurements related to motion of the LIDAR device relative to the vehicle (i.e. based on detected ego motion of vehicle), such as any of motion indicators 614 for instance, where indicator(s) 614 may comprise an encoder that measures a position of sensor 610 about an axis of rotation (i.e. yaw orientation) of sensor 610 (e.g., axis 432, etc.). 
When sensor 610 is a LIDAR that rotates about an axis, an encoder can provide an encoder value indicating an amount of rotation of the LIDAR from an initial (or reference) position (i.e. current orientation) about the axis; Par. [0104]) based on analysis of point cloud representations of at least one stationary object (Par. [0067] Computer vision system 346 may be any system configured to process and analyze images captured by camera 334 in order to identify objects and/or features in the environment in which vehicle 300 is located, including, for example, traffic signals (i.e. stationary object) and obstacles) in an environment of the host vehicle (Par. [0002] a LIDAR sensor can determine distances to environmental features while scanning through a scene to assemble a “point cloud” indicative of reflective surfaces in the environment. Individual points in the point cloud can be determined, for example, by transmitting a laser pulse and detecting a returning pulse, if any, reflected from an object in the environment, and then determining a distance to the object according to a time delay between the transmission of the pulse and the reception of the reflected pulse, where a three-dimensional map of points (i.e. point cloud representation) indicative of locations of reflective features in the environment can be generated) and based on detected ego motion of the host vehicle (Par. [0108] indicators 616 may include speed sensors that provide an indication of a speed of the motion of system 600 (and/or a vehicle (i.e. ego motion of host vehicle) that includes system 600, etc.) relative to the surrounding environment); determine a difference between the current yaw orientation and a target yaw orientation for the LIDAR system (Par. [0135] determining a difference between the adjusted target change to the pointing direction (or adjusted target frequency of rotation (i.e. 
target yaw orientation)) of the LIDAR device and a measured change to the pointing direction (or measured frequency of rotation (i.e. current yaw orientation)) of the LIDAR device frequency_error = adjusted_target_frequency − measured frequency); and adjust at least one scan range limit associated with the scanning unit (Par. [0049] LIDAR device 200 can be configured to tilt the axis of rotation (i.e. scan range limit) of housing 224 to control a field of view of LIDAR device 200, rotating platform 216 may comprise a movable platform that may tilt in one or more directions to change the axis of rotation (i.e. yaw orientation) of LIDAR device 200) to at least partially compensate (Par. [0024], the vehicle can mitigate or prevent (i.e. compensate) variations in the apparent frequency of rotation of the LIDAR device relative to the environment before, during, and/or after a driving maneuver (e.g., left turn or right turn) for a difference between the current yaw orientation of the LIDAR system and the target yaw orientation for the LIDAR system (Par. [0128], the adjustment at block 706 may involve adjusting one or more characteristics (e.g., frequency, phase, direction, etc.) of the rotation of the LIDAR device about the axis). Regarding Claim 39, Gassend discloses claim 38. Gassend further discloses wherein the target yaw orientation is dependent upon a lateral position of the host vehicle on a road segment (Par. [0023], one example sensor (e.g., yaw sensor) may indicate a measurement of a yaw direction (e.g., direction of travel (i.e. lateral position) of a car on a road (i.e. road segment), etc.) of the vehicle in the environment). Regarding Claim 46, Gassend discloses claim 38. Gassend further discloses wherein the at least one indicator of the current yaw orientation (Fig. 7, Par. 
[0127] the sensor data received at block 704 may include data from one or more devices or sensors that indicate measurements related to motion of the LIDAR device relative to the vehicle (i.e. based on detected ego motion of vehicle), such as any of motion indicators 614 for instance, where indicator(s) 614 may comprise an encoder that measures a position of sensor 610 about an axis of rotation (i.e. yaw orientation) of sensor 610 (e.g., axis 432, etc.). When sensor 610 is a LIDAR that rotates about an axis, an encoder can provide an encoder value indicating an amount of rotation of the LIDAR from an initial (or reference) position (i.e. current orientation) about the axis; Par. [0104]) is determined by identifying at least one stationary object (Par. [0067] Computer vision system 346 may be any system configured to process and analyze images captured by camera 334 in order to identify objects and/or features in the environment in which vehicle 300 is located, including, for example, traffic signals (i.e. stationary object) and obstacles) represented in a sequence (Par. [0067] Computer vision system 346 may be any system configured to process and analyze images captured by camera 334 in order to identify objects and/or features in the environment in which vehicle 300 is located, including, for example, traffic signals and obstacles. To that end, computer vision system 346 may use video tracking (i.e. sequence of frames), or other computer vision techniques) of generated point cloud frames (Par. [0002] a LIDAR sensor can determine distances to environmental features while scanning through a scene to assemble a “point cloud” indicative of reflective surfaces in the environment. 
Individual points in the point cloud can be determined, for example, by transmitting a laser pulse and detecting a returning pulse, if any, reflected from an object in the environment, and then determining a distance to the object according to a time delay between the transmission of the pulse and the reception of the reflected pulse, where a three-dimensional map of points (i.e. point cloud representation) indicative of locations of reflective features in the environment can be generated); and analyzing a trajectory of the at least one stationary object (Par. [0069], computer vision system 346 may additionally be configured to map the environment, track objects (i.e. trajectory), estimate the speed of objects (i.e. trajectory), etc.) over the sequence of generated point cloud frames (Par. [0067] Computer vision system 346 may be any system configured to process and analyze images captured by camera 334 in order to identify objects and/or features in the environment in which vehicle 300 is located, including, for example, traffic signals and obstacles (i.e. stationary object)) while accounting for effects of detected vehicle ego motion (Par. [0069] Obstacle avoidance system 350 may be any system configured to identify, evaluate (i.e. analyzing), and avoid or otherwise negotiate obstacles (i.e. stationary object) in the environment of vehicle 300). Regarding Claim 54 and Claim 55, they have been canceled. Regarding Claim 56, Gassend discloses claim 38. Gassend further discloses wherein initiation of the adjustment to the at least one scan range limit is further (Fig. 7, Par. [0120], the scanned FOV (i.e. initiation) may correspond to a region of an environment within contour 442 when the LIDAR device is at a first pointing direction and to the region within contour 444 when the LIDAR device is at a second pointing direction, etc.) 
based on detected changes in the difference between the current yaw orientation of the LIDAR system and the target yaw orientation for the LIDAR system (Fig. 7, (Par. [0135] determining a difference between the adjusted target change to the pointing direction (or adjusted target frequency of rotation (i.e. target yaw orientation)) of the LIDAR device and a measured change to the pointing direction (or measured frequency of rotation (i.e. current yaw orientation)) of the LIDAR device frequency_error = adjusted_target_frequency − measured frequency)). Regarding Claim 60 and Claim 61, they have been canceled. Regarding Claim 63, Gassend discloses claim 38. Gassend further discloses wherein the difference between the current yaw orientation and the target yaw orientation results from a detected directional change (Par. [0135] determining a difference between the adjusted target change to the pointing direction (i.e. detected direction change) (or adjusted target frequency of rotation (i.e. target yaw orientation)) of the LIDAR device and a measured change to the pointing direction (or measured frequency of rotation (i.e. current yaw orientation)) of the LIDAR device frequency_error = adjusted_target_frequency − measured frequency) of a road segment forward of the host vehicle (Par. [0023], one example sensor (e.g., yaw sensor) may indicate a measurement of a yaw direction (e.g., direction of travel (i.e. forward direction) of a car on a road (i.e. road segment), etc.) of the vehicle in the environment). Regarding Claim 64, Gassend discloses claim 38. Gassend further discloses wherein the ego motion of the host vehicle is detected based on an output from at least one of a speed sensor, a GPS unit, an accelerometer, or an inertial motion sensor (Par. [0108] indicators 616 may include speed sensors that provide an indication of a speed of the motion of system 600 (and/or a vehicle (i.e. ego motion of host vehicle) that includes system 600, etc.) 
relative to the surrounding environment, where indicators 616 may include any combination of sensors, such as a speedometer (e.g., sensor that measures rate of rotation of wheels 324, etc.), a satellite navigation sensor (e.g., GPS 326 that provides data indicating a speed of motion of vehicle 300), an inertial measurement unit (e.g., IMU 328), an accelerometer, a gyroscope, among other possibilities).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 40-45 and 51 are rejected under 35 U.S.C. 103 as being unpatentable over Gassend (US 2019/0018416 A1) in view of Liao et al. (US 2021/0109204 A1), referred to as Liao hereinafter.

Regarding Claim 40, Gassend discloses claim 39. Gassend further discloses a lateral position of the host vehicle (Par. [0023], one example sensor (e.g., yaw sensor) may indicate a measurement of a yaw direction (e.g., direction of travel (i.e. lateral position) of a car on a road (i.e. road segment), etc.) of the vehicle in the environment) results in a target yaw orientation biased of the host vehicle (Par. [0086], LIDAR device 410 can emit light toward regions of the environment that are relatively close to the vehicle (e.g., a lane marker) (i.e. road segment) as well as toward regions of the environment that are further away from the vehicle (e.g., a road sign ahead of the vehicle). Par. 
[0128], the adjustment at block 706 may involve adjusting one or more characteristics (e.g., frequency, phase, direction (i.e. target yaw orientation), etc.) of the rotation of the LIDAR device about the axis). Gassend does not specifically teach edge of the road. Therefore, Gassend fails to explicitly teach a lateral position of the host vehicle toward a right edge of the road segment results in a target yaw orientation biased toward a left side of the host vehicle. However, Liao teaches a lateral position of the host vehicle toward a right edge of the road segment results in a target yaw orientation biased toward a left side of the host vehicle (Par. [0115] analyzing the spatial relationship can include determining an amount of lateral asymmetry between the first lane marking from the path of the vehicle and the second lane marking from the path of the vehicle and determining the deviation from the expected alignment of the LiDAR sensor (i.e. target yaw orientation) can include determining a yaw error of the LiDAR sensor based on the amount of lateral asymmetry. Fig. 12B, Par. [0100] a method of dynamic calibration of a LiDAR sensor mounted on a vehicle can use road features as the vehicle travels in a relatively straight section of a road (i.e. road segments). The painted lane markings 1240 can be used for dynamically calibrating the position and orientation of the LiDAR sensor 1210 relative to the vehicle 1220 based on LiDAR images acquired while the vehicle is traveling along the road 1230. In Fig. 12B, the vehicle path 1250 can appear as moving closer to the lane marking 1240a on the driver side (i.e. left edge of road segment) than to the lane marking 1240b on the passenger side (i.e. right edge of road segment) and vice versa. Thus, by analyzing the LiDAR images of the lane markings 1240 with respect to the vehicle path 1250, the amount of mis-alignment (e.g., the yaw error) (i.e. 
target orientation biased toward left or right side) of the LiDAR sensor 1210 with respect to the vehicle 1220 can be estimated). References Gassend and Liao are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would have been obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify the target yaw orientation based on the edge of the road as taught by Liao in the invention of Gassend. This modification would allow using lane markings for dynamic calibration of a LiDAR sensor mounted on a vehicle (See Liao, Par. [0102]).

Regarding Claim 41, Gassend discloses claim 39. Gassend further discloses a lateral position of the host vehicle (Par. [0023], one example sensor (e.g., yaw sensor) may indicate a measurement of a yaw direction (e.g., direction of travel (i.e. lateral position) of a car on a road (i.e. road segment), etc.) of the vehicle in the environment) results in a target yaw orientation biased of the host vehicle (Par. [0086], LIDAR device 410 can emit light toward regions of the environment that are relatively close to the vehicle (e.g., a lane marker) (i.e. road segment) as well as toward regions of the environment that are further away from the vehicle (e.g., a road sign ahead of the vehicle). Par. [0128], the adjustment at block 706 may involve adjusting one or more characteristics (e.g., frequency, phase, direction (i.e. target yaw orientation), etc.) of the rotation of the LIDAR device about the axis). Gassend does not specifically teach edge of the road. Therefore, Gassend fails to explicitly teach a lateral position of the host vehicle toward a left edge of the road segment results in a target yaw orientation biased toward a right side of the host vehicle. However, Liao teaches a lateral position of the host vehicle toward a left edge of the road segment results in a target yaw orientation biased toward a right side of the host vehicle (Par. 
[0115] analyzing the spatial relationship can include determining an amount of lateral asymmetry between the first lane marking from the path of the vehicle and the second lane marking from the path of the vehicle and determining the deviation from the expected alignment of the LiDAR sensor (i.e. target yaw orientation) can include determining a yaw error of the LiDAR sensor based on the amount of lateral asymmetry. Fig. 12B, Par. [0100] a method of dynamic calibration of a LiDAR sensor mounted on a vehicle can use road features as the vehicle travels in a relatively straight section of a road (i.e. road segments). The painted lane markings 1240 can be used for dynamically calibrating the position and orientation of the LiDAR sensor 1210 relative to the vehicle 1220 based on LiDAR images acquired while the vehicle is traveling along the road 1230. In Fig. 12B, the vehicle path 1250 can appear as moving closer to the lane marking 1240a on the driver side (i.e. left edge of road segment) than to the lane marking 1240b on the passenger side (i.e. right edge of road segment) and vice versa. Thus, by analyzing the LiDAR images of the lane markings 1240 with respect to the vehicle path 1250, the amount of mis-alignment (e.g., the yaw error) (i.e. target orientation biased toward left or right side) of the LiDAR sensor 1210 with respect to the vehicle 1220 can be estimated). References Gassend and Liao are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would have been obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify the target yaw orientation based on the edge of the road as taught by Liao in the invention of Gassend. This modification would allow using lane markings for dynamic calibration of a LiDAR sensor mounted on a vehicle (See Liao, Par. [0102]).

Regarding Claim 42, Gassend discloses claim 38. 
Gassend further discloses the target yaw orientation is dependent upon road segment (Par. [0023], one example sensor (e.g., yaw sensor) may indicate a measurement of a yaw direction (e.g., direction of travel (i.e. lateral position) of a car on a road (i.e. road segment), etc.) of the vehicle in the environment. Par. [0086] LIDAR device 410 can emit light toward regions of the environment that are relatively close to the vehicle (e.g., a lane marker)). Gassend does not specifically teach target yaw orientation is dependent on lane location of road segment. Therefore, Gassend fails to explicitly teach the target yaw orientation is dependent upon a lane location of the host vehicle on a road segment. However, Liao teaches the target yaw orientation is dependent upon a lane location of the host vehicle on a road segment (Par. [0115] analyzing the spatial relationship can include determining an amount of lateral asymmetry between the first lane marking from the path of the vehicle and the second lane marking from the path of the vehicle and determining the deviation from the expected alignment of the LiDAR sensor (i.e. target yaw orientation) can include determining a yaw error of the LiDAR sensor based on the amount of lateral asymmetry. Fig. 12B, Par. [0100] a method of dynamic calibration of a LiDAR sensor mounted on a vehicle can use road features as the vehicle travels in a relatively straight section of a road (i.e. road segments). The painted lane markings 1240 can be used (i.e. dependent on lane location) for dynamically calibrating the position and orientation of the LiDAR sensor 1210 relative to the vehicle 1220 based on LiDAR images acquired while the vehicle is traveling along the road 1230). References Gassend and Liao are considered to be analogous art because they relate to sensor adjustments in vehicles. 
Therefore, it would have been obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify the target yaw orientation based on the lane location of the road as taught by Liao in the invention of Gassend. This modification would allow using lane markings for dynamic calibration of a LiDAR sensor mounted on a vehicle (See Liao, Par. [0102]).

Regarding Claim 43, Gassend in view of Liao teaches claim 42. Gassend further teaches wherein a location of the host vehicle on the road segment (Par. [0023], one example sensor (e.g., yaw sensor) may indicate a measurement of a yaw direction (e.g., direction of travel of a car on a road (i.e. road segment), etc.) of the vehicle in the environment) results in a target yaw orientation biased of the host vehicle (Par. [0086], LIDAR device 410 can emit light toward regions of the environment that are relatively close to the vehicle (e.g., a lane marker) (i.e. road segment) as well as toward regions of the environment that are further away from the vehicle (e.g., a road sign ahead of the vehicle). Par. [0128], the adjustment at block 706 may involve adjusting one or more characteristics (e.g., frequency, phase, direction (i.e. target yaw orientation), etc.) of the rotation of the LIDAR device about the axis). Gassend does not specifically teach right lane of road segment. Therefore, Gassend fails to explicitly teach a location of the host vehicle in a right lane location on the road segment results in a target yaw orientation biased toward a lane to a left side of the host vehicle. However, Liao teaches a location of the host vehicle in a right lane location on the road segment results in a target yaw orientation biased toward a lane to a left side of the host vehicle (Par. 
[0115] analyzing the spatial relationship can include determining an amount of lateral asymmetry between the first lane marking from the path of the vehicle and the second lane marking from the path of the vehicle and determining the deviation from the expected alignment of the LiDAR sensor (i.e. target yaw orientation) can include determining a yaw error of the LiDAR sensor based on the amount of lateral asymmetry. Fig. 12B, Par. [0100] a method of dynamic calibration of a LiDAR sensor mounted on a vehicle can use road features as the vehicle travels in a relatively straight section of a road (i.e. road segments). The painted lane markings 1240 can be used for dynamically calibrating the position and orientation of the LiDAR sensor 1210 relative to the vehicle 1220 based on LiDAR images acquired while the vehicle is traveling along the road 1230. In Fig. 12B, the vehicle path 1250 can appear as moving closer to the lane marking 1240a on the driver side (i.e. left lane location) than to the lane marking 1240b on the passenger side (i.e. right lane location) and vice versa. Thus, by analyzing the LiDAR images of the lane markings 1240 with respect to the vehicle path 1250, the amount of mis-alignment (e.g., the yaw error) (i.e. target orientation biased toward left or right side) of the LiDAR sensor 1210 with respect to the vehicle 1220 can be estimated). References Gassend and Liao are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would have been obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify the target yaw orientation based on the lane location of the road as taught by Liao in the invention of Gassend. This modification would allow using lane markings for dynamic calibration of a LiDAR sensor mounted on a vehicle (See Liao, Par. [0102]).

Regarding Claim 44, Gassend in view of Liao teaches claim 42. 
Gassend further teaches wherein a location of the host vehicle on the road segment (Par. [0023], one example sensor (e.g., yaw sensor) may indicate a measurement of a yaw direction (e.g., direction of travel of a car on a road (i.e. road segment), etc.) of the vehicle in the environment) results in a target yaw orientation biased of the host vehicle (Par. [0086], LIDAR device 410 can emit light toward regions of the environment that are relatively close to the vehicle (e.g., a lane marker) (i.e. road segment) as well as toward regions of the environment that are further away from the vehicle (e.g., a road sign ahead of the vehicle). Par. [0128], the adjustment at block 706 may involve adjusting one or more characteristics (e.g., frequency, phase, direction (i.e. target yaw orientation), etc.) of the rotation of the LIDAR device about the axis). Gassend does not specifically teach left lane of road segment. Therefore, Gassend fails to explicitly teach a location of the host vehicle in a left lane location on the road segment results in a target yaw orientation biased toward a lane to a right side of the host vehicle. However, Liao teaches a location of the host vehicle in a left lane location on the road segment results in a target yaw orientation biased toward a lane to a right side of the host vehicle (Par. [0115] analyzing the spatial relationship can include determining an amount of lateral asymmetry between the first lane marking from the path of the vehicle and the second lane marking from the path of the vehicle and determining the deviation from the expected alignment of the LiDAR sensor (i.e. target yaw orientation) can include determining a yaw error of the LiDAR sensor based on the amount of lateral asymmetry. Fig. 12B, Par. [0100] a method of dynamic calibration of a LiDAR sensor mounted on a vehicle can use road features as the vehicle travels in a relatively straight section of a road (i.e. road segments). 
The painted lane markings 1240 can be used for dynamically calibrating the position and orientation of the LiDAR sensor 1210 relative to the vehicle 1220 based on LiDAR images acquired while the vehicle is traveling along the road 1230. In Fig. 12B, the vehicle path 1250 can appear as moving closer to the lane marking 1240a on the driver side (i.e. left lane location) than to the lane marking 1240b on the passenger side (i.e. right lane location) and vice versa. Thus, by analyzing the LiDAR images of the lane markings 1240 with respect to the vehicle path 1250, the amount of mis-alignment (e.g., the yaw error) (i.e. target orientation biased toward left or right side) of the LiDAR sensor 1210 with respect to the vehicle 1220 can be estimated). References Gassend and Liao are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would have been obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify the target yaw orientation based on the lane location of the road as taught by Liao in the invention of Gassend. This modification would allow using lane markings for dynamic calibration of a LiDAR sensor mounted on a vehicle (See Liao, Par. [0102]).

Regarding Claim 45, Gassend in view of Liao teaches claim 42. Gassend further teaches wherein a location of the host vehicle on the road segment (Par. [0023], one example sensor (e.g., yaw sensor) may indicate a measurement of a yaw direction (e.g., direction of travel of a car on a road (i.e. road segment), etc.) of the vehicle in the environment) results in target yaw orientation (Par. [0086], LIDAR device 410 can emit light toward regions of the environment that are relatively close to the vehicle (e.g., a lane marker) (i.e. road segment) as well as toward regions of the environment that are further away from the vehicle (e.g., a road sign ahead of the vehicle). Par. 
[0128], the adjustment at block 706 may involve adjusting one or more characteristics (e.g., frequency, phase, direction (i.e. target yaw orientation), etc.) of the rotation of the LIDAR device about the axis). Gassend does not specifically teach middle lane location of road. Therefore, Gassend fails to explicitly teach a location of the host vehicle in a middle lane location on the road segment results in a non-biased target yaw orientation. However, Liao teaches a location of the host vehicle in a middle lane location on the road segment results in a non-biased target yaw orientation (Par. [0115] analyzing the spatial relationship can include determining an amount of lateral asymmetry between the first lane marking from the path of the vehicle and the second lane marking from the path of the vehicle and determining the deviation from the expected alignment of the LiDAR sensor (i.e. target yaw orientation) can include determining a yaw error of the LiDAR sensor based on the amount of lateral asymmetry. Fig. 12A, Par. [0100] a method of dynamic calibration of a LiDAR sensor mounted on a vehicle can use road features as the vehicle travels in a relatively straight section of a road (i.e. road segments). The painted lane markings 1240 can be used for dynamically calibrating the position and orientation of the LiDAR sensor 1210 relative to the vehicle 1220 based on LiDAR images acquired while the vehicle is traveling along the road 1230. In Fig. 12A, if the LiDAR sensor 1210 is properly aligned (i.e. non-biased target yaw orientation) with respect to the vehicle 1220 (e.g., the LiDAR sensor 1210 is looking in the same direction as the vehicle 1220 is heading), the pair of lane markings 1240 may appear to pass equally on each side of the vehicle path 1250. 
Thus, by analyzing the LiDAR images of the lane markings 1240 with respect to the vehicle path 1250, the amount of mis-alignment (e.g., the yaw error) of the LiDAR sensor 1210 with respect to the vehicle 1220 can be estimated).

References Gassend and Liao are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify the target yaw orientation based on the lane location of the road as taught by Liao in the invention of Gassend. This modification would allow using lane markings for dynamic calibration of a LiDAR sensor mounted on a vehicle (See Liao, Par. [0102]).

Regarding Claim 51, Gassend discloses claim 38. Gassend further discloses wherein an adjustment of the at least one scan range limit associated with the scanning unit (Par. [0049] LIDAR device 200 can be configured to tilt the axis of rotation (i.e. scan range limit) of housing 224 to control a field of view of LIDAR device 200, rotating platform 216 may comprise a movable platform that may tilt in one or more directions to change the axis of rotation (i.e. yaw orientation) of LIDAR device 200) is dependent upon the difference between the current yaw orientation of the LIDAR system and the target yaw orientation for the LIDAR system (Par. [0128], the adjustment at block 706 may involve adjusting one or more characteristics (e.g., frequency, phase, direction, etc.) of the rotation of the LIDAR device about the axis). Gassend fails to explicitly teach an amount of adjustment is dependent upon a magnitude of the difference. However, Liao teaches an amount of adjustment of the at least one scan range limit associated with the scanning unit is dependent upon a magnitude of the difference between the current yaw orientation of the LIDAR system and the target yaw orientation for the LIDAR system (Par.
[0049], The LiDAR sensor 410 can determine the amount of mis-orientation (e.g., the yaw error) (i.e. amount of adjustment) of the LiDAR sensor 410 based on the amount of the shift (e.g., the difference between the new margin (i.e. target) and the nominal margin (i.e. current))).

References Gassend and Liao are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify an amount of adjustment dependent on the magnitude of the difference as taught by Liao in the invention of Gassend. This modification would allow for correct alignment with respect to the target (See Liao, Par. [0049]).

Claims 47 – 50 and 65 are rejected under 35 U.S.C. 103 as being unpatentable over Gassend (US 2019/0018416 A1), in view of Keilaf et al., (US 2018/0128920 A1) referred to as Keilaf hereinafter.

Regarding Claim 47, Gassend discloses claim 38. Gassend further discloses wherein the scanning unit includes a scanner (Par. [0006] a system comprises means (i.e. scanner) for scanning a field-of-view defined by a pointing direction (i.e. project) of a light detection and ranging (LIDAR) device that mounts to a vehicle. Par. [0098] Sensor 610 may be similar to any of LIDARs 100, 200, 332, 410, or any other device that emits a signal and detects reflections of the emitted signal to scan a field-of-view (FOV) defined by a pointing direction of the device), and adjustment of the at least one scan range limit (Par. [0049] LIDAR device 200 can be configured to tilt the axis of rotation (i.e. scan range limit) of housing 224 to control a field of view of LIDAR device 200, rotating platform 216 may comprise a movable platform that may tilt in one or more directions to change the axis of rotation of LIDAR device 200) includes changing at least one horizontal scan limit (Par.
[0041], a pointing direction of LIDAR device 100 can be adjusted horizontally (i.e. horizontal scan limit) by actuating the rotating platform 114 to different directions). While Gassend teaches in Par. [0041], a pointing direction of LIDAR device 100 can be adjusted horizontally (i.e. horizontal scanning) by actuating the rotating platform 114 to different directions and in Par. [0049], LIDAR device 200 can be configured to tilt the axis of rotation of housing 224 to control a field of view of LIDAR device 200, Gassend does not specifically teach a biaxial scanner. Therefore, Gassend fails to explicitly teach the scanning unit includes a biaxial scanner. However, Keilaf teaches the scanning unit includes a biaxial scanner (Par. [0147] a dual axis MEMS mirror (i.e. biaxial scanner) may be configured to deflect light in a horizontal direction and in a vertical direction). References Gassend and Keilaf are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify a biaxial scanner as taught by Keilaf in the invention of Gassend in order to scan both directions (See Keilaf, Par. [0147]). Regarding Claim 48, Gassend in view of Keilaf teaches claim 47. Gassend further discloses changing the at least one horizontal scan limit provides a horizontal shift of a field of view of the LIDAR system (Par. [0041], a pointing direction of LIDAR device 100 can be adjusted horizontally by actuating (i.e. shifting) the rotating platform 114 to different directions). Gassend does not specifically teach a biaxial scanner. Therefore, Gassend fails to explicitly teach the scanning unit includes a biaxial scanner. However, Keilaf further teaches the scanning unit includes a biaxial scanner (Par. [0147] a dual axis MEMS mirror (i.e. 
biaxial scanner) may be configured to deflect light in a horizontal direction and in a vertical direction). References Gassend and Keilaf are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify a biaxial scanner as taught by Keilaf in the invention of Gassend in order to scan both directions (See Keilaf, Par. [0147]). Regarding Claim 49, Gassend discloses claim 38. Gassend further discloses wherein the scanning unit includes at least one horizontal scanner (Par. [0006] a system comprises means (i.e. scanner) for scanning a field-of-view defined by a pointing direction (i.e. project) of a light detection and ranging (LIDAR) device that mounts to a vehicle. Par. [0098] Sensor 610 may be similar to any of LIDARs 100, 200, 332, 410, or any other device that emits a signal and detects reflections of the emitted signal to scan a field-of-view (FOV) defined by a pointing direction of the device), and adjustment of the at least one scan range limit (Par. [0049] LIDAR device 200 can be configured to tilt the axis of rotation (i.e. scan range limit) of housing 224 to control a field of view of LIDAR device 200, rotating platform 216 may comprise a movable platform that may tilt in one or more directions to change the axis of rotation of LIDAR device 200) includes changing at least one horizontal scan range limit associated with the at least one horizontal scanner (Par. [0041], a pointing direction of LIDAR device 100 can be adjusted horizontally (i.e. horizontal scan limit) by actuating the rotating platform 114 to different directions). While Gassend teaches in Par. [0041], a pointing direction of LIDAR device 100 can be adjusted horizontally (i.e. horizontal scanning) by actuating the rotating platform 114 to different directions and in Par. 
[0049], LIDAR device 200 can be configured to tilt the axis of rotation of housing 224 to control a field of view of LIDAR device 200, Gassend does not specifically teach a horizontal and a vertical scanner. Therefore, Gassend fails to explicitly teach the scanning unit includes at least one vertical scanner and at least one horizontal scanner. However, Keilaf teaches the scanning unit includes at least one vertical scanner and at least one horizontal scanner (Par. [0147] a dual axis MEMS mirror (i.e. horizontal and vertical scanner) may be configured to deflect light in a horizontal direction and in a vertical direction). References Gassend and Keilaf are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify a horizontal and a vertical scanner as taught by Keilaf in the invention of Gassend in order to scan both directions (See Keilaf, Par. [0147]). Regarding Claim 50, Gassend in view of Keilaf teaches claim 49. Gassend further discloses wherein changing the at least one horizontal scan range limit associated with the at least one horizontal scanner provides a horizontal shift of a field of view of the LIDAR system (Par. [0041], a pointing direction of LIDAR device 100 can be adjusted horizontally (i.e. horizontal scanner) by actuating (i.e. shifting) the rotating platform 114 to different directions). Regarding Claim 65, Gassend discloses claim 38. Gassend further discloses wherein the ego motion of the host vehicle is detected using mapping (Par. [0002], a three-dimensional map of points indicative of locations of reflective features in the environment can be generated). Gassend does not specifically teach simultaneous localization and mapping (SLAM). 
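The ego-motion detection via mapping discussed for claim 65 can be illustrated with a minimal, hypothetical sketch (all names and values are invented for illustration and come from none of the cited references): a vehicle's 2-D translation between two LiDAR scans of the same static landmarks can be recovered by aligning the point clouds, here with a single centroid-alignment step of the kind a SLAM front-end performs.

```python
def estimate_ego_motion(scan_prev, scan_curr):
    """Estimate 2-D ego translation between two LiDAR scans of the same
    static landmarks (centroid alignment; correspondences assumed known).
    Landmarks appear to shift opposite to the vehicle's own motion."""
    n = len(scan_prev)
    cx_prev = sum(p[0] for p in scan_prev) / n
    cy_prev = sum(p[1] for p in scan_prev) / n
    cx_curr = sum(p[0] for p in scan_curr) / n
    cy_curr = sum(p[1] for p in scan_curr) / n
    # Vehicle motion is the negative of the apparent landmark shift.
    return (cx_prev - cx_curr, cy_prev - cy_curr)

# Vehicle drives 1.5 m forward (+x), so landmarks shift -1.5 m in x.
prev = [(5.0, 2.0), (8.0, -1.0), (12.0, 0.5)]
curr = [(x - 1.5, y) for x, y in prev]
dx, dy = estimate_ego_motion(prev, curr)
```

A full SLAM pipeline would additionally estimate rotation and maintain a map, but the centroid step above captures the core idea of inferring ego motion from how mapped features move in the sensor frame.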
Therefore, Gassend fails to explicitly teach the ego motion of the host vehicle is detected using simultaneous localization and mapping (SLAM). However, Keilaf teaches the ego motion of the host vehicle is detected using simultaneous localization and mapping (SLAM) (Par. [0539] A navigational state of the vehicle may also include a position of the host vehicle relative to three-dimensional maps, partial maps, 2-D maps, landmarks, or any combination of map and landmarks, etc. Maps may be pre-stored, received via a communication channel, or generated (e.g. by SLAM)).

References Gassend and Keilaf are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify detection of motion of the vehicle using SLAM as taught by Keilaf in the invention of Gassend. This modification would provide a navigational state of the vehicle including a position of the host vehicle (See Keilaf, Par. [0539]).

Claims 52, 53, 57 – 59 and 62 are rejected under 35 U.S.C. 103 as being unpatentable over Gassend (US 2019/0018416 A1), in view of Heit et al., (US 2019/0155291 A1) referred to as Heit hereinafter.

Regarding Claim 52, Gassend discloses claim 38. Gassend further discloses the adjustment is initiated (Fig. 7, Par. [0128], the adjustment at block 706 may involve adjusting one or more characteristics (e.g., frequency, phase, direction (i.e. initiation of adjustment), etc.) of the rotation of the LIDAR device about the axis). Gassend does not specifically teach initiating an adjustment after a predetermined time delay. However, Heit teaches initiating an adjustment (Fig. 4, Par. [0065] At 412 (which is after step 408), vehicle parameters are changed (i.e. initiate adjustment) to be used in the next scenario of the autonomous vehicle system and an additional iteration of process 400 begins) after a predetermined time delay (Par.
[0063] At 408, one or more performance metrics of the scenario of step 406 are determined, according to the description of performance metrics, where Par. [0031] performance metrics 112 may include one or more of: a length of a delay before the autonomous vehicle system senses an object within range of a specified sensor (e.g., delay of 300 milliseconds (i.e. predetermined time delay) for the vehicle to detect an object via LIDAR)).

References Gassend and Heit are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify a predetermined time delay to initiate an adjustment as taught by Heit in the invention of Gassend in order to improve the performance of the vehicle (See Heit, Par. [0004]).

Regarding Claim 53, Gassend in view of Heit teaches claim 52. Heit further discloses wherein the predetermined time delay is i) between 0.2 seconds and 30 seconds, ii) between 0.2 seconds and 10 seconds, or iii) between 0.2 seconds and 1 second (Par. [0031] performance metrics 112 may include one or more of: a length of a delay before the autonomous vehicle system senses an object within range of a specified sensor (e.g., delay of 300 milliseconds (i.e. 0.3 seconds which is between 0.2 seconds and 1 second, 10 seconds or even 30 seconds) for the vehicle to detect an object via LIDAR)).

Regarding Claim 57, Gassend discloses claim 56. Gassend further discloses wherein the at least one processor is further programmed to make an adjustment if the difference between the current yaw orientation of the LIDAR system and the target yaw orientation for the LIDAR system exists (Par. [0135] determining a difference between the adjusted target change to the pointing direction (or adjusted target frequency of rotation (i.e.
target yaw orientation)) of the LIDAR device and a measured change to the pointing direction (or measured frequency of rotation (i.e. current yaw orientation)) of the LIDAR device frequency_error = adjusted_target_frequency − measured frequency). Gassend does not specifically teach forgoing an adjustment if the difference no longer exists. However, Heit teaches the at least one processor is further programmed to forego the adjustment if, after a predetermined time delay ((Fig. 4, Par. [0064], at Step 414, process 400 is completed and ends in response to determining to forgo additional scenarios of the autonomous vehicle with different vehicle parameters. Par. [0065] At 412 (which is after step 408), vehicle parameters are changed (i.e. initiate adjustment) to be used in the next scenario of the autonomous vehicle system and an additional iteration of process 400 begins. Par. [0063] At 408, one or more performance metrics of the scenario of step 406 are determined, according to the description of performance metrics, where Par. [0031] performance metrics 112 may include one or more of: a length of a delay before the autonomous vehicle system senses an object within range of a specified sensor (e.g., delay of 300 milliseconds (i.e. predetermined time delay) for the vehicle to detect an object via LIDAR)), the difference between the current yaw orientation of the LIDAR system and the target yaw orientation for the LIDAR system no longer exists (Fig. 4, Par. [0064], at Step 410, simulating another scenario of the autonomous vehicle with different vehicle parameters may be forgone in response to sufficient improvement of the vehicle parameters of the minimum performance scenario (i.e. 
no longer exists) (i.e., sufficient increase in performance metrics determined in scenarios of the autonomous vehicle system using the vehicle parameters), where a minimum performance scenario identification process (e.g., via performance metric analyzer 122) to identify any scenarios where performance metrics 112 of one or more scenarios are determined to be less than a specified amount Par. [0034]).

References Gassend and Heit are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify forgoing an adjustment as taught by Heit in the invention of Gassend in order to improve the performance of the vehicle (See Heit, Par. [0004]).

Regarding Claim 58, Gassend discloses claim 56. Gassend further discloses wherein the adjustment is initiated if the difference between the current yaw orientation of the LIDAR system and the target yaw orientation for the LIDAR system (Par. [0135] determining a difference between the adjusted target change to the pointing direction (or adjusted target frequency of rotation (i.e. target yaw orientation)) of the LIDAR device and a measured change to the pointing direction (or measured frequency of rotation (i.e. current yaw orientation)) of the LIDAR device frequency_error = adjusted_target_frequency − measured frequency). Gassend does not specifically teach foregoing an adjustment if the difference is trending towards zero. However, Heit teaches wherein the adjustment is initiated after a predetermined first time delay and the at least one processor is further programmed to forego the adjustment for at least a second time delay (Par. [0065] At 412, vehicle parameters are changed to be used in the next scenario (i.e.
second time delay) of the autonomous vehicle system and an additional iteration of process 400 begins) if, after the predetermined first time delay ((Fig. 4, Par. [0067], at Step 414, process 400 is completed and ends in response to determining to forgo additional scenarios of the autonomous vehicle with different vehicle parameters. Par. [0065] At 412 (which is after step 408), vehicle parameters are changed (i.e. initiate adjustment) to be used in the next scenario of the autonomous vehicle system and an additional iteration of process 400 begins. Par. [0063] At 408, one or more performance metrics of the scenario of step 406 are determined, according to the description of performance metrics, where Par. [0031] performance metrics 112 may include one or more of: a length of a delay before the autonomous vehicle system senses an object within range of a specified sensor (e.g., delay of 300 milliseconds (i.e. predetermined first time delay) for the vehicle to detect an object via LIDAR)), the difference between the current yaw orientation of the LIDAR system and the target yaw orientation for the LIDAR system (Fig. 4, Par. [0064], at Step 410, simulating another scenario of the autonomous vehicle with different vehicle parameters may be forgone in response to sufficient improvement of the vehicle parameters of the minimum performance scenario (i.e. no longer exists) (i.e., sufficient increase in performance metrics determined in scenarios of the autonomous vehicle system using the vehicle parameters), where a minimum performance scenario identification process (e.g., via performance metric analyzer 122) to identify any scenarios where performance metrics 112 of one or more scenarios are determined to be less than a specified amount Par. [0034]) is trending toward zero (Par. [0047] improved vehicle parameters (or an improved vehicle configuration (i.e. trending toward zero)) associated with increased performance metric or a performance above a specified threshold). 
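The delay-then-forgo logic mapped across claims 52 through 58 can be summarized in a small decision sketch. This is illustrative only: the function name, gain, and thresholds are assumptions, not taken from Gassend, Liao, or Heit. The idea is to wait a predetermined delay, forgo the adjustment when the yaw difference has vanished or is trending toward zero, and otherwise adjust by an amount that depends on the magnitude of the difference.

```python
def plan_yaw_adjustment(current_yaw, target_yaw, prior_error,
                        elapsed_s, delay_s=0.3, gain=0.5):
    """Return the adjustment to apply to a scan range limit, or 0.0 to forgo.

    Mirrors the claimed logic: no action before the predetermined delay;
    forgo when the yaw difference is gone or trending toward zero (e.g.,
    the vehicle is exiting a turn); otherwise the amount of adjustment
    depends on the magnitude of the difference. Illustrative sketch only.
    """
    error = target_yaw - current_yaw
    if elapsed_s < delay_s:
        return 0.0   # predetermined time delay has not yet elapsed
    if abs(error) < 1e-6 or abs(error) < abs(prior_error):
        return 0.0   # difference no longer exists or is trending toward zero
    return gain * error  # amount proportional to magnitude of the difference

# Error shrinking on its own (0.9 rad -> 0.5 rad): forgo the adjustment.
forgone = plan_yaw_adjustment(1.0, 1.5, prior_error=0.9, elapsed_s=0.5)
# Error constant at 0.5 rad after the delay: adjust proportionally.
applied = plan_yaw_adjustment(1.0, 1.5, prior_error=0.5, elapsed_s=0.5)
```

The default 0.3 s delay is chosen only because it falls inside the narrowest claimed range (0.2 to 1 second); a real controller would tune both the delay and the gain.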
References Gassend and Heit are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify forgoing an adjustment as taught by Heit in the invention of Gassend in order to improve the performance of the vehicle (See Heit, Par. [0004]).

Regarding Claim 59, Gassend in view of Heit teaches claim 58. Heit further teaches wherein the second time delay is i) between 0.2 seconds and 30 seconds, ii) between 0.2 seconds and 10 seconds, or iii) between 0.2 seconds and 1 second (Par. [0031] performance metrics 112 may include one or more of: a length of a delay before the autonomous vehicle system senses an object within range of a specified sensor (e.g., delay of 300 milliseconds (i.e. 0.3 seconds which is between 0.2 seconds and 1 second, 10 seconds or even 30 seconds) for the vehicle to detect an object via LIDAR)).

Regarding Claim 62, Gassend discloses claim 56. Gassend further discloses wherein the at least one processor is further programmed to actively adjust (Par. [0126] the measurements by a “yaw sensor” aligned with axis 414 of vehicle 400 may be biased due to the pitch/roll orientation of the vehicle during the turning maneuver or while driving (i.e. actively adjusted) on the banked surface) the at least one scan range limit (Par. [0049] LIDAR device 200 can be configured to tilt the axis of rotation (i.e. actively adjust) of housing 224 to control a field of view of LIDAR device 200, rotating platform 216 may comprise a movable platform that may tilt in one or more directions to change the axis of rotation of LIDAR device 200), if the difference between the current yaw orientation of the LIDAR system and the target yaw orientation for the LIDAR system is constant or increasing (Par.
[0135] determining a difference between the adjusted target change to the pointing direction (or adjusted target frequency of rotation (i.e. target yaw orientation)) of the LIDAR device and a measured change (i.e. increasing) to the pointing direction (or measured frequency of rotation (i.e. current yaw orientation)) of the LIDAR device frequency_error = adjusted_target_frequency − measured frequency). Gassend does not specifically teach initiating an adjustment after a predetermined time delay. However, Heit teaches an adjustment (Fig. 4, Par. [0065] At 412 (which is after step 408), vehicle parameters are changed (i.e. initiate adjustment) to be used in the next scenario of the autonomous vehicle system and an additional iteration of process 400 begins) after a predetermined time delay (Par. [0063] At 408, one or more performance metrics of the scenario of step 406 are determined, according to the description of performance metrics, where Par. [0031] performance metrics 112 may include one or more of: a length of a delay before the autonomous vehicle system senses an object within range of a specified sensor (e.g., delay of 300 milliseconds (i.e. predetermined time delay) for the vehicle to detect an object via LIDAR)).

References Gassend and Heit are considered to be analogous art because they relate to sensor adjustments in vehicles. Therefore, it would be obvious to one possessing ordinary skill in the art before the effective filing date of the claimed invention to specify a predetermined time delay to initiate an adjustment as taught by Heit in the invention of Gassend in order to improve the performance of the vehicle (See Heit, Par. [0004]).

Conclusion

Any inquiry concerning this communication should be directed to SUSAN E HODGES whose telephone number is (571)270-0498. The Examiner can normally be reached on Monday - Friday from 8:00 am (EST) to 4:00 pm (EST). If attempts to reach the Examiner by telephone are unsuccessful, the Examiner's supervisor, Brian T.
Pendleton, can be reached on (571) . The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://portal.uspto.gov/external/portal. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). /Susan E. Hodges/Primary Examiner, Art Unit 2425
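The margin-shift yaw estimate that the rejection quotes from Liao (Par. [0049]) can be sketched in a few lines. This is a small-angle geometry illustration under assumed values; `estimate_yaw_error` and all numbers are hypothetical, not Liao's implementation: a lane marking observed at a known look-ahead range that has shifted laterally from its nominal margin implies a yaw misalignment of roughly the shift divided by the range.

```python
import math

def estimate_yaw_error(nominal_margin_m, observed_margin_m, range_m):
    """Estimate LiDAR yaw misalignment (radians) from the lateral shift of
    a lane marking at a known look-ahead range. Sign indicates whether the
    sensor is biased toward the left or right side."""
    shift = observed_margin_m - nominal_margin_m
    return math.atan2(shift, range_m)

# A marking nominally 1.80 m from the path appears at 2.15 m when
# observed 20 m ahead: a 0.35 m outward shift.
yaw = estimate_yaw_error(1.80, 2.15, 20.0)
print(f"yaw error ~ {math.degrees(yaw):.2f} deg")
```

Averaging this estimate over both lane markings and many frames, as the cited straight-road calibration describes, would suppress noise from lane changes and marking gaps.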

Prosecution Timeline

Jun 23, 2023
Application Filed
Feb 24, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603982
STEREOSCOPIC HIGH DYNAMIC RANGE VIDEO
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12604008
ADAPTIVE CLIPPING IN MODELS PARAMETERS DERIVATIONS METHODS FOR VIDEO COMPRESSION
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12574558
Method and Apparatus for Sign Coding of Transform Coefficients in Video Coding System
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12568212
ADAPTIVE LOOP FILTERING ON OUTPUT(S) FROM OFFLINE FIXED FILTERING
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12556671
THREE DIMENSIONAL STROBO-STEREOSCOPIC IMAGING SYSTEMS AND ASSOCIATED METHODS
Granted Feb 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
67%
Grant Probability
81%
With Interview (+14.4%)
2y 4m
Median Time to Grant
Low
PTA Risk
Based on 375 resolved cases by this examiner. Grant probability derived from career allow rate.
