DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments/Amendments
The amendment filed June 30, 2025 has been entered. Claims 1-5, 7-8, and 10-14 are currently pending in the application.
Applicant's arguments with respect to the rejection of claims under 35 U.S.C. 102 and 35 U.S.C. 103 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-5, 8, and 10-14 are rejected under 35 U.S.C. 103 as being unpatentable over DE Patent Publication No. DE 102018124538 A1, to Weinberg et al. (hereinafter Weinberg), in view of DE Patent Publication No. DE 102006010295 A1, to Lang et al. (hereinafter Lang).
Regarding claim 1, Weinberg discloses a vehicle (See at least paragraph [0008]) having one or more optical sensor elements (See at least paragraph [0012], "Lidar") in a region of a first side or a front side of said vehicle (See at least Fig. 2A),
which, in interaction with an evaluation and control electronics provided on board or positioned fully or partly remotely to said vehicle, is/are configured to warn about a collision of said driving vehicle with an obstacle when said vehicle is driving, to avoid the obstacle, or to stop said vehicle when detecting the obstacle (See at least paragraph [0012]: "to warn a driver of the vehicle 107, or the vehicle 107 may automatically decelerate to avoid a collision with the overhead infrastructure"),
wherein at least one, several, or all of said optical sensor elements has/have a field of view with a horizontal and/or vertical viewing angle range of less than 3 degrees (See at least paragraph [0010]: "A field of view can range from 1.5 to 3 degrees in a horizontal dimension").
Weinberg fails to explicitly disclose, however Lang discloses, wherein two sensor elements located in spatial proximity to each other (See at least paragraph [0024]: "the camera system is implemented using chip-on-board technology. Both camera chips are mounted on a substrate and are therefore designed as a single electronic component. The base distance between the camera chips is chosen to be correspondingly small.") have fields of view laterally and/or vertically offset from each other (See at least paragraph [0009]: "the image sensors have different imaging areas. For a stereo recording, it is necessary that the imaging areas overlap at least partially. In a motor vehicle application, the imaging area of the image sensors is, as a rule, calibrated in the factory so that the position of the overlap area in the recorded images is predetermined"). Further, see at least paragraph [0026]: "the distance area 1 in front of the motor vehicle 3 is monitored. A CMOS camera is provided as image sensor 2 for monitoring the near area 2. The monitoring areas of the image sensors, as shown for an exemplary embodiment in Fig. 2, are aligned symmetrically to the longitudinal axis of the vehicle. The intersection of the two monitoring areas is the overlap area 4; in this area 4 the data from the image sensors are merged."
wherein the maximum receiving sensitivity of said two sensor elements located in spatial proximity to each other is at a different wavelength (See at least paragraph [0020]: "at least two different types of image sensors are provided which are sensitive in different spectral ranges. This recording method offers the advantage that objects under given lighting and environmental conditions, which are only displayed with very low contrast in a first spectral range, are clearly visible, i.e., with high contrast, in a second spectral range"). Further, see at least paragraph [0014]: "at least one image sensor is sensitive in the far-infrared spectral range and one image sensor is sensitive in the visible and/or near-infrared spectral range." Still further, see at least paragraph [0025]: "The respective optical channels are designed for the corresponding wavelengths and monitoring ranges."
and wherein said two sensor elements located in spatial proximity to each other are configured to interact with said evaluation and control electronics (See at least paragraph [0016]: "Fig. 1: Schematic representation of a camera system with two different types of image sensors and the associated image data processing"). Further, see at least paragraph [0012]: "image processing is provided which reliably recognizes object types, analyzes the environmental situation, tracks the movement of objects in image sequences, etc. The criticality of a situation is analyzed. If a critical situation arises, the driver will be visually or audibly alerted and/or an intervention, e.g. braking, will be initiated." This is done in a way that a signal of one of said two sensor elements can be plausibility-checked or validated by a signal from said other one of said sensor elements (See at least paragraph [0013]: "In an advantageous embodiment of the invention, the data from at least one of the image sensors are analyzed separately. It is particularly advantageous to use this data to identify objects and/or predict their behavior. The information obtained about objects is compared with data from various image sensors to make it plausible. In another embodiment, the image data from the different types of image sensors are first fused, and then object recognition is performed and/or the behavior of the objects is predicted. In general, the same objects can be detected in multiple images and their movement can be tracked in the images. The aforementioned procedure enables a situational analysis of the environment.").
Weinberg and Lang are analogous art because they are in the same field of endeavor, vehicle sensor systems. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Weinberg to incorporate the teachings of Lang such that two different cameras are checked against each other, for the purpose of verifying the accuracy of the sensors, which would aid in vehicle safety.
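Examiner's note: as a purely hypothetical illustration (not drawn from Weinberg or Lang), the cross-sensor plausibility check Lang describes in paragraph [0013] may be sketched as follows; all names and thresholds below are assumptions for illustration only:

    # Hypothetical sketch: validating a detection from one sensor element
    # against a detection from a second, co-located sensor element.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        """A single obstacle report from one sensor element."""
        distance_m: float   # estimated range to the obstacle
        bearing_deg: float  # horizontal angle to the obstacle

    def is_plausible(primary: Detection, secondary: Detection,
                     max_range_delta_m: float = 1.0,
                     max_bearing_delta_deg: float = 0.5) -> bool:
        """Treat the primary detection as plausible only if the second
        sensor element reports an obstacle at approximately the same
        range and bearing (tolerances here are illustrative)."""
        return (abs(primary.distance_m - secondary.distance_m) <= max_range_delta_m
                and abs(primary.bearing_deg - secondary.bearing_deg) <= max_bearing_delta_deg)

    # Example: the two co-located sensor elements agree, so the warning is issued.
    if is_plausible(Detection(6.2, 0.3), Detection(6.0, 0.4)):
        print("Obstacle confirmed: warn driver / brake")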
Regarding claim 3, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg further teaches wherein said at least one, several, or all of said optical sensor elements has/have a field of view with a horizontal and/or vertical viewing angle range in the range of from 0.1 to 1.5 degrees, or in particular in the range of from 0.5 degrees to 1.5 degrees (See at least paragraph [0010]: "A field of view can range from 1.5 to 3 degrees in a horizontal dimension.").
Regarding claim 4, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg further teaches wherein said vehicle in the region of said first side or said front side (11) has at least two of said optical elements with a field of view with a horizontal and/or vertical viewing angle range of less than 3 degrees, in particular from 0.1 degrees to 1.5 degrees or from 0.5 degrees to 1.5 degrees, located in spatial proximity to each other, next to each other, above each other, or one behind the other (See at least paragraph [0009]: "The receiver 120 may include a pixel array 121 that may be used to receive the portion of the light beam from the target area. The control circuit 110 may then determine a round trip time of the light beam, such as by comparing a time when the light beam 125 was emitted toward the target 130 and a time when the light beam 135 was received by the receiver 120. A distance to the target 130 can then be determined according to the expression d = c·t/2, where d can represent a distance from the lidar system 105 to the target 130, t can represent the round trip time, and c can represent the speed of light.").
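Examiner's note: the expression itself did not survive reproduction in the cited text; the standard lidar time-of-flight relation consistent with Weinberg's description is d = (c × t) / 2, where t is the round trip time and c is the speed of light. As a worked check of the magnitudes involved (arithmetic is the examiner's, not Weinberg's), a round trip time of t = 667 ns gives d = (3.0 × 10^8 m/s × 667 × 10^-9 s) / 2 ≈ 100 m, matching the roughly 100-meter ranges discussed in Weinberg's paragraph [0010].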
Regarding claim 5, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg further teaches wherein said vehicle in the region of said first side or said front side has at least four of said optical elements with a field of view with a horizontal and/or vertical viewing angle range of less than 3 degrees, or a range of 0.1 degrees to 1.5 degrees or from 0.5 degrees to 1.5 degrees, at least two of which being located in spatial proximity to each other, in particular next to each other, above each other, or one behind the other, and wherein a first pair of said optical sensor elements is positioned in the region of, or in proximity to, a left corner of said vehicle, and wherein a second pair of said sensor elements is positioned in the region of, or in proximity to, a right, in particular said front, corner or a right front corner of said vehicle (See at least paragraph [0090]: "In one example, the pixel array (e.g., photodiodes) 121 have a one-dimensional pixel array and each pixel in the pixel array can capture a part of the light beam (e.g., a ray) corresponding to an angle or a range of angles. In another example, the pixel array (e.g., photodiodes) 121 have a two-dimensional pixel array.").
Regarding claim 7, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg fails to explicitly disclose, however Lang discloses, wherein the maximum receiving sensitivity of said two sensor elements located in spatial proximity to each other is at a different wavelength, in particular in a wavelength range of from 600 nm to 1100 nm (See at least paragraph [0020]: "at least two different types of image sensors are provided which are sensitive in different spectral ranges. This recording method offers the advantage that objects under given lighting and environmental conditions, which are only displayed with very low contrast in a first spectral range, are clearly visible, i.e., with high contrast, in a second spectral range"). Further, see at least paragraph [0014]: "at least one image sensor is sensitive in the far-infrared spectral range and one image sensor is sensitive in the visible and/or near-infrared spectral range." Still further, see at least paragraph [0025]: "The respective optical channels are designed for the corresponding wavelengths and monitoring ranges."
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Weinberg to incorporate the teachings of Lang, for the same reasons and motivation set forth in the rejection of claim 1 above.
Regarding claim 8, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg further teaches wherein said two sensor elements located in spatial proximity to each other have fields of view extending at different elevation angles relative to the level of said vehicle such that a first one of said two sensor elements can detect an obstacle earlier than said second one of said two sensor elements (See at least Fig. 2A and paragraph [0012]: "illustrates an example of the operation of a lidar system, such as the lidar system 105. The lidar system 105 can emit a light beam 125. The light beam may comprise rays extending over a range of angles in a vertical dimension, where an individual ray may correspond to an angle or a range of angles. The beams can be received by the lidar system 105 after being reflected or scattered by a target area. The target area may include an elevated infrastructure 205, such as a bridge or overpass. The target area may also include a ground feature 210, such as a road. A distance from the lidar system 105 to the target area may be determined for a received beam, such as based on a round trip time of the received beam (e.g., using the equation d = c·t/2). A horizontal distance (e.g., a distance along a horizontal dimension) can then be determined using the distance determined from the round trip time of the received beam, an angle of the received beam, and a position of the lidar system 105 on the vehicle. The distance from the ground to the lidar system 105 can be measured during installation of the lidar system 105. A distance from the ground feature 210 to the overhead infrastructure 205 may then be determined using a pair of rays, such as a first ray 125a and a second ray 125b. The first ray 125a can interact with the overhead infrastructure 205. The second ray 125b may interact with the ground feature 210 and may correspond to the first ray 125a (for example, a horizontal distance of the second ray 125b may be the same as the horizontal distance of the first ray 125a).").
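Examiner's note: as an illustrative sketch of the geometry just quoted (the notation below is the examiner's, not Weinberg's): for a received ray with measured range d at elevation angle a, from a sensor mounted at height h above the road, the horizontal distance is x = d × cos(a) and the height of the struck point is z = h + d × sin(a). For a pair of rays with matched horizontal distance, the clearance between the overhead infrastructure (ray 125a) and the ground feature (ray 125b) is then z_a − z_b = d_a × sin(a_a) − d_b × sin(a_b), with the ground-directed angle a_b taken as negative.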
Regarding claim 10, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg further teaches wherein said two sensor elements located in spatial proximity to each other are positioned on said vehicle at a height of from 0.5 to 1.5 m above road level (See at least Fig. 2A and paragraph [0012], as quoted in the rejection of claim 8 above; note in particular that "[t]he distance from the ground to the lidar system 105 can be measured during installation of the lidar system 105").
Regarding claim 11, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg further teaches wherein said sensor elements arranged in the region of, or in proximity to, said left corner or said left front corner of said vehicle and in the region of, or in proximity to, said right corner or said right front corner of said vehicle are positioned and configured in a way that they each have a field of view which extends across an area in front of said vehicle and optionally also an area sideways next to said vehicle, thereby allowing for the surroundings of said front corners of said vehicle (10) to be observable (See at least Fig. 2A and paragraph [0012], as quoted in the rejection of claim 8 above).
Regarding claim 12, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg further teaches wherein said at least one optical sensor element has a detection distance such that obstacles at a distance of from 3 m to 10 m, or from 3 m to 7 m, in front of said vehicle can be reliably detected (See at least paragraphs [0010]-[0015]: "In one example, a resolution in the vertical dimension may be 1 degree or less. At a distance of about 100 meters, an angular deviation of 0.2 degrees can correspond to an offset of about 14 inches. A field of view can range from 1.5 to 3 degrees in a horizontal dimension. At a distance of about 100 meters, a field of view of 1 degree can correspond to 1.7 meters. Thus, a field of view can extend in a horizontal dimension in a range of 1.5 to 3 degrees across the width of a roadway or other overhead infrastructure.").
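Examiner's note: the figures in the quoted passage are consistent with small-angle geometry (the arithmetic below is the examiner's check, not Weinberg's): 100 m × tan(0.2°) ≈ 0.35 m ≈ 13.7 inches, matching the quoted "about 14 inches," and 100 m × tan(1°) ≈ 1.75 m, matching the quoted "1.7 meters."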
Regarding claim 13, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg further teaches wherein said at least one optical sensor element, said several or all of said optical sensor elements, has/have a receiving module in the form of a pixel matrix, in the form of an 8×4 pixel matrix or in the form of a 2×2 pixel matrix (See at least paragraph [0090]: "In another example, the pixel array (e.g., photodiodes) 121 have a two-dimensional pixel array.").
Regarding claim 14, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg further teaches wherein, in addition to said at least one optical sensor element, one or more additional sensor elements is/are provided which, in interaction with said evaluation and control electronics, is/are configured to warn about a collision of said driving vehicle with an obstacle when the vehicle is driving, to avoid the obstacle, or to stop said vehicle, wherein said one or more additional sensor elements is/are arranged at said first side or said front side of said vehicle and/or at sideways and/or at a rear side of said vehicle (See at least paragraph [0090]: "In one example, the pixel array (e.g., photodiodes) 121 have a one-dimensional pixel array and each pixel in the pixel array can capture a part of the light beam (e.g., a ray) corresponding to an angle or a range of angles.").
Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over DE Patent Publication No. DE 102018124538 A1, to Weinberg et al. (hereinafter Weinberg), in view of DE Patent Publication No. DE 102006010295 A1, to Lang et al. (hereinafter Lang), and further in view of U.S. Patent Publication No. 2018/0284275, to Lachapelle et al. (hereinafter Lachapelle).
Regarding claim 2, Weinberg as modified by Lang teaches the claimed limitations of claim 1. Weinberg as modified by Lang fails to explicitly disclose, however Lachapelle discloses, wherein said vehicle (10) is a driverless vehicle with a discrete drive which is controlled automatically without any human intervention (See at least paragraph [0105]).
Weinberg and Lachapelle are analogous art because they are in the same field of endeavor, obstacle avoidance systems. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Weinberg as modified by Lang to incorporate the teachings of Lachapelle, such that the self-driving vehicle of Lachapelle improves the system of Weinberg by controlling the vehicle to navigate a path while avoiding obstacles.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Wesam Almadhrhi, whose telephone number is (571) 270-3844. The examiner can normally be reached Monday through Friday, 7:30 AM to 5:00 PM Eastern Time, with alternating Fridays off.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anne Antonucci, can be reached at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WESAM NMN ALMADHRHI/
Examiner, Art Unit 3666

/ANNE MARIE ANTONUCCI/
Supervisory Patent Examiner, Art Unit 3666