DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Examiner acknowledges the reply filed on 11/14/2025, in which claims 1, 4, 7, and 12-14 have been amended and claims 15-19 have been added. Currently, claims 1-19 are pending for examination in this application.
In response to this amendment:
The specification objections are withdrawn.
The claim objection is withdrawn.
The 112(b) claim rejection is withdrawn.
The prior art rejections are withdrawn.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 3, 6, 8-9, 11-14, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Price et al. (US 20180227566 A1), hereinafter Price, in view of Gruver et al. (US 20160282468 A1), hereinafter Gruver.
Regarding claim 1, Price teaches:
A distance measurement apparatus ([0042] “FIG. 1 illustrates a 3D imaging system 100 including a housing 102 that supports an illuminator 104 and an imaging sensor 106.”) comprising:
a light emitting apparatus configured to emit first light and second light, which is a light beam having a smaller spread than the first light ([0047] “the illuminator 104 may have a short throw FOI 108 and a long throw FOI 112. The short throw FOI 108 may provide a wider FOI with lower illumination concentration for a given light source intensity.” Note that “FOI” stands for “field of illumination”. The short throw FOI corresponds to the claimed “first light” and the long throw FOI corresponds to the claimed “second light”. See also [0056] and [0064] for quantified examples of illumination angles.), and
change an emission direction of the second light ([0120] “FIG. 20 illustrates an embodiment of a 3D imaging system 800 mounted in a gimbal 888. A gimbal may allow the 3D imaging system 800 to rotate a housing 802 about a transverse axis 890 and/or a vertical axis 892. Rotating the housing 802 moves the illuminator 804 and the imaging sensor 806 to change the orientation of the associated FOI and FOV”);
a light receiving apparatus ([0042] “FIG. 1 illustrates a 3D imaging system 100 including a housing 102 that supports an illuminator 104 and an imaging sensor 106.”); and
a processing circuit that controls the light emitting apparatus and the light receiving apparatus and processes a signal output from the light receiving apparatus ([0042] “The illuminator 104 and imaging sensor 106 are in data communication with a processor 101. In some embodiments, the illuminator 104 is a modulated illuminator and the processor 101 may be a time-of-flight measurement device.”),
wherein the processing circuit performs a process comprising:
generating first distance data based on a first signal obtained by detecting, by the light receiving apparatus, first reflected light which occurs by the first light ([0042] “The imaging sensor 106 has a coordinated shutter that operates in conjunction with the light modulation, allowing the time of flight depth measurement.” Note that this depth measurement is clearly applicable to both short and long throw illumination.);
generating second distance data based on a second signal obtained by detecting, by the light receiving apparatus, second reflected light which occurs by the second light ([0042] “The imaging sensor 106 has a coordinated shutter that operates in conjunction with the light modulation, allowing the time of flight depth measurement.” Note that this depth measurement is clearly applicable to both short and long throw illumination.);
when an object is present outside a first target area included in an area illuminated by the first light, causing the light emitting apparatus to track the object by the second light ([0054] “FIG. 2 illustrates the 3D imaging system 100 of FIG. 1 in a short-range application. The 3D imaging system 100 may operate in a short range application when identifying and/or tracking objects 122 that are relatively close to the illuminator 104 and/or imaging sensor 106. In some embodiments, the illuminator 104 and/or imaging sensor 106 are configured to identify or track a close object 122 that is a first distance 116 from the illuminator 104 and/or imaging sensor 106.”; [0062] “The 3D imaging system 100 may operate in a long range application when identifying and/or tracking objects 124 that are relatively far from the illuminator 104 and/or imaging sensor 106. In some embodiments, the illuminator 104 and/or imaging sensor 106 are configured to identify or track a distant object 124 that is a second distance 126 from the illuminator 104 and/or imaging sensor 106.” This describes a system wherein the tracking is performed by the long throw mode for “second distances” which are farther than the “first distance” associated with the short throw mode. The area associated with the “first distance” corresponds to the claimed “first target area”.) by 1) calculating a position of the object (It is understood that the concept of “tracking” in this context involves calculating a position of the object.) and […]; and
when the object enters the inside of the first target area from the outside of the first target area, causing the light emitting apparatus to stop the tracking by the second light ([0054] “FIG. 2 illustrates the 3D imaging system 100 of FIG. 1 in a short-range application. The 3D imaging system 100 may operate in a short range application when identifying and/or tracking objects 122 that are relatively close to the illuminator 104 and/or imaging sensor 106. In some embodiments, the illuminator 104 and/or imaging sensor 106 are configured to identify or track a close object 122 that is a first distance 116 from the illuminator 104 and/or imaging sensor 106.” By describing a distance range within which the system will operate in short throw mode, it would be understood that when an object moves from outside to inside this distance, the operating mode will swap to short throw mode, and thus stop tracking with long throw mode.).
Price does not explicitly teach:
when an object is present outside a first target area included in an area illuminated by the first light, causing the light emitting apparatus to track the object by the second light by 1) calculating a position of the object and 2) moving a beam spot of the light beam based on the position of the object so that the light beam continues to be directed to the object;
Gruver, in the same field of endeavor, teaches moving the field of view of the narrower long-range lidar to track objects ([0048] “In some examples, the vehicle 100 may be configured to adjust a viewing direction of the second LIDAR 122.”; [0119] “Thus, in this example, the second LIDAR may be suitable for scanning the environment for objects within a long range of distances”; [0120] “the method 500 may include adjusting the viewing direction of the second LIDAR to focus on the moving object and/or track the moving object”).
This teaching is readily incorporated into the system of Price, as the system of Price already contains a means for changing an orientation of the sensors (Price: [0120] “FIG. 20 illustrates an embodiment of a 3D imaging system 800 mounted in a gimbal 888. A gimbal may allow the 3D imaging system 800 to rotate a housing 802 about a transverse axis 890 and/or a vertical axis 892. Rotating the housing 802 moves the illuminator 804 and the imaging sensor 806 to change the orientation of the associated FOI and FOV”).
Thus, the combination of Price in view of Gruver teaches the remaining limitation:
when an object is present outside a first target area included in an area illuminated by the first light, causing the light emitting apparatus to track the object by the second light by 1) calculating a position of the object and 2) moving a beam spot of the light beam based on the position of the object so that the light beam continues to be directed to the object;
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have incorporated the tracking FOV adjustment of Gruver into the 3D imaging system of Price to extend the field of view over which tracking can be maintained.
Regarding claim 3, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, and further teaches:
wherein the processing circuit performs a process comprising:
causing the light emitting apparatus to scan, by the second light, a second target area located outside the first target area (Price: [0062] “the illuminator 104 and/or imaging sensor 106 are configured to identify or track a distant object 124 that is a second distance 126 from the illuminator 104 and/or imaging sensor 106.” The “second distance” corresponds to the claimed “second target area”. Note also that the claim phrase “scan… a second target area” may be broadly interpreted to include flash scans of an area. The examiner recommends additional language if a narrower interpretation is intended.);
detecting the object based on the second signal or the second distance data obtained by the scanning (Price: [0062] “the illuminator 104 and/or imaging sensor 106 are configured to identify or track a distant object 124 that is a second distance 126 from the illuminator 104 and/or imaging sensor 106.”); and
in response to the detection of the object, causing the light emitting apparatus to start tracking the object by the second light (Price: [0062] “the illuminator 104 and/or imaging sensor 106 are configured to identify or track a distant object 124 that is a second distance 126 from the illuminator 104 and/or imaging sensor 106.”; Gruver: [0120] “the method 500 may include adjusting the viewing direction of the second LIDAR to focus on the moving object and/or track the moving object”).
Regarding claim 6, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, and further teaches:
wherein the processing circuit determines whether the object is present outside the first target area or inside the first target area, based on at least one selected from a group consisting of a strength of the first signal, a strength of the second signal, the first distance data, the second distance data, and external data input from an external sensor (As the “first target area” of Price in view of Gruver corresponds to a distance, the “first distance”, it would be understood that the normal distance measurement results of the system could be used to determine whether an object is within the first distance or not.).
Regarding claim 8, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, and further teaches:
wherein the light receiving apparatus is an image sensor including a plurality of pixels two-dimensionally arranged (Price: FIG. 4; [0072] “The photoreceptor array 132 has a height 134 and a width 136 of the active area. In conventional long throw systems, a cropped subsection of the photoreceptor array 132 having a reduced height 138 and reduced width 140 is used to image the smaller angular space that is illuminated by the illuminator. For example, in some conventional systems, the FOV is cropped from 512×512 down to 320×288, a >60% reduction in overall resolution.” At least a two-dimensional array of 512x512 pixels is contemplated.).
Regarding claim 9, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, and further teaches:
wherein the processing circuit changes the first target area based on at least one selected from a group consisting of a strength of the first signal, a strength of the second signal, the first distance data, the second distance data, and external data input from an external sensor (Price contemplates the degrading effects of ambient light on the system, and incorporates an ambient light sensor to allow for adjustment based on this effect: [0106] “In yet other embodiments, the trigger is received from one or more sensors in data communication with the 3D imaging system. For example, the trigger may be received from an ambient light sensor, instructing the 3D imaging system to narrow the FOI and/or FOV to increase illumination concentration and angular resolution in an effort to compensate for high amounts of ambient light.” While the discussion in Price is largely centered around narrowing the FOI when ambient light is high to maintain a sensitivity at a given distance, this clearly shows an understanding that for a given FOI setting, the distance of sensitivity decreases with increasing ambient light – for example, that the sensing distance of the short throw mode has decreased.).
Regarding claim 11, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, but the modification outlined thus far does not explicitly teach:
wherein the processing circuit integrates the first distance data and the second distance data and output a result.
Gruver further teaches a sensor fusion algorithm for combining data from different scan zones. This teaching would reasonably apply to the different sensing zones of Price:
wherein the processing circuit integrates the first distance data and the second distance data and output a result (Gruver: [0169] “The sensor fusion algorithm 944 may be an algorithm (or a computer program product storing an algorithm) configured to accept data from the sensor system 904 as an input. The data may include, for example, data representing information sensed at the sensors of the sensor system 904.”).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the 3D imaging system of Price in view of Gruver with the further data fusion teachings of Gruver to capture a full picture of all the sensed zones.
Regarding claim 12, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, and further teaches:
wherein the first light is flash light ([0073] “FIG. 5-1 and FIG. 5-2 schematically illustrate an embodiment of the adjustable optics of a 3D imaging system. FIG. 5-1 illustrates the adjustable optics of an illuminator 204. The illuminator 204 includes a light source 242 that produces an emitted light 244. The emitted light 244 is collimated by a collimator 246 into a collimated light 248, which, in turn, passes to a beam broadener 250, such as a diffuser. The diffused light 252 may then be directed by a movable illuminator lens 254. The movable illuminator lens 254 may be moved axially (i.e., relative to the light source 242) to change the angle of the FOI of an output light from the short throw FOI 208 to the long throw FOI 212, or any FOI therebetween.” The emitted light is diffused light, corresponding to the claimed flash light.).
Regarding claim 13, the claim matches the scope of claim 1 and is rejected for the same reasons.
Regarding claim 14, the claim matches the scope of claim 4 and is rejected for the same reasons.
Regarding claim 19, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, and further teaches:
wherein the spread is a beam spread angle of a single optical pulse of the light beam (FIG. 3; [0064] long throw FOI angle 129; [0056] short throw FOI angle 119; Note also the emission of diffuse light [0073].).
Claims 2, 4, and 15-18 are rejected under 35 U.S.C. 103 as being unpatentable over Price in view of Gruver and further in view of Official Notice.
Regarding claim 2, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, and further teaches:
Tracking of an object inside the first target area (Price: [0054] “In some embodiments, the illuminator 104 and/or imaging sensor 106 are configured to identify or track a close object 122 that is a first distance 116 from the illuminator 104 and/or imaging sensor 106.” By identifying or tracking the close object 122 at the first distance 116, it is understood that the system of Price tracks an object inside the first target area.).
Price does not explicitly teach that the tracking is achieved by:
wherein the processing circuit stores a change in a position of the object in addition to the first distance data after the object enters the inside of the first target area
The examiner takes Official Notice of the fact that taking a difference between the position of subsequent frames is a well-known method for tracking the movement of an object.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have used the position difference between frames for the tracking of Price in view of Gruver as one known and predictable choice for tracking.
Regarding claim 4, Price teaches:
A distance measurement apparatus ([0042] “FIG. 1 illustrates a 3D imaging system 100 including a housing 102 that supports an illuminator 104 and an imaging sensor 106.”) comprising:
a light emitting apparatus configured to emit first light and second light, which is a light beam having a smaller spread than the first light ([0047] “the illuminator 104 may have a short throw FOI 108 and a long throw FOI 112. The short throw FOI 108 may provide a wider FOI with lower illumination concentration for a given light source intensity.” Note that “FOI” stands for “field of illumination”. The short throw FOI corresponds to the claimed “first light” and the long throw FOI corresponds to the claimed “second light”. See also [0056] and [0064] for quantified examples of illumination angles.), and
change an emission direction of the second light ([0120] “FIG. 20 illustrates an embodiment of a 3D imaging system 800 mounted in a gimbal 888. A gimbal may allow the 3D imaging system 800 to rotate a housing 802 about a transverse axis 890 and/or a vertical axis 892. Rotating the housing 802 moves the illuminator 804 and the imaging sensor 806 to change the orientation of the associated FOI and FOV”);
a light receiving apparatus ([0042] “FIG. 1 illustrates a 3D imaging system 100 including a housing 102 that supports an illuminator 104 and an imaging sensor 106.”); and
a processing circuit that controls the light emitting apparatus and the light receiving apparatus and processes a signal output from the light receiving apparatus ([0042] “The illuminator 104 and imaging sensor 106 are in data communication with a processor 101. In some embodiments, the illuminator 104 is a modulated illuminator and the processor 101 may be a time-of-flight measurement device.”),
wherein the processing circuit performs a process comprising:
generating first distance data based on a first signal obtained by detecting, by the light receiving apparatus, first reflected light which occurs by the first light ([0042] “The imaging sensor 106 has a coordinated shutter that operates in conjunction with the light modulation, allowing the time of flight depth measurement.” Note that this depth measurement is clearly applicable to both short and long throw illumination.);
generating second distance data based on a second signal obtained by detecting, by the light receiving apparatus, second reflected light which occurs by the second light ([0042] “The imaging sensor 106 has a coordinated shutter that operates in conjunction with the light modulation, allowing the time of flight depth measurement.” Note that this depth measurement is clearly applicable to both short and long throw illumination.);
when an object is present in a first target area included in an area illuminated by the first light, [tracking the object] ([0054] “FIG. 2 illustrates the 3D imaging system 100 of FIG. 1 in a short-range application. The 3D imaging system 100 may operate in a short range application when identifying and/or tracking objects 122 that are relatively close to the illuminator 104 and/or imaging sensor 106. In some embodiments, the illuminator 104 and/or imaging sensor 106 are configured to identify or track a close object 122 that is a first distance 116 from the illuminator 104 and/or imaging sensor 106.”); and
when the object moves from the inside of the first target area to the outside of the first target area, causing the light emitting apparatus to start tracking the object by the second light ([0062] “The 3D imaging system 100 may operate in a long range application when identifying and/or tracking objects 124 that are relatively far from the illuminator 104 and/or imaging sensor 106. In some embodiments, the illuminator 104 and/or imaging sensor 106 are configured to identify or track a distant object 124 that is a second distance 126 from the illuminator 104 and/or imaging sensor 106.” This describes a system wherein the tracking is performed by the long throw mode for “second distances” which are farther than the “first distance” associated with the short throw mode. The area associated with the “first distance” corresponds to the claimed “first target area”.) by 1) calculating a position of the object (It is understood that the concept of “tracking” in this context involves calculating a position of the object.) and […].
Price does not explicitly teach:
when an object is present in a first target area included in an area illuminated by the first light, storing a change in a position of the object in addition to the first distance data
when the object moves from the inside of the first target area to the outside of the first target area, causing the light emitting apparatus to start tracking the object by the second light by 1) calculating a position of the object and 2) moving a beam spot of the light beam based on the position of the object so that the light beam continues to be directed to the object;
Gruver, in the same field of endeavor, teaches moving the field of view of the narrower long-range lidar to track objects ([0048] “In some examples, the vehicle 100 may be configured to adjust a viewing direction of the second LIDAR 122.”; [0119] “Thus, in this example, the second LIDAR may be suitable for scanning the environment for objects within a long range of distances”; [0120] “the method 500 may include adjusting the viewing direction of the second LIDAR to focus on the moving object and/or track the moving object”).
This teaching is readily incorporated into the system of Price, as the system of Price already contains a means for changing an orientation of the sensors (Price: [0120] “FIG. 20 illustrates an embodiment of a 3D imaging system 800 mounted in a gimbal 888. A gimbal may allow the 3D imaging system 800 to rotate a housing 802 about a transverse axis 890 and/or a vertical axis 892. Rotating the housing 802 moves the illuminator 804 and the imaging sensor 806 to change the orientation of the associated FOI and FOV”).
Thus, the combination of Price in view of Gruver teaches the limitation:
when the object moves from the inside of the first target area to the outside of the first target area, causing the light emitting apparatus to start tracking the object by the second light by 1) calculating a position of the object and 2) moving a beam spot of the light beam based on the position of the object so that the light beam continues to be directed to the object;
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have incorporated the tracking FOV adjustment of Gruver into the 3D imaging system of Price to extend the field of view over which tracking can be maintained.
The combination still does not explicitly teach:
when an object is present in a first target area included in an area illuminated by the first light, storing a change in a position of the object in addition to the first distance data
The examiner takes Official Notice of the fact that taking a difference between the position of subsequent frames is a well-known method for tracking the movement of an object. Therefore, the tracking of Price, as noted above, ([0054]) in combination with this Official Notice, would teach the limitation:
when an object is present in a first target area included in an area illuminated by the first light, storing a change in a position of the object in addition to the first distance data
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have used the position difference between frames for the tracking of Price in view of Gruver as one known and predictable choice for tracking.
Regarding claim 15, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, but does not explicitly teach:
wherein the calculating the position of the object is performed based on a difference between frames generated by the distance measurement apparatus.
The examiner takes Official Notice of the fact that taking a difference between the position of subsequent frames is a well-known method for tracking the movement of an object.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have used the position difference between frames for the tracking of Price in view of Gruver as one known and predictable choice for tracking.
Regarding claim 16, Price in view of Gruver teaches the distance measurement apparatus of claim 4, as described above, but does not explicitly teach:
wherein the calculating the position of the object is performed based on a difference between frames generated by the distance measurement apparatus.
The examiner takes Official Notice of the fact that taking a difference between the position of subsequent frames is a well-known method for tracking the movement of an object.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have used the position difference between frames for the tracking of Price in view of Gruver as one known and predictable choice for tracking.
Regarding claim 17, the claim matches the scope of claim 15 and is rejected for the same reasons.
Regarding claim 18, the claim matches the scope of claim 16 and is rejected for the same reasons.
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Price in view of Gruver and further in view of Ogura et al. (US 20220007482 A1), hereinafter Ogura.
Regarding claim 5, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, but does not teach:
wherein the processing circuit executes a distance measurement by the first light when the object is present outside the first target area.
Ogura, in the same field of endeavor, teaches periodically emitting light in full flash mode ([0106] “In the second example, the light projection in the normal mode is performed for the detection region (ROI) of the object Sb in a manner similar to the manner in the first example, and a flash is radiated at intervals of a fixed time for the non-detection region of the object Sb.”).
In combination with the 3D imaging system of Price in view of Gruver, the flash radiation of Ogura corresponds to the wider short throw illumination. Thus, Price in view of Gruver and further in view of Ogura teaches a system where the short throw mode is used at fixed intervals, thus teaching:
wherein the processing circuit executes a distance measurement by the first light when the object is present outside the first target area.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the 3D imaging system of Price in view of Gruver with the repeated flash of Ogura to ensure no other objects are missed while the system is focused on tracking the first object.
Claims 7 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Price in view of Gruver and further in view of Klotz et al. (US 20100007476 A1), hereinafter Klotz.
Regarding claim 7, Price in view of Gruver teaches the distance measurement apparatus of claim 1, as described above, but does not explicitly teach:
wherein the processing circuit predicts a movement of the object from the outside to the inside of the first target area and a movement from the inside to the outside of the first distance data, based on at least one selected from a group consisting of a strength of the first signal, a strength of the second signal, the first distance data, the second distance data, and external data input from an external sensor.
Klotz, in the same field of endeavor, teaches:
wherein the processing circuit predicts a movement of the object from the outside to the inside of the first target area and a movement from the inside to the outside of the first distance data, based on at least one selected from a group consisting of a strength of the first signal, a strength of the second signal, the first distance data, the second distance data, and external data input from an external sensor ([0032] “A third option for the attentiveness control of the sensor system is the preconditioning of at least one of the sensors. At least one of the sensors of the sensor system transmits object data to the information platform which the information platform converts into data for another sensor having another sensing range, which represents in particular the location of the expected penetration of the object into the sensing range of the sensor.”).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the system of Price in view of Gruver with the object prediction of Klotz to improve sensor optimization (Klotz: [0032] “The sensor will then be able to adjust itself optimally to the new object to be detected on the basis of this information, for example, in regard to tracking initialization or angle assignment.”).
Regarding claim 10, Price in view of Gruver teaches the distance measurement apparatus of claim 8, as described above, and further teaches:
wherein the processing circuit performs a process comprising: detecting a position of the object based on at least one selected from a group consisting of a strength of the first signal, a strength of the second signal, the first distance data, the second distance data, and external data input from an external sensor ([0054] “FIG. 2 illustrates the 3D imaging system 100 of FIG. 1 in a short-range application. The 3D imaging system 100 may operate in a short range application when identifying and/or tracking objects 122 that are relatively close to the illuminator 104”; [0062] “The 3D imaging system 100 may operate in a long range application when identifying and/or tracking objects 124 that are relatively far from the illuminator 104”);
The combination does not explicitly teach:
calculating a confidence level of a position of the object defined by a variance of the position of the object; and
determining the first target area based on the confidence level of the position of the object.
Klotz, in the same field of endeavor, teaches:
a lidar tracking system which predicts where an object will enter into the sensing range of the next sensor ([0032] “A third option for the attentiveness control of the sensor system is the preconditioning of at least one of the sensors. At least one of the sensors of the sensor system transmits object data to the information platform which the information platform converts into data for another sensor having another sensing range, which represents in particular the location of the expected penetration of the object into the sensing range of the sensor.”);
calculating a confidence level of a position of the object defined by a variance of the position of the object ([0029] “As a supplement to the predefinition of a sensing range or alternatively thereto, data relating to at least one detection area to be particularly observed are transmitted by the information platform to the sensor(s). These data are derived from the data of a detected object of another sensor, such as a radar sensor, and include, for example, the coordinates for the center point (or center of gravity or a singular point) of this detection area and the velocity of the change of this point together with particular variance values.”); and
determining the first target area based on the confidence level of the position of the object.
It is well-known in the art that the sensing range of a lidar sensor is based on an acceptable confidence level. Thus, Price in view of Gruver and further in view of Klotz teaches a system which changes the target area based on sensing range, thereby implicitly teaching a change based on confidence level.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the system of Price in view of Gruver with the object prediction of Klotz to improve sensor optimization (Klotz: [0032] “The sensor will then be able to adjust itself optimally to the new object to be detected on the basis of this information, for example, in regard to tracking initialization or angle assignment.”).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Beraldin et al. (J. Angelo Beraldin, Francois Blais, Marc Rioux, Luc Cournoyer, Denis G. Laurin, Steve G. MacLean, "Eye-safe digital 3-D sensing for space applications," Opt. Eng. 39(1) (1 January 2000)) teaches a scanning lidar system which has a broader “searching field of view” and a narrower “tracking field of view”.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SEAN C. GRANT whose telephone number is (571)272-0402. The examiner can normally be reached Monday - Friday, 9:30 am - 6:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Yuqing Xiao can be reached at (571)270-3603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SEAN C. GRANT/Examiner, Art Unit 3645
/YUQING XIAO/Supervisory Patent Examiner, Art Unit 3645