DETAILED ACTION
Notice of Pre-AIA or AIA Status
This is in response to application no. 19/059,867 filed on 02/21/2025. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-21 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 1 of U.S. Patent No. US 11557127 B2 in view of Liang et al. (US 10992860 B2).
Regarding claim 1, claim 1 of pat. '127 teaches all of the limitations of the current claim 1 except "a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle… an image sensor disposed at a second position adjacent to the front wheel well." However, Liang teaches a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle… an image sensor disposed at a second position adjacent to the front wheel well (Liang discloses "[a]s shown in FIG. 1, the vehicle 100 may, for example, include at least one of a ranging and imaging system 112 (e.g., LIDAR, etc.), an imaging sensor 116A, 116F (e.g., camera, IR, etc.)…and/or other object-detection sensors 116D, 116E…" col. 4, lines 8-25, 51-55. Liang further discloses "[w]hile shown associated with one or more areas of a vehicle 100, it should be appreciated that any of the sensors and systems 116A-K, 112 illustrated in FIGS. 1 and 2 may be disposed in, on, and/or about the vehicle 100 in any position, area, and/or zone of the vehicle 100." Col. 4, lines 21-25).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified claim 1 of pat. '127 to include "a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle… an image sensor disposed at a second position adjacent to the front wheel well," since such a modification is a predictable design choice involving a rearrangement of parts of the prior art elements. Note that it has been held that rearranging parts of a prior art structure involves only routine skill in the art. In re Japikse, 181 F.2d 1019, 86 USPQ 70 (CCPA 1950).
Independent claims 15 and 21 are rejected for the same reasons set forth above with respect to claim 1.
Dependent claims 2-14 and 16-20 are rejected based on their dependency from rejected claims 1 and 15.
Table 1 below shows the comparison between the current claims and the claim of pat. ‘127.
Table 1
Current claims
Patent no. US 11557127 B2 claim
1. An external sensing system comprising: a lidar sensor disposed at a first position adjacent to
1. An external sensing system for a vehicle configured to operate in an autonomous driving mode, the external sensing system comprising: a lidar sensor having a first field of view configured to detect objects in at least a region of an external environment around the vehicle and within a threshold distance of no more than 3 meters from the vehicle; an image sensor disposed adjacent to the lidar sensor and arranged along the vehicle to have a second field of view of the region of the external environment within the threshold distance of the vehicle, the second field of view having a first region and a second region, the first region overlapping with the first field of view and the second region being oriented to cover an immediate vicinity of the vehicle within the threshold distance not within the first field of view…
15. An external sensing system comprising: a first lidar sensor disposed at a first position
1. An external sensing system for a vehicle configured to operate in an autonomous driving mode, the external sensing system comprising: a lidar sensor having a first field of view configured to detect objects in at least a region of an external environment around the vehicle and within a threshold distance of no more than 3 meters from the vehicle; an image sensor disposed adjacent to the lidar sensor and arranged along the vehicle to have a second field of view of the region of the external environment within the threshold distance of the vehicle, the second field of view having a first region and a second region, the first region overlapping with the first field of view and the second region being oriented to cover an immediate vicinity of the vehicle within the threshold distance not within the first field of view…
21. An external sensing system comprising: a lidar sensor disposed on
1. An external sensing system for a vehicle configured to operate in an autonomous driving mode, the external sensing system comprising: a lidar sensor having a first field of view configured to detect objects in at least a region of an external environment around the vehicle and within a threshold distance of no more than 3 meters from the vehicle; an image sensor disposed adjacent to the lidar sensor and arranged along the vehicle to have a second field of view of the region of the external environment within the threshold distance of the vehicle, the second field of view having a first region and a second region, the first region overlapping with the first field of view and the second region being oriented to cover an immediate vicinity of the vehicle within the threshold distance not within the first field of view…
Claims 1-21 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 1 of U.S. Patent No. US 11887378 B2 in view of Liang et al. (US 10992860 B2).
Regarding claim 1, claim 1 of pat. '378 teaches all of the limitations of the current claim 1 except "a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle… an image sensor disposed at a second position adjacent to the front wheel well." However, Liang teaches a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle… an image sensor disposed at a second position adjacent to the front wheel well (Liang discloses "[a]s shown in FIG. 1, the vehicle 100 may, for example, include at least one of a ranging and imaging system 112 (e.g., LIDAR, etc.), an imaging sensor 116A, 116F (e.g., camera, IR, etc.)…and/or other object-detection sensors 116D, 116E…" col. 4, lines 8-25, 51-55. Liang further discloses "[w]hile shown associated with one or more areas of a vehicle 100, it should be appreciated that any of the sensors and systems 116A-K, 112 illustrated in FIGS. 1 and 2 may be disposed in, on, and/or about the vehicle 100 in any position, area, and/or zone of the vehicle 100." Col. 4, lines 21-25).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified claim 1 of pat. '378 to include "a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle… an image sensor disposed at a second position adjacent to the front wheel well," since such a modification is a predictable design choice involving a rearrangement of parts of the prior art elements. Note that it has been held that rearranging parts of a prior art structure involves only routine skill in the art. In re Japikse, 181 F.2d 1019, 86 USPQ 70 (CCPA 1950).
Independent claims 15 and 21 are rejected for the same reasons set forth above with respect to claim 1.
Dependent claims 2-14 and 16-20 are rejected based on their dependency from rejected claims 1 and 15.
Table 2 below shows the comparison between the current claims and the claim of pat. ‘378.
Table 2
Current claims
Patent no. US 11887378 B2 claim
1. An external sensing system comprising: a lidar sensor disposed at a first position adjacent to
1. An external sensing system for a vehicle configured to operate in an autonomous driving mode, the external sensing system comprising: a lidar sensor arranged along an external sensing assembly, the lidar sensor having a lidar field of view configured to detect objects in a region of an external environment around the vehicle and within a threshold distance of the vehicle, the lidar field of view including an occlusion region within an immediate vicinity of the vehicle, the lidar field of view extending to at least the threshold distance of the vehicle; an image sensor positioned relative to the lidar sensor along the external sensing assembly to have an image field of view that is within the region of the external environment and within the threshold distance of the vehicle, the image field of view at least partly overlapping with the lidar field of view and encompassing at least a portion of the occlusion region of the lidar field of view…
15. An external sensing system comprising: a first lidar sensor disposed at a first position
1. An external sensing system for a vehicle configured to operate in an autonomous driving mode, the external sensing system comprising: a lidar sensor arranged along an external sensing assembly, the lidar sensor having a lidar field of view configured to detect objects in a region of an external environment around the vehicle and within a threshold distance of the vehicle, the lidar field of view including an occlusion region within an immediate vicinity of the vehicle, the lidar field of view extending to at least the threshold distance of the vehicle; an image sensor positioned relative to the lidar sensor along the external sensing assembly to have an image field of view that is within the region of the external environment and within the threshold distance of the vehicle, the image field of view at least partly overlapping with the lidar field of view and encompassing at least a portion of the occlusion region of the lidar field of view…
21. An external sensing system comprising: a lidar sensor disposed on
1. An external sensing system for a vehicle configured to operate in an autonomous driving mode, the external sensing system comprising: a lidar sensor arranged along an external sensing assembly, the lidar sensor having a lidar field of view configured to detect objects in a region of an external environment around the vehicle and within a threshold distance of the vehicle, the lidar field of view including an occlusion region within an immediate vicinity of the vehicle, the lidar field of view extending to at least the threshold distance of the vehicle; an image sensor positioned relative to the lidar sensor along the external sensing assembly to have an image field of view that is within the region of the external environment and within the threshold distance of the vehicle, the image field of view at least partly overlapping with the lidar field of view and encompassing at least a portion of the occlusion region of the lidar field of view…
Claims 1-21 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 8 of U.S. Patent No. US 12260650 B2 in view of Liang et al. (US 10992860 B2).
Regarding claim 1, claim 8 of pat. '650 teaches all of the limitations of the current claim 1 except "a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle… an image sensor disposed at a second position adjacent to the front wheel well." However, Liang teaches a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle… an image sensor disposed at a second position adjacent to the front wheel well (Liang discloses "[a]s shown in FIG. 1, the vehicle 100 may, for example, include at least one of a ranging and imaging system 112 (e.g., LIDAR, etc.), an imaging sensor 116A, 116F (e.g., camera, IR, etc.)…and/or other object-detection sensors 116D, 116E…" col. 4, lines 8-25, 51-55. Liang further discloses "[w]hile shown associated with one or more areas of a vehicle 100, it should be appreciated that any of the sensors and systems 116A-K, 112 illustrated in FIGS. 1 and 2 may be disposed in, on, and/or about the vehicle 100 in any position, area, and/or zone of the vehicle 100." Col. 4, lines 21-25).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified claim 8 of pat. '650 to include "a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle… an image sensor disposed at a second position adjacent to the front wheel well," since such a modification is a predictable design choice involving a rearrangement of parts of the prior art elements. Note that it has been held that rearranging parts of a prior art structure involves only routine skill in the art. In re Japikse, 181 F.2d 1019, 86 USPQ 70 (CCPA 1950).
Independent claims 15 and 21 are rejected for the same reasons set forth above with respect to claim 1.
Dependent claims 2-14 and 16-20 are rejected based on their dependency from rejected claims 1 and 15.
Table 3 below shows the comparison between the current claims and the claim of pat. ‘650.
Table 3
Current claims
Patent no. US 12260650 B2 claim
1. An external sensing system comprising: a lidar sensor disposed at a first position adjacent to
8. An external sensing system comprising: a lidar sensor arranged along an external sensing assembly, the lidar sensor having a lidar field of view of a region of an external environment around a vehicle, the vehicle being configured to operate in an autonomous driving mode, the lidar field of view including an occlusion region within an immediate vicinity of the vehicle; an image sensor positioned relative to the lidar sensor along the external sensing assembly to have an image field of view that is within the region of the external environment, the image field of view at least partly overlapping with the lidar field of view and encompassing at least a portion of the occlusion region of the lidar field of view; and a control system operatively coupled to the image sensor and the lidar sensor…
15. An external sensing system comprising: a first lidar sensor disposed at a first position
8. An external sensing system comprising: a lidar sensor arranged along an external sensing assembly, the lidar sensor having a lidar field of view of a region of an external environment around a vehicle, the vehicle being configured to operate in an autonomous driving mode, the lidar field of view including an occlusion region within an immediate vicinity of the vehicle; an image sensor positioned relative to the lidar sensor along the external sensing assembly to have an image field of view that is within the region of the external environment, the image field of view at least partly overlapping with the lidar field of view and encompassing at least a portion of the occlusion region of the lidar field of view; and a control system operatively coupled to the image sensor and the lidar sensor…
21. An external sensing system comprising: a lidar sensor disposed on
8. An external sensing system comprising: a lidar sensor arranged along an external sensing assembly, the lidar sensor having a lidar field of view of a region of an external environment around a vehicle, the vehicle being configured to operate in an autonomous driving mode, the lidar field of view including an occlusion region within an immediate vicinity of the vehicle; an image sensor positioned relative to the lidar sensor along the external sensing assembly to have an image field of view that is within the region of the external environment, the image field of view at least partly overlapping with the lidar field of view and encompassing at least a portion of the occlusion region of the lidar field of view; and a control system operatively coupled to the image sensor and the lidar sensor…
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-10 and 13-20 are rejected under 35 U.S.C. 103 as being unpatentable over Liang et al. (US 10992860 B2) in view of Frank et al. (US 20180032822 A1).
Regarding claim 1, Liang teaches the claim limitations as follows:
An external sensing system comprising: a lidar sensor disposed at a first position (Liang discloses "[a]s shown in FIG. 1, the vehicle 100 may, for example, include at least one of a ranging and imaging system 112 (e.g., LIDAR, etc.), an imaging sensor 116A, 116F (e.g., camera, IR, etc.)…and/or other object-detection sensors 116D, 116E…The ranging and imaging system 112 may be configured to detect visual information in an environment surrounding the vehicle 100." Col. 4, lines 8-25, 51-67; see FIG. 2: a viewing zone 208). Note that Liang discloses "[w]hile shown associated with one or more areas of a vehicle 100, it should be appreciated that any of the sensors and systems 116A-K, 112 illustrated in FIGS. 1 and 2 may be disposed in, on, and/or about the vehicle 100 in any position, area, and/or zone of the vehicle 100." Col. 4, lines 21-25; and an image sensor disposed at a second position adjacent to the front wheel well (Liang discloses "[a]s shown in FIG. 1, the vehicle 100 may, for example, include at least one of a ranging and imaging system 112 (e.g., LIDAR, etc.), an imaging sensor 116A, 116F (e.g., camera, IR, etc.), …and/or other object-detection sensors 116D, 116E…" Col. 4, lines 8-25, 51-55; e.g., a first camera 116A and a second camera 116F aimed in a forward traveling direction of the vehicle 100, col. 5, lines 5-26). Liang further discloses that the side-facing sensors 116D, 116E may be cameras. Col. 6, lines 13-15.
Liang in FIGS. 1 and 2 illustrates a lidar sensor 112 mounted on the roof of the vehicle, but does not explicitly illustrate a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle. Furthermore, as stated above, Liang teaches an image sensor disposed at a second position adjacent to the front wheel well of the vehicle (col. 6, lines 13-15: the side-facing sensors 116D, 116E may be cameras), but does not explicitly teach an image sensor disposed at a second position adjacent to the front wheel well relative to the lidar sensor.
However, as stated above, Liang discloses that any of the sensors and systems 116A-K, 112 illustrated in FIGS. 1 and 2 may be disposed in, on, and/or about the vehicle 100 in any position, area, and/or zone of the vehicle 100. Col. 4, lines 21-25.
Furthermore, Frank discloses a vehicle 100 with multiple LIDAR sensors 105 and at least one camera 110 incorporated into the side view mirror housing 115, where the camera 110 field of view overlaps the lidar field of view. FIGS. 1-3; ¶¶ 0012-0014. Thus, Frank teaches a co-located lidar and camera arrangement having overlapping fields of view, which has the advantage of reducing the aerodynamic resistance caused by roof-mounted lidar sensors or of serving aesthetic purposes (Frank: ¶¶ 0008, 0017).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Liang's vehicle surrounding monitoring system by incorporating the co-located lidar and camera arrangement as taught by Frank to arrive at the claimed invention "a lidar sensor disposed at a first position adjacent to a front wheel well of a vehicle …an image sensor disposed at a second position adjacent to the front wheel well relative to the lidar sensor" by rearranging parts of the prior art structure, in order to reduce the aerodynamic resistance caused by roof-mounted lidar sensors or for aesthetic purposes (Frank: ¶¶ 0008, 0017). Note that it has been held that rearranging parts of a prior art structure involves only routine skill in the art. In re Japikse, 181 F.2d 1019, 86 USPQ 70 (CCPA 1950).
Regarding claims 2-10 and 13-14, the claims recite different arrangements of different sensors on the body of the vehicle and are rejected under the same rationale set forth above with respect to claim 1.
Regarding claim 15, Liang discloses an external sensing system comprising: a first lidar sensor disposed at a first position on a (Liang discloses "[a]s shown in FIG. 1, the vehicle 100 may, for example, include at least one of a ranging and imaging system 112 (e.g., LIDAR, etc.), an imaging sensor 116A, 116F (e.g., camera, IR, etc.)…and/or other object-detection sensors 116D, 116E…The ranging and imaging system 112 may be configured to detect visual information in an environment surrounding the vehicle 100." Col. 4, lines 8-25, 51-67; see FIG. 2: a viewing zone 208). Note that Liang discloses "[w]hile shown associated with one or more areas of a vehicle 100, it should be appreciated that any of the sensors and systems 116A-K, 112 illustrated in FIGS. 1 and 2 may be disposed in, on, and/or about the vehicle 100 in any position, area, and/or zone of the vehicle 100." Col. 4, lines 21-25; and a first image sensor disposed at a second position on the quarterpanel and configured to provide an image field of view at least partly overlapping [a] lidar field of view in the region of the immediate vicinity of the vehicle (Liang discloses "[a]s shown in FIG. 1, the vehicle 100 may, for example, include at least one of a ranging and imaging system 112 (e.g., LIDAR, etc.), an imaging sensor 116A, 116F (e.g., camera, IR, etc.), …and/or other object-detection sensors 116D, 116E…Sensor data and information may be collected by one or more sensors or systems 116A-K, 112 of the vehicle 100 monitoring the vehicle sensing environment 200." Col. 4, lines 8-25, 51-55; col. 5, lines 5-7). Liang further discloses that the side-facing sensors 116D, 116E may be cameras. Col. 6, lines 13-15.
Liang in FIGS. 1 and 2 illustrates a lidar sensor 112, mounted on the roof of the vehicle, having a viewing zone 208 overlapping with detection zones 216A-D corresponding to sensors and systems 116A-K. Liang further discloses that the side-facing sensors 116D, 116E may be cameras. Col. 6, lines 13-15. Liang, however, does not explicitly illustrate a first lidar sensor disposed at a first position on a quarterpanel of a vehicle. Furthermore, as stated above, Liang teaches an image sensor disposed at a second position adjacent to the front wheel well of the vehicle (col. 6, lines 13-15: the side-facing sensors 116D, 116E may be cameras), but does not explicitly disclose a first image sensor disposed at a second position on the quarterpanel and configured to provide an image field of view at least partly overlapping the lidar field of view of a lidar sensor disposed at a first position on a quarterpanel of the vehicle.
Frank discloses a vehicle 100 with multiple LIDAR sensors 105 and at least one camera 110 incorporated into the side view mirror housing 115, where the camera 110 field of view overlaps the lidar field of view. FIGS. 1-3; ¶¶ 0012-0014. Thus, Frank teaches a co-located lidar and camera arrangement having overlapping fields of view, which has the advantage of reducing the aerodynamic resistance caused by roof-mounted lidar sensors or of serving aesthetic purposes (Frank: ¶¶ 0008, 0017).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Liang's vehicle surrounding monitoring system by incorporating the co-located lidar and camera arrangement taught by Frank to arrive at the claimed invention "a first lidar sensor disposed at a first position on a quarterpanel of a vehicle...a first image sensor disposed at a second position on the quarterpanel and configured to provide an image field of view at least partly overlapping the lidar field of view in the region of the immediate vicinity of the vehicle" by rearranging parts of the prior art structure, in order to reduce the aerodynamic resistance caused by roof-mounted lidar sensors or for aesthetic purposes (Frank: ¶¶ 0008, 0017). Note that it has been held that rearranging parts of a prior art structure involves only routine skill in the art. In re Japikse, 181 F.2d 1019, 86 USPQ 70 (CCPA 1950).
Regarding claims 16-17, the claims recite different arrangements of different sensors on the body of the vehicle and are rejected under the same rationale set forth above with respect to claim 15.
Regarding claim 18, Liang teaches the external sensing system of claim 15, further comprising at least one of a second image sensor, a radar sensor or a second lidar sensor disposed on a roof of the vehicle (col. 4, lines 8-14: an imaging sensor 116A, 116F (e.g., camera, IR, etc.)). Liang further discloses “[w]hile shown associated with one or more areas of a vehicle 100, it should be appreciated that any of the sensors and systems 116A-K, 112 illustrated in FIGS. 1 and 2 may be disposed in, on, and/or about the vehicle 100 in any position, area, and/or zone of the vehicle 100.” Col. 4, lines 21-25.
Regarding claim 19, Liang teaches the external sensing system of claim 15, further comprising at least one of another image sensor, a radar sensor or another lidar sensor disposed on a roof of the vehicle (col. 4, lines 8-14: an imaging sensor 116A, 116F (e.g., camera, IR, etc.)). Liang further discloses “[w]hile shown associated with one or more areas of a vehicle 100, it should be appreciated that any of the sensors and systems 116A-K, 112 illustrated in FIGS. 1 and 2 may be disposed in, on, and/or about the vehicle 100 in any position, area, and/or zone of the vehicle 100.” Col. 4, lines 21-25.
Regarding claim 20, Liang teaches a vehicle comprising the external sensing system of claim 15 (See the rejection of claim 15).
Claims 11 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Liang et al. (US 10992860 B2) in view of Frank et al. (US 20180032822 A1) as applied to claim 1 above, and further in view of Lu et al. (US 20160306041 A1).
Regarding claim 11, Liang teaches the external sensing system of claim 1, further comprising a control system operatively coupled to the image sensor and the lidar sensor, the control system including one or more processors (FIG. 3A: processors 340, vehicle control system 348) configured to: detect, based on at least one of lidar data from the lidar sensor or captured imagery from the image sensor, an object in the immediate vicinity of the vehicle (col. 9, lines 41-43; col. 10, lines 26-32: "The camera sensors 332 may include one or more components configured to detect image information associated with an environment of the vehicle 100…one or more of the sensors 306-337 described above may include one or more processors configured to process and/or interpret signals detected by the one or more sensors 306-337").
Liang in view of Frank does not explicitly disclose "classify the object based on the captured imagery."
However, Lu teaches classifying the object based on the captured imagery (¶¶ 0014, 0016: "The vision processing algorithm processes image(s) and identifies object(s) of interest in the field of view of the imager or camera…The vision module acquires, processes and identifies objects that are of interest to the applications, such as vehicles, pedestrians, roadside buildings, traffic lights, traffic signs, tail lights and head lights, and/or the like. The vision system may also identify lane markers, curbs and/or road dividers to help to determine the lane that the host vehicle is in and where the other vehicles or objects are relative to that lane").
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Liang in view of Frank by incorporating the teaching of Lu as noted above, in order to improve object identification and prioritization (Lu: ¶0014).
Regarding claim 12, Liang in view of Frank and Lu teaches the external sensing system of claim 11. Lu further teaches wherein the control system is further configured to determine, based on the classification of the object, whether to cause the vehicle to perform an action in an autonomous driving mode (¶0016: vision module acquires, processes and identifies objects that are of interest to the applications, such as vehicles, pedestrians, roadside buildings, traffic lights, traffic signs, tail lights and head lights, and/or the like. ¶0025: For example, in an ACC application, the vehicle in front of the host vehicle and in the same lane has the highest priority score and will be stared by the Lidar for the most amount time and the system will update the distance data to the vehicle controllers more frequently, in order to maintain the safe distance and avoid a collision with the other vehicle). The motivation statement set forth above with respect to claim 11 applies here.
Claim 21 is rejected under 35 U.S.C. 103 as being unpatentable over Liang et al. (US 10992860 B2).
Regarding claim 21, Liang discloses an external sensing system comprising: a lidar sensor disposed (col. 4, lines 8-25: As shown in FIG. 1, the vehicle 100 may, for example, include at least one of a ranging and imaging system 112 (e.g., LIDAR, etc.)…) and providing a lidar field of view of a region of an immediate vicinity of the vehicle (FIG. 2, a viewing zone 208); and an image sensor disposed on (col. 5, lines 14-26: a first sensor 116A and a second sensor 116F may correspond to a first camera 116A and a second camera 116F aimed in a forward traveling direction of the vehicle 100….Similar image data may be collected by rear view cameras (e.g., sensors 116G, 116H) aimed in a rearward traveling direction vehicle 100).
Liang does not explicitly illustrate a lidar sensor disposed on a bumper, and an image sensor disposed on the bumper.
However, Liang discloses that any of the sensors and systems 116A-K, 112 illustrated in FIGS. 1 and 2 may be disposed in, on, and/or about the vehicle 100 in any position, area, and/or zone of the vehicle 100. Col. 4, lines 21-25. Liang further discloses that the side-facing sensors 116D, 116E may be cameras. Col. 6, lines 13-15.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Liang’s vehicle monitoring system by rearranging the prior art elements to arrive at the claimed invention of “a lidar sensor disposed on a bumper…an image sensor disposed on the bumper,” since such a modification is a predictable design choice involving a rearrangement of parts of the prior art elements. Note that it has been held that rearranging parts of a prior art structure involves only routine skill in the art. In re Japikse, 181 F.2d 1019, 86 USPQ 70 (CCPA 1950).
The following prior art, made of record and not relied upon, is considered pertinent to applicant's disclosure:
Roeger et al. (US 10916035 B1), “Camera Calibration Using Dense Depth Maps” (Title).
“Introducing Waymo’s suite of custom-built, self-driving hardware,” The Waymo Team, February 15, 2017.
“Waymo Keynote at NAIAS AutoMobili-D 2017,” YouTube, February 16, 2017.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NATHNAEL AYNALEM whose telephone number is (571)270-1482. The examiner can normally be reached M-F 9AM-5:30 PM ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, SATH PERUNGAVOOR can be reached at 571-272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NATHNAEL AYNALEM/Primary Examiner, Art Unit 2488