DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Specification
The disclosure is objected to because of the following informalities:
In paragraph [0026], line 3, "illustrated in FIGS. 2A and 3B" should likely read "illustrated in FIGS. 2A and 2B".
Appropriate correction is required.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-3 and 5-7 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wang (US 20210316669 A1).
Regarding claim 1, Wang teaches an obstacle sensor inspection device that performs an inspection of an obstacle sensor detecting an obstacle present in the vicinity of a mobile body, the obstacle sensor inspection device comprising:
a running control unit configured to perform control such that the mobile body is caused to run along a running path ([0035] The steering system 134 includes suitable componentry that is configured to control the direction of movement of the autonomous vehicle 102 during navigation.; [0096] The hallway calibration environment 300, which may also be referred to as a tunnel calibration environment, includes a thoroughfare 305 through which a vehicle 102 drives, the thoroughfare 305 flanked on either side by targets detectable by the sensors 180 of the vehicle 102.);
an inspection processing unit configured to perform an inspection process for the obstacle sensor using a still object for inspection that is designated in advance in a sensor inspection section designated in advance in a state in which the mobile body is running along the running path ([0045] As described above, the remote computing system 150 is configured to send/receive a signal from the autonomous vehicle 140 regarding reporting data for training and evaluating machine learning algorithms, requesting assistance from remote computing system 150 or a human operator via the remote computing system 150; [0053] The sensor calibration target 200A illustrated in FIG. 2A is a planar board made from a substrate 205, with a pattern 210A printed, stamped, engraved, imprinted, or otherwise marked thereon. The pattern 210A of FIG. 2A is a checkerboard pattern.); and
a detection area setting unit configured to set a detection area of the obstacle sensor to a second area larger than a first area that is used at the time of normal running when the inspection process for the obstacle sensor is performed using the inspection processing unit ([0183] Depending on the sensors 180 on the vehicle 102 and the data captured by the sensors 180, the sensors 180 may require one or more full 360 degree rotations of the vehicle 102 on the platform 420, or may require less than one full 360 degree rotation of the vehicle 102 on the platform 420. In one embodiment, sufficient data for calibration of a sensor may mean data corresponding to targets covering at least a subset of the complete field of view of a particular sensor (collectively over a number of captures), with the subset reaching and/or exceeding a threshold percentage (e.g., 30%, 35%, 40%, 45%, 50%, 55%, 60%, 65%, 70%, 75%, 80%, 85%, 90%, 95%, 99%, 100%).),
wherein the inspection processing unit determines that the obstacle sensor is normal when the still object for inspection is detected by the obstacle sensor and determines that the obstacle sensor is abnormal when the still object for inspection is not detected by the obstacle sensor ([Fig. 10]; [0149] At step 1015, the calibration systems in the vehicle read the calibration scene and: (a) detect targets in each sensor frame, (b) associate detected targets, (c) generate residuals, (d) solve calibration optimization problem, (e) validate calibration optimization solution, and (f) output calibration results).
Regarding claim 2, Wang teaches the obstacle sensor inspection device according to claim 1, further comprising
an inspection preparation processing unit configured to perform a preparation process for causing the mobile body not to come into contact with the obstacle when the mobile body runs in the sensor inspection section, in a sensor inspection preparation section positioned on a side in front of the sensor inspection section in a traveling direction of the mobile body before the inspection process for the obstacle sensor is performed by the inspection processing unit ([0039] The internal computing system 110 can also include a constraint service 114 to facilitate safe propulsion of the autonomous vehicle 102. The constraint service 116 includes instructions for activating a constraint based on a rule-based restriction upon operation of the autonomous vehicle 102. For example, the constraint may be a restriction upon navigation that is activated in accordance with protocols configured to avoid occupying the same space as other objects, abide by traffic laws, circumvent avoidance areas, etc).
Regarding claim 3, Wang teaches the obstacle sensor inspection device according to claim 2,
wherein the inspection preparation processing unit sets the detection area of the obstacle sensor to a third area larger than the first area and determines whether or not the obstacle is detected by the obstacle sensor in the state as the preparation process ([0183] Depending on the sensors 180 on the vehicle 102 and the data captured by the sensors 180, the sensors 180 may require one or more full 360 degree rotations of the vehicle 102 on the platform 420, or may require less than one full 360 degree rotation of the vehicle 102 on the platform 420. In one embodiment, sufficient data for calibration of a sensor may mean data corresponding to targets covering at least a subset of the complete field of view of a particular sensor (collectively over a number of captures), with the subset reaching and/or exceeding a threshold percentage (e.g., 30%, 35%, 40%, 45%, 50%, 55%, 60%, 65%, 70%, 75%, 80%, 85%, 90%, 95%, 99%, 100%).), and
wherein, in a case in which it is determined by the inspection preparation processing unit that the obstacle is not detected by the obstacle sensor, the inspection processing unit determines that the obstacle sensor is normal when the still object for inspection is detected by the obstacle sensor and determines that the obstacle sensor is abnormal when the still object for inspection is not detected by the obstacle sensor ([Fig. 10]; [0149] At step 1015, the calibration systems in the vehicle read the calibration scene and: (a) detect targets in each sensor frame, (b) associate detected targets, (c) generate residuals, (d) solve calibration optimization problem, (e) validate calibration optimization solution, and (f) output calibration results).
Regarding claim 5, Wang teaches the obstacle sensor inspection device according to claim 1,
wherein the second area is set to be wider than the first area in at least one of a traveling direction and a width direction of the mobile body ([0183] Depending on the sensors 180 on the vehicle 102 and the data captured by the sensors 180, the sensors 180 may require one or more full 360 degree rotations of the vehicle 102 on the platform 420, or may require less than one full 360 degree rotation of the vehicle 102 on the platform 420. In one embodiment, sufficient data for calibration of a sensor may mean data corresponding to targets covering at least a subset of the complete field of view of a particular sensor (collectively over a number of captures), with the subset reaching and/or exceeding a threshold percentage (e.g., 30%, 35%, 40%, 45%, 50%, 55%, 60%, 65%, 70%, 75%, 80%, 85%, 90%, 95%, 99%, 100%).).
Regarding claim 6, Wang teaches the obstacle sensor inspection device according to claim 1,
wherein the running path includes a curved part ([0100] While the thoroughfare 305 of the hallway calibration environment 300 of FIG. 3 is a straight path, in some cases it may be a curved path, and by extension the left target channel 310 and right target channel 315 may be curved to follow the path of the thoroughfare 305.),
wherein the sensor inspection section is designated to be at a position on a side in front of the curved part in a traveling direction of the mobile body ([0096] The hallway calibration environment 300, which may also be referred to as a tunnel calibration environment, includes a thoroughfare 305 through which a vehicle 102 drives, the thoroughfare 305 flanked on either side by targets detectable by the sensors 180 of the vehicle 102.), and
wherein the still object for inspection is disposed at a position entering the inside of the second area when the mobile body runs in the sensor inspection section ([0099] The sensor targets illustrated in FIG. 3 are illustrated such that some are positioned closer to the thoroughfare 305 while some are positioned farther from the thoroughfare 305. Additionally, while some targets in FIG. 3 are facing a direction perpendicular to the thoroughfare 305, others are angled up or down with respect to the direction perpendicular to the thoroughfare 305.).
Regarding claim 7, Wang teaches an obstacle sensor inspection method for performing an inspection of an obstacle sensor detecting an obstacle present in the vicinity of a mobile body, the obstacle sensor inspection method comprising:
designating a sensor inspection section in which an inspection of the obstacle sensor is performed, a sensor inspection preparation section positioned on a side in front of the sensor inspection section in a traveling direction of the mobile body, and a still object for inspection used for an inspection of the obstacle sensor in the sensor inspection section, in the middle of a running path in which the mobile body runs ([0096] The hallway calibration environment 300, which may also be referred to as a tunnel calibration environment, includes a thoroughfare 305 through which a vehicle 102 drives, the thoroughfare 305 flanked on either side by targets detectable by the sensors 180 of the vehicle 102.);
performing control such that the mobile body is caused to run along the running path ([0035] The steering system 134 includes suitable componentry that is configured to control the direction of movement of the autonomous vehicle 102 during navigation.; [0096] The hallway calibration environment 300, which may also be referred to as a tunnel calibration environment, includes a thoroughfare 305 through which a vehicle 102 drives, the thoroughfare 305 flanked on either side by targets detectable by the sensors 180 of the vehicle 102.);
performing an inspection process for the obstacle sensor using the still object for inspection in a state in which the mobile body is running along the running path in the sensor inspection section ([0045] As described above, the remote computing system 150 is configured to send/receive a signal from the autonomous vehicle 140 regarding reporting data for training and evaluating machine learning algorithms, requesting assistance from remote computing system 150 or a human operator via the remote computing system 150; [0053] The sensor calibration target 200A illustrated in FIG. 2A is a planar board made from a substrate 205, with a pattern 210A printed, stamped, engraved, imprinted, or otherwise marked thereon. The pattern 210A of FIG. 2A is a checkerboard pattern.); and
setting a detection area of the obstacle sensor to a second area larger than a first area that is used at the time of normal running when the inspection process for the obstacle sensor is performed ([0183] Depending on the sensors 180 on the vehicle 102 and the data captured by the sensors 180, the sensors 180 may require one or more full 360 degree rotations of the vehicle 102 on the platform 420, or may require less than one full 360 degree rotation of the vehicle 102 on the platform 420. In one embodiment, sufficient data for calibration of a sensor may mean data corresponding to targets covering at least a subset of the complete field of view of a particular sensor (collectively over a number of captures), with the subset reaching and/or exceeding a threshold percentage (e.g., 30%, 35%, 40%, 45%, 50%, 55%, 60%, 65%, 70%, 75%, 80%, 85%, 90%, 95%, 99%, 100%).),
wherein, in the performing of an inspection process for the obstacle sensor, it is determined that the obstacle sensor is normal when the still object for inspection is detected by the obstacle sensor, and it is determined that the obstacle sensor is abnormal when the still object for inspection is not detected by the obstacle sensor ([Fig. 10]; [0149] At step 1015, the calibration systems in the vehicle read the calibration scene and: (a) detect targets in each sensor frame, (b) associate detected targets, (c) generate residuals, (d) solve calibration optimization problem, (e) validate calibration optimization solution, and (f) output calibration results).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Wang (US 20210316669 A1), as applied to claim 2 above, and further in view of Matsuzaki (US 20180210443 A1).
Regarding claim 4, Wang teaches the obstacle sensor inspection device according to claim 2, but fails to teach the device wherein the inspection preparation processing unit performs control of the mobile body to be decelerated to a speed for not coming into contact with the obstacle at the time of the mobile body running in the sensor inspection section as the preparation process.
However, Matsuzaki teaches the device wherein the inspection preparation processing unit performs control of the mobile body to be decelerated to a speed for not coming into contact with the obstacle at the time of the mobile body running in the sensor inspection section as the preparation process.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Wang to incorporate a speed reduction system for obstacle avoidance similar to that of Matsuzaki, with a reasonable expectation of success. This would have the predictable result of ensuring that the controlled vehicle of Wang avoids obstacles in the safest way.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT WILLIAM VASQUEZ JR, whose telephone number is (571)272-3745. The examiner can normally be reached Monday through Thursday and Flex Friday, 8:00-5:00 PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ROBERT HODGE can be reached at (571)272-2097. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ROBERT W VASQUEZ/Examiner, Art Unit 3645
/JAMES R HULKA/Primary Examiner, Art Unit 3645