DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Drawings
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description:
Reference Numerals “S820”, “S830”, “S840”, “S850”, and “S860” shown in Figure 8.
Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim 8 is rejected under 35 U.S.C. 102(a)(1) as being anticipated by Yamada et al. (JP 2017-207340).
Re claim 8: Yamada et al. disclose a method of detecting soiling of camera image recognition (i.e., “travel plan creation device and a center that create a travel plan for an autonomous vehicle”, Paragraph [0001]), the method comprising:
obtaining location information of the sun based on a current date and a current time (i.e., “The center 2 is, for example, a server device, and stores weather information such as information (hereinafter referred to as sun information) of the altitude and azimuth angle of the sun and the azimuth angle for each calendar and time in addition to each area”, Paragraph [0013]);
generating a straight line between the sun and a camera based on position information of the sun (i.e., “OA in FIGS. 7 and 8 indicates the imaging direction of the front camera 50, and SU indicates the direction in which the sun is located”, Paragraph [0057]) and location information of a vehicle (i.e., “The ADAS locator 30 sequentially measures the vehicle position of the host vehicle equipped with the ADAS locator 30 by combining the positioning signal received by the GNSS receiver 31 and the measurement result of the inertial sensor 32”, Paragraph [0021]); and
determining whether an error occurs in image recognition of the camera (i.e., “the calculated horizontal angle difference and vertical angle difference determine whether or not the surrounding environment using the captured image of the front camera 50 cannot be recognized by backlight from the sun (hereinafter referred to as a backlight threshold range)”, Paragraph [0056]) based on the straight line (i.e., FIG. 7, “SU”, Paragraph [0057]), a capturing direction of the camera (i.e., FIG. 7, “OA”, Paragraph [0057]), and a moving direction of the vehicle (See for example, “The imaging direction of the front camera 50 when passing through the link may be obtained from the direction of the optical axis of the front camera 50 with respect to the own vehicle and the link azimuth of the link scheduled to pass by the own vehicle”, Paragraph [0054]).
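For illustration only (this sketch is not part of the record of this application), the angle-difference comparison quoted above from Yamada et al. (Paragraphs [0056]-[0057]) can be expressed as follows; the threshold values and the degree-based convention are assumptions made for the sketch, not values taken from the reference:

```python
def within_backlight_range(sun_az_deg, sun_el_deg,
                           cam_az_deg, cam_el_deg,
                           h_threshold_deg=30.0, v_threshold_deg=20.0):
    """Return True when both the horizontal and vertical angle
    differences between the sun direction and the camera imaging
    direction fall inside a backlight threshold range, i.e. sun
    glare is predicted to prevent image recognition.
    The threshold values are hypothetical."""
    # Horizontal (azimuth) difference, wrapped into [-180, 180)
    h_diff = (sun_az_deg - cam_az_deg + 180.0) % 360.0 - 180.0
    # Vertical (elevation) difference
    v_diff = sun_el_deg - cam_el_deg
    return abs(h_diff) <= h_threshold_deg and abs(v_diff) <= v_threshold_deg
```

The wrap-around on the azimuth difference keeps a sun at 355° and a camera at 5° within range of one another, consistent with comparing compass headings.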
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 2, 4, 9, 10, and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Yamada et al. (JP 2017-207340) in view of Furukawa (U.S. Pub. No. 2013/0038734).
As to claim 1, Yamada et al. teaches a method of detecting soiling of camera image recognition (i.e., “travel plan creation device and a center that create a travel plan for an autonomous vehicle”, Paragraph [0001]), the method comprising:
obtaining an altitude and an azimuth of the sun based on a current date, a current time, and location information of a vehicle (i.e., “The center 2 is, for example, a server device, and stores weather information such as information (hereinafter referred to as sun information) of the altitude and azimuth angle of the sun and the azimuth angle for each calendar and time in addition to each area”, Paragraph [0013]; “the position information is transmitted from the vehicle-side unit 1, and the center 2 transmits the distribution information about the region corresponding to the position indicated by the position information on the basis of the position information”, Paragraph [0014]);
generating a straight line between the sun and a camera in a three-dimensional space (i.e., “The prediction unit 140 calculates, for the target link, a horizontal angle difference between the azimuth angle of the front camera 50 in the imaging direction and the azimuth angle of the sun, and a vertical angle difference between the elevation angle of the front camera 50 in the imaging direction and the elevation angle of the sun, respectively”, Paragraph [0056]) based on the altitude and the azimuth of the sun (i.e., “OA in FIGS. 7 and 8 indicates the imaging direction of the front camera 50, and SU indicates the direction in which the sun is located”, Paragraph [0057]); and
determining that an error occurs in image recognition of the camera (i.e., “the calculated horizontal angle difference and vertical angle difference determine whether or not the surrounding environment using the captured image of the front camera 50 cannot be recognized by backlight from the sun (hereinafter referred to as a backlight threshold range)”, Paragraph [0056]; and “When both the horizontal angle difference and the vertical angle difference calculated for the target link are within the backlight threshold range, the prediction unit 140 sets the link as a backlight target link”, Paragraph [0061]) in response to the straight line passing through the lens surface (i.e., “sensing range”, Paragraph [0058]; and Paragraph [0060]).
However, Yamada et al. does not explicitly disclose generating a lens surface of the camera in the three-dimensional space based on a moving direction of the vehicle and a lens surface angle of the camera.
Furukawa teaches generating a lens surface of the camera in the three-dimensional space (See for example, FIG. 5, “imaging plane”, Paragraph [0036]) based on a moving direction of the vehicle and a lens surface angle of the camera (i.e., “For the purpose of simplicity of explanation, in this embodiment, it is assumed that the optical axis 18 of the camera 111 runs in parallel with the extension lines defining the lateral extremities of the vehicle and the vehicle 15 runs in parallel with the surface of the road”, Paragraph [0038]).
Yamada et al. and Furukawa are analogous art because they are from the field of digital image processing in vehicle environments.
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Yamada et al. by incorporating the generation of a lens surface of the camera in the three-dimensional space (i.e., sensing range) based on a moving direction of the vehicle and a lens surface angle of the camera, as taught by Furukawa.
The suggestion/motivation for doing so would have been to accurately determine the range in which the camera can shoot.
Therefore, it would have been obvious to combine Furukawa with Yamada et al. to obtain the invention as specified in claim 1.
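For illustration only, the combined teaching addressed above, namely converting the sun's altitude and azimuth into a direction in three-dimensional space and testing whether the sun-to-camera line enters through the lens surface, can be sketched as follows; the coordinate convention (x = east, y = north, z = up) and the half-angle value are assumptions for the sketch, not limitations of the claims or the references:

```python
import math

def sun_direction(altitude_deg, azimuth_deg):
    """Unit vector pointing from the camera toward the sun
    (x = east, y = north, z = up); azimuth measured from north."""
    alt = math.radians(altitude_deg)
    az = math.radians(azimuth_deg)
    return (math.cos(alt) * math.sin(az),
            math.cos(alt) * math.cos(az),
            math.sin(alt))

def line_hits_lens(altitude_deg, azimuth_deg,
                   lens_normal, half_angle_deg=35.0):
    """True when the straight line from the sun to the camera
    enters through the lens surface, modeled here as the sun
    lying within a cone about the lens normal.
    half_angle_deg stands in for a lens surface angle and is
    hypothetical."""
    s = sun_direction(altitude_deg, azimuth_deg)
    n_len = math.sqrt(sum(c * c for c in lens_normal))
    cos_angle = sum(a * b for a, b in zip(s, lens_normal)) / n_len
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

Under this model, a camera facing north (lens normal (0, 1, 0)) registers a low northern sun as incident on the lens, while a southern sun does not.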
As to claim 2, Yamada et al. teaches wherein the obtaining of the altitude and the azimuth comprises obtaining the altitude and the azimuth of the sun corresponding to the current date, the current time, and the location information of the vehicle by using a sun path diagram (i.e., “stored sun information”, Paragraph [0013]).
As to claim 4, Yamada et al. teaches wherein the determining of the error occurrence comprises determining that the error occurs in the image recognition of the camera based on a position of the sun during a first time preset (i.e., “a predetermined time period”, Paragraph [0046]).
As to claim 9, Yamada et al. teaches an apparatus for detecting soiling of camera image recognition (i.e., “Travel Support System 4”, Paragraph [0013]), the apparatus comprising:
a processor; and a memory (See for example, “ECU 10”, Paragraph [0028]);
wherein the processor is configured to: obtain an altitude and an azimuth of the sun based on a current date, a current time, and location information of a vehicle (i.e., “The center 2 is, for example, a server device, and stores weather information such as information (hereinafter referred to as sun information) of the altitude and azimuth angle of the sun and the azimuth angle for each calendar and time in addition to each area”, Paragraph [0013]; “the position information is transmitted from the vehicle-side unit 1, and the center 2 transmits the distribution information about the region corresponding to the position indicated by the position information on the basis of the position information”, Paragraph [0014]);
generate a straight line between the sun and a camera in a three-dimensional space (i.e., “The prediction unit 140 calculates, for the target link, a horizontal angle difference between the azimuth angle of the front camera 50 in the imaging direction and the azimuth angle of the sun, and a vertical angle difference between the elevation angle of the front camera 50 in the imaging direction and the elevation angle of the sun, respectively”, Paragraph [0056]) based on the altitude and the azimuth of the sun (i.e., “OA in FIGS. 7 and 8 indicates the imaging direction of the front camera 50, and SU indicates the direction in which the sun is located”, Paragraph [0057]); and
determine that an error occurs in image recognition of the camera (i.e., “the calculated horizontal angle difference and vertical angle difference determine whether or not the surrounding environment using the captured image of the front camera 50 cannot be recognized by backlight from the sun (hereinafter referred to as a backlight threshold range)”, Paragraph [0056]; and “When both the horizontal angle difference and the vertical angle difference calculated for the target link are within the backlight threshold range, the prediction unit 140 sets the link as a backlight target link”, Paragraph [0061]) if the straight line passes through the lens surface (i.e., “sensing range”, Paragraph [0058]; and Paragraph [0060]).
However, Yamada et al. does not explicitly disclose wherein the processor is configured to generate a lens surface of the camera in the three-dimensional space based on a moving direction of the vehicle and a lens surface angle of the camera.
Furukawa teaches a processor that is configured to (i.e., “image processing unit 13”, Paragraph [0025]) generate a lens surface of the camera in the three-dimensional space (See for example, FIG. 5, “imaging plane”, Paragraph [0036]) based on a moving direction of the vehicle and a lens surface angle of the camera (i.e., “For the purpose of simplicity of explanation, in this embodiment, it is assumed that the optical axis 18 of the camera 111 runs in parallel with the extension lines defining the lateral extremities of the vehicle and the vehicle 15 runs in parallel with the surface of the road”, Paragraph [0038]).
Therefore, in view of Furukawa, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Yamada et al. by incorporating the generation of a lens surface of the camera in the three-dimensional space based on a moving direction of the vehicle and a lens surface angle of the camera, as taught by Furukawa, in order to accurately determine the range in which the camera can shoot.
As to claim 10, Yamada et al. teaches wherein the processor is configured to obtain the altitude and the azimuth of the sun corresponding to the current date, the current time, and the location information of the vehicle by using a sun path diagram (i.e., “stored sun information”, Paragraph [0013]).
As to claim 12, Yamada et al. teaches wherein the processor is configured to determine that the error occurs in the image recognition of the camera based on the position of the sun during a first time preset (i.e., “a predetermined time period”, Paragraph [0046]).
Allowable Subject Matter
Claims 3, 5-7, 11, and 13-15 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter: the closest prior art made of record fails to disclose, teach, and/or suggest, inter alia, the claimed invention of claims 1 and 9, further comprising that the generation of the lens surface of the camera comprises obtaining three points on the X, Y, and Z axes in the three-dimensional space based on the moving direction of the vehicle and the lens surface angle, and generating a surface created by the three points as the lens surface of the camera. The closest prior art likewise fails to disclose, teach, and/or suggest the claimed invention of claims 4 and 12, further comprising that the determination of the error occurrence comprises determining whether the error occurs in the image recognition of the camera by using an artificial intelligence learning model based on pixel segmentation, with image information captured by the camera during a second time set to be longer than the first time as input, if it is not determined that the error occurred in the image recognition of the camera based on the position of the sun during the first time.
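For illustration only, the three-point lens-surface construction and line-surface intersection discussed in the statement of reasons above can be sketched as follows; the particular point coordinates used in the usage note are hypothetical and carry no significance for the claims:

```python
def plane_from_points(p1, p2, p3):
    """Normal vector of the plane (lens surface) through three
    points, computed as the cross product of two edge vectors."""
    u = [b - a for a, b in zip(p1, p2)]
    v = [b - a for a, b in zip(p1, p3)]
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def line_crosses_plane(origin, direction, point_on_plane, normal):
    """Parameter t where the line origin + t * direction meets the
    plane; returns None when the line is parallel to the plane."""
    denom = sum(d * n for d, n in zip(direction, normal))
    if abs(denom) < 1e-12:
        return None
    num = sum((p - o) * n for o, p, n in zip(origin, point_on_plane, normal))
    return num / denom
```

For example, three points at unit distance along the X, Y, and Z axes define a plane with normal (1, 1, 1), and a line from the origin along (1, 1, 1) meets that plane at t = 1/3.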
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOSE M TORRES whose telephone number is (571)270-1356. The examiner can normally be reached Monday through Friday, 10:00 AM to 6:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Mehmood, can be reached at 571-272-2976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOSE M TORRES/Examiner, Art Unit 2664 03/05/2026