DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy of priority Application No. JP2021-118650, filed on 7/19/2021, has been received.
Information Disclosure Statement
The information disclosure statement submitted on 12/15/2023 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the examiner.
Response to Arguments
Applicant's arguments filed 7/14/2025 have been fully considered, but they are directed to newly amended claim language.
Regarding the rejections under 35 U.S.C. § 103:
Applicant contends that the cited prior art fails to disclose newly amended limitations of independent claim 1, and newly added claim 10, including “an image processor that obtains a captured image captured by a camera included in the vehicle, and stores the captured image in the storage, the captured image including a road surface in rear, in front or at a side of the vehicle, and a notification pattern projected on the road surface by a projector included in the vehicle, wherein the image processor performs image processing of removing the notification pattern from the captured image to acquire an image including the road surface in rear, in front or at a side of the vehicle without the notification pattern.”
See the rejections below for how the cited art, in light of new and existing references, reads on the newly amended language, and for the examiner’s interpretation of the cited art in view of the presented claim set.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-5, 8, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Suzuki et al. (US 20180260182 A1) (hereinafter Suzuki) in view of Takii et al. (US 20210162927 A1) (hereinafter Takii).
Regarding claim 1, Suzuki discloses:
A control device for a vehicle, comprising:
a storage; and [See Suzuki, ¶ 0036, 0065-0066, 0067-0069 discloses image display signals supplied from an on-vehicle CPU to first and second image display units.]
an image processor that obtains a captured image captured by a camera included in the vehicle, and stores the captured image in the storage, the captured image including a road surface in rear, in front or at a side of the vehicle, and a notification pattern projected on the road surface by a projector included in the vehicle, [See Suzuki, ¶ 0023-0024 discloses that a “second image display unit” 20 displays the road-surface image G2 on the road surface R. In this implementation, the second image display unit 20 may include, for example, headlights. The headlights may include light-distribution-variable multi-light-source headlights provided in a front part of the vehicle 100. In other words, the road-surface image G2 may be displayed with the use of the headlights. The second image display unit 20 is not limited to the headlights as mentioned above, insofar as the second image display unit 20 is able to display the road-surface image G2 on the road surface R. Non-limiting examples may include a digital mirror device and a projector; See Suzuki, ¶ 0036, 0065-0066, 0067-0069 discloses a detection unit 40 and an on-vehicle camera to detect presence or absence of transmissive image and road-surface images by detecting image display signals supplied from an on-vehicle CPU to first and second image display units. In one preferred but non-limiting example, the detection unit 40 may detect both the transmissive image G1 and the road-surface image G2. 
As a configuration for the detection unit 40 to detect the images, for example, a configuration may be adopted in which imaging of the front windshield 102 and the road surface R is carried out with the use of an on-vehicle camera, to directly detect presence or absence of the images, i.e., the transmissive image G1 and the road-surface image G2; See Suzuki, ¶ 0030-0033 discloses a transmissive image projected onto a road surface R, for example, including travel route guidance information, regulation information at a destination of guidance, and current traveling speed information. The travel route guidance information may be denoted by arrow symbols. The regulation information and the current traveling speed information may be denoted by characters. The road-surface image G2 illustrated in FIG. 3 may include, for example, warning information to the pedestrian W. The warning information may be denoted by characters and symbols; See Suzuki, ¶ 0037-0039, 0041-0044 discloses a derivation unit may derive image information on the transmissive image G1 and image information on the road-surface image G2. A determination unit may determine presence or absence of superimposition of the transmissive image G1 on the road-surface image G2, on the basis of one or both of the image information on the transmissive image G1 and the image information on the road-surface image G2. Further, a display stop unit may stop displaying any one of the transmissive image G1 and the road-surface image G2, on the basis of the detection result of the detection unit 40.]
Suzuki does not appear to explicitly disclose:
wherein the image processor performs image processing of removing the notification pattern from the captured image to acquire an image including the road surface in rear, in front or at a side of the vehicle without the notification pattern.
However, Takii discloses:
wherein the image processor performs image processing of removing the notification pattern from the captured image to acquire an image including the road surface in rear, in front or at a side of the vehicle without the notification pattern. [See Takii, ¶ 0092-0094, 0120-0121, 0122-0123 discloses that a heads-up display in a vehicle displays an image of a light pattern projected onto a road surface that has been captured by a camera. It is particularly noted that any one of the captured image and the CG image may be selected to be displayed on the heads-up display. As discussed, the driver of the vehicle may glean an enhanced spatial understanding/awareness of the displayed/projected shape on the road-surface actually being observed by pedestrians or other drivers by looking at a captured image of a light pattern being displayed by the HUD. Hence, image processing is performed to create a computer-generated image indicating a virtual object of the object together with a captured image of the surroundings of a vehicle. In other words, the view being presented to the driver includes the captured image of the surroundings of the vehicle, but presents a CG-image of a notification pattern rather than an image of the projected pattern as captured by the camera. Since the CG-image is not the image of the captured image of the projected pattern, this is what constitutes “removing” the notification pattern per the claim language.]
It would have been obvious to the person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention disclosed by Suzuki to add the teachings of Takii in order to better grasp a positional relationship between an object and a projected light pattern with respect to a line of sight of the object and the light pattern.
Regarding claim 2, Suzuki in view of Takii discloses all the limitations of claim 1.
Takii discloses:
wherein the image processor stores the notification pattern projected prior to obtaining the captured image from which the notification pattern is to be removed as an initial setting in the storage. [See Takii, ¶ 0152 discloses determining that the shape of the light pattern of the captured image captured by the camera is different from the shape of the reference light pattern (via distortion or interference from an object being within the path of the projected light). A reference light pattern may be stored in a storage – a display control unit may acquire the reference light pattern via the vehicle control unit.]
The reasons to combine the cited prior art are applicable to those presented for previously rejected claim 1.
Regarding claim 3, Suzuki in view of Takii discloses all the limitations of claim 1.
Suzuki discloses:
wherein the image processor obtains a difference between a first image captured in a state of the notification pattern being not projected [See Suzuki, Fig. 4 illustrates an image captured by an on-vehicle camera which clearly does not contain a projected notification pattern G2 (a state of the notification pattern “being not projected”).] and a second image captured in a state of the notification pattern being projected, [See Suzuki, Fig. 5 illustrates an image captured by an on-vehicle camera which clearly does contain a projected notification pattern G2 (a state of the notification pattern “being projected”).] and stores, as the notification pattern, an image of the difference in the storage. [See Suzuki, ¶ 0056 discloses a comparison unit 54 may perform the comparison of the image information (step S5). Here, the image information on the transmissive image G1 and the image information on the road-surface image G2 may be compared with each other.]
Regarding claim 4, Suzuki in view of Takii discloses all the limitations of claim 1.
Suzuki discloses:
wherein the image processor compares a first image captured by imaging a region in a vicinity of the subject vehicle at a predetermined position in a state of the notification pattern being projected with a second image captured by imaging a region in a vicinity of the subject vehicle having moved from the predetermined position, and stores an image matching between the first image and the second image as the notification pattern in the storage. [See Suzuki, ¶ 0056 discloses a comparison unit 54 may perform the comparison of the image information (step S5). Here, the image information on the transmissive image G1 and the image information on the road-surface image G2 may be compared with each other.]
Takii discloses:
wherein the image processor compares a first image captured by the camera when the vehicle is at a predetermined position in a state where the notification pattern is projected with a second image captured by the camera when the vehicle moves from the predetermined position, and stores a portion of the first image and a portion of the second image matching between the first image and the second image as the notification pattern in the storage. [See Takii, ¶ 0150-0158 discloses determining that the shape of the light pattern of the captured image captured by the camera is different from the shape of the reference light pattern (via distortion or interference from an object being within the path of the projected light). A reference light pattern may be stored in a storage – a display control unit may acquire the reference light pattern via the vehicle control unit.]
The reasons to combine the cited prior art are applicable to those presented for previously rejected claim 1.
Regarding claim 5, Suzuki in view of Takii discloses all the limitations of claim 2.
Takii discloses:
the captured image further includes a reference line drawn on the road surface. [See Takii, ¶ 0150 discloses capturing images of a projected light pattern emitted onto a road surface, wherein it is specifically noted that the light pattern projected on the road surface is not limited to the examples illustrated; See Takii, Figs. 8A, 11A-11D]
The reasons to combine the cited prior art are applicable to those presented for previously rejected claim 1.
Regarding claim 8, Suzuki in view of Takii discloses all the limitations of claim 5.
Suzuki discloses:
wherein the monitor controller further displays, on the monitor, a guide line for guiding a driver to move the subject vehicle. [See Suzuki, ¶ 0028-0031, Figs. 3-5 disclose/illustrate displaying captured images including a projected notification pattern in a region in a vicinity of a vehicle. In particular, Fig. 4 illustrates an arrow for guiding a driver as to where to move the subject vehicle.]
Takii discloses:
further comprising: a monitor controller that displays, on a monitor included in the vehicle, the image from which the notification pattern is removed, acquired by the image processor, [See Takii, ¶ 0121 discloses an image selection switch for selecting a view provided by a heads-up display (HUD) to be any one of a captured image and a CG image.]
The reasons to combine the cited prior art are applicable to those presented for previously rejected claim 1.
Regarding claim 9, Suzuki in view of Takii discloses all the limitations of claim 1.
Takii discloses:
further comprising a monitor controller that displays, on a monitor included in the vehicle, the image from which the notification pattern is removed, acquired by the image processor. [See Takii, ¶ 0092-0094, 0120-0121, 0122-0123 discloses that a heads-up display in a vehicle displays an image of a light pattern projected onto a road surface that has been captured by a camera. It is particularly noted that any one of the captured image and the CG image may be selected to be displayed on the heads-up display. As discussed, the driver of the vehicle may glean an enhanced spatial understanding/awareness of the displayed/projected shape on the road-surface actually being observed by pedestrians or other drivers by looking at a captured image of a light pattern being displayed by the HUD. Hence, image processing is performed to create a computer-generated image indicating a virtual object of the object together with a captured image of the surroundings of a vehicle. In other words, the view being presented to the driver includes the captured image of the surroundings of the vehicle, but presents a CG-image of a notification pattern rather than an image of the projected pattern as captured by the camera. Since the CG-image is not the image of the captured image of the projected pattern, this is what constitutes “removing” the notification pattern per the claim language.]
The reasons to combine the cited prior art are applicable to those presented for previously rejected claim 1.
Claims 6 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Suzuki in view of Takii in view of Koike (US 20030146827 A1) (hereinafter Koike).
Regarding claim 6, Suzuki in view of Takii discloses all the limitations of claim 5.
Suzuki does not appear to disclose:
wherein the projection device projects the notification pattern on a region in rear of the subject vehicle, the imaging device images a parking lot in the region in rear of the subject vehicle, and the reference line is a parking line indicating a parking position in the parking lot.
However, Koike discloses:
wherein the projection device projects the notification pattern on a region in rear of the subject vehicle, [See Koike, ¶ 0057-0058, 0061-0062 discloses rearward projection of a visible light pattern onto a road surface; See Koike, Figs. 5A, 5B, 7 illustrate said projection.]
the imaging device images a parking lot in the region in rear of the subject vehicle, and [See Koike, Fig. 7 illustrates a parking maneuver to the rear of the subject vehicle.]
the reference line is a parking line indicating a parking position in the parking lot. [See Koike, Fig. 7 illustrates a parking maneuver to the rear of the subject vehicle including lines of a parking space; For additional context, see Koike, ¶ 0072 discloses while the beam radiators 24 radiate the visible light beams, the beam ECU 32 extracts the visible light patterns formed by the subject vehicle 22, the visible light patterns formed by the other vehicle, and the invisible light patterns formed by the infrastructure, by processing the images from the imaging devices 44.]
It would have been obvious to the person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention disclosed by Suzuki in view of Takii to add the teachings of Koike in order to enable improved parking awareness by projecting visible light patterns onto a road surface surrounding a parking space.
Regarding claim 7, Suzuki in view of Takii in view of Koike discloses all the limitations of claim 6.
Koike discloses:
wherein the notification pattern is a linear pattern, and the parking line is a linear line. [See Koike, Fig. 7 illustrates a parking space having linear lines; See Koike, Figs. 5A, 5B illustrate projecting notification patterns in a linear pattern.]
The reasons to combine the cited prior art are applicable to those presented for previously rejected claim 6.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Suzuki in view of Yamada (WO2022190166A1; translation provided herewith) (hereinafter Yamada).
Regarding claim 10, Suzuki discloses:
A control device for a vehicle, comprising:
a storage; and [See Suzuki, ¶ 0036, 0065-0066, 0067-0069 discloses image display signals supplied from an on-vehicle CPU to first and second image display units.]
an image processor that obtains the captured image captured by the camera, and stores the captured image in the storage to acquire an image in which no notification pattern is included. [See Suzuki, ¶ 0023-0024 discloses that a “second image display unit” 20 displays the road-surface image G2 on the road surface R. In this implementation, the second image display unit 20 may include, for example, headlights. The headlights may include light-distribution-variable multi-light-source headlights provided in a front part of the vehicle 100. In other words, the road-surface image G2 may be displayed with the use of the headlights. The second image display unit 20 is not limited to the headlights as mentioned above, insofar as the second image display unit 20 is able to display the road-surface image G2 on the road surface R. Non-limiting examples may include a digital mirror device and a projector; See Suzuki, ¶ 0036, 0065-0066, 0067-0069 discloses a detection unit 40 and an on-vehicle camera to detect presence or absence of transmissive image and road-surface images by detecting image display signals supplied from an on-vehicle CPU to first and second image display units. In one preferred but non-limiting example, the detection unit 40 may detect both the transmissive image G1 and the road-surface image G2. As a configuration for the detection unit 40 to detect the images, for example, a configuration may be adopted in which imaging of the front windshield 102 and the road surface R is carried out with the use of an on-vehicle camera, to directly detect presence or absence of the images, i.e., the transmissive image G1 and the road-surface image G2; See Suzuki, ¶ 0030-0033 discloses a transmissive image projected onto a road surface R, for example, including travel route guidance information, regulation information at a destination of guidance, and current traveling speed information. The travel route guidance information may be denoted by arrow symbols. 
The regulation information and the current traveling speed information may be denoted by characters. The road-surface image G2 illustrated in FIG. 3 may include, for example, warning information to the pedestrian W. The warning information may be denoted by characters and symbols; See Suzuki, ¶ 0037-0039, 0041-0044 discloses a derivation unit may derive image information on the transmissive image G1 and image information on the road-surface image G2. A determination unit determine presence or absence of superimposition of the transmissive image G1 on the road-surface image G2, on the basis of one or both of the image information on the transmissive image G1 and the image information on the road-surface image G2. Further, a display stop unit may stop displaying any one of the transmissive image G1 and the road-surface image G2, on the basis of the detection result of the detection unit 40.]
Suzuki does not appear to explicitly disclose:
a timing unit;
a projector controller that intermittently controls light emission from a projector included in the vehicle in a cycle based on a timing signal from the timing unit such that a notification pattern intermittently projected, by the projector, on a road surface in rear, in front or at a side of the vehicle is visually recognized by a human;
However, Yamada discloses:
a timing unit; [See Yamada, ¶ 0022 discloses defining a “section” 412 per Fig. 6, wherein a section defines the timing of a cycle for determining brightness rate adjustment.]
a projector controller that intermittently controls light emission from a projector included in the vehicle in a cycle based on a timing signal from the timing unit such that a notification pattern intermittently projected, by the projector, on a road surface in rear, in front or at a side of the vehicle is visually recognized by a human; [See Yamada, ¶ 0011 discloses a road lighting ECU including an interface connected to a projection unit (a projector) which projects an image onto the road surface that indicates information to be notified to pedestrians or occupants of other vehicles; See Yamada, ¶ 0026-0028, Fig. 8 discloses a frame buffer storing photographed data obtained by continuously photographing road surface lighting images whose luminance changes with time according to the luminance change pattern. Frames 403A, 403B, and 403D store captured images of high-brightness road surface lighting images 201A, 201B, and 201D. Frames 403C and 403E store captured images obtained by capturing low-luminance road surface lighting images 201C and 201E.]
an imaging device that controls a camera included in the vehicle to capture an image including the road surface at a timing when the notification pattern is not projected during the cycle based on the timing signal from the timing unit; [See Yamada, ¶ 0012 discloses a camera ECU including an interface connected to a camera. If a road-surface lighting image is projected during image capture, the camera will generate a captured image that includes the road-surface lighting image.]
It would have been obvious to the person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention disclosed by Suzuki to add the teachings of Yamada in order to coordinate a projection rate with a camera capture rate to selectively obtain images of a projected pattern to be displayed to a driver of a vehicle.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PATRICK E DEMOSKY whose telephone number is (571)272-8799. The examiner can normally be reached Monday - Friday 7-4 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jamie Atala, can be reached at 571-272-7384. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PATRICK E DEMOSKY/Primary Examiner, Art Unit 2486