DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 28 January 2026 has been entered.
Response to Arguments
Applicant’s arguments with respect to claims 7, 19, and 31 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 7-10, 19-20, 24, and 31-34 are rejected under 35 U.S.C. 103 as being unpatentable over Bhakta; Vikrant R. (US 20160377252 A1) in view of SHIBATA; Yoshinori et al. (US 20200139879 A1), and further in view of Delaney; Mark Lawrence et al. (US 20140362203 A1).
Regarding claim 7, Bhakta teaches,
A method (¶18, automotive headlamp providing “spatially adaptable and spectrally tunable light sources” using dichromatic illumination) comprising:
generating a headlight frame (¶48-52 and Fig. 8, “modulating the amount of blue light outputted”) by a processor (¶48-52 and Fig. 8, “controller 830”, depicted in fig. 8) in a digital micromirror device (DMD) headlight control unit, (¶48-52 and Fig. 8, “controller 830”, that modulates outputted blue light, also “presents electronic image data to DMD 810” to implement a desired pattern) wherein the headlight frame comprises a headlight pattern; (¶48-52 and Fig. 8, “controller 830”, depicted in fig. 8, “modulating the amount of blue light outputted by the blue lamp 86”)
transmitting, by the processor, (¶48-52 and Fig. 8, “controller 830”) the headlight frame (¶48-52 and Fig. 8, “controller 830”, depicted in fig. 8, modulated “amount of blue light outputted by the blue lamp 86”) and a bit plane of a structured light pattern (¶48-52 and Fig. 8, “modulating the power of the blue laser source 81 to modulate the amount of yellow light emitted by phosphor 85”) to the DMD headlight control unit; (¶48-52 and Fig. 8, “controller 830” presents “electronic image data to the DMD 810 to implement a desired pattern” based on the modulated blue light output by “blue lamp 86” and modulated amount of yellow light from “phosphor 85” by modulated power of “blue laser source 81”)
generating, by the DMD controller, (¶48-53, Fig. 8 and 9, “controller 830” performs functions operates “digital color correction to correct the color at the output”) bit planes of a hybrid headlight frame, (¶52-59 and Fig. 9, color correction performed on “single frame” with “Subframe 1”, “Subframe 2”, and “Subframe 3” as depicted in fig. 9) wherein the bit planes (¶52-59 and Fig. 9, subframes 1, 2, and 3 reflect illumination from light output from “yellow and blue light sources” as depicted in fig. 9) comprise the bit plane of the structured light pattern (¶53,49, Fig. 9 and 8, “only the laser-phosphor illumination is illuminated and reflected” that modulates “amount of yellow light emitted by phosphor 85”) and bit planes of the headlight pattern. (¶53,49, and Fig. 9 and 8, “blue LED source is illuminated and reflected” from the modulated “amount of blue light outputted by the blue lamp 86”)
The prior art Bhakta teaches a processor that performs the particular functions of the claimed method, but is not explicit as to separate control units and controllers as claimed; rather, Bhakta teaches a single controller that performs the claimed functions.
However, Shibata teaches,
a digital micromirror device (DMD) headlight control unit (¶124,149,51, and Fig. 1, “control device 50 for a vehicular lamp 2” depicted in fig. 1)
a DMD controller (¶124,149,51, and Fig. 1, “light source controller 20” depicted in fig. 1) in the DMD headlight control unit (¶149,136,176, and Fig. 1, “control device 50” in the lamp body 4 includes “light source controller 20 controls the light source 10” that transmits drive signal to “optical deflection device 26” depicted in fig. 1)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta with the control device of Shibata, which has a separate light source controller within the control device in the lamp body. This teaching provides the arrangement with a technology for increasing the types of producible light distribution patterns.
However, Bhakta in view of Shibata does not explicitly teach,
forming a headlight profile in which one or more hybrid headlight frames are to be projected during respective first time periods and one or more headlight frames are to be projected during respective second time periods, wherein each of the first time periods includes a first projection time for the bit plane of the structured light pattern and a second projection time for the bit planes of the headlight pattern; and
synchronizing image capture to start during the first projection time of each first time period and to stop before the second projection time of each first time period begins.
However, Delaney teaches additionally,
forming a headlight profile (¶49 and fig. 3, display an “electronically generated 8-bit grayscale pattern” light 314 depicted in fig. 3) in which one or more hybrid headlight frames are to be projected (¶49 and 51, “image exposure” to electronically generated 8-bit grayscale pattern by “SLM 350” of transmitted light 314 through “any given pixel of the pattern, depending on its grayscale value”) during respective first time periods (¶49,51-54, and fig. 4, “electronically generated 8-bit grayscale pattern” of light 314 through any given pixel of the pattern “activated to provide” pattern portion “P1” at time “T1” as depicted in fig. 4) and one or more headlight frames are to be projected (¶49 and 51, “image exposure” to electronically generated 8-bit grayscale pattern by “SLM 350” of transmitted light 314 through “any given pixel of the pattern, depending on its grayscale value”) during respective second time periods, (¶49,51-54, and fig. 4, “electronically generated 8-bit grayscale pattern” of light 314 through any given pixel of the pattern “activated to provide” pattern portion “P2” at time “T2” as depicted in fig. 4) wherein each of the first time periods (¶51-54 and fig. 4, subdivision time T1 of “subdivision times T1-T4” depicted in fig. 4) includes a first projection time for the bit plane (¶51-54 and fig. 4, pattern portion P1 of “pattern portions P1-P4” depicted in fig. 4) of the structured light pattern (¶53-54 and fig. 4, “pattern subdivision exposure sequence 430” including pattern portion “P1” exposed from a light generator during subdivision time “T1” as depicted in fig. 4) and a second projection time (¶51-54 and fig. 4, subdivision time T2 of “subdivision times T1-T4” depicted in fig. 4) for the bit planes of the headlight pattern; (¶53-54 and fig. 4, “pattern subdivision exposure sequence 430” including pattern portion “P2” exposed from a light generator during subdivision time “T2” as depicted in fig. 4) and
synchronizing image capture (¶47,51-54, and fig. 3-4, “timing and synchronization portion (TSP) 336” controlling image exposure by multiple “grayscale patterns” within “image integration period of the camera 260” so that “projecting a gray level pattern of illumination during an image exposure period” over total timing diagram 440 to “generate the gray level strip pattern PatIter_k is TIter_k” depicted in fig. 4) to start during the first projection time of each first time period (¶51-54 and fig. 4, subdivision time T1 of “subdivision times T1-T4” of Tlter_k representing “total time to generate the gray level strip pattern PatIter_k” depicted in fig. 4) and to stop before the second projection time of each first time period begins. (¶51-54 and fig. 4, subdivision time T2 of “subdivision times T1-T4” immediately following subdivision time T1 in the “total time to generate the gray level strip pattern PatIter_k” depicted in fig. 4)
The prior art Delaney presents a total time to generate images of a timed sequence of binary patterns. The pattern for each period sequences and synchronizes image exposure to a sequence of generated binary patterns as controlled by a spatial light modulator (SLM) 350 and a timing and synchronization portion (TSP) 336, responsive to the SLM 350, that sequences the superimposing of binary patterns over time. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta and the control device of Shibata with the illumination pattern generation of Delaney, which implements pattern generation and image exposure synchronization. This allows for the use of various analysis modes, such as a contrast-based analysis mode and a SIM-based analysis mode, that make use of images lighted with specific structured illumination patterns in a way that can improve accuracy.
Regarding claim 8, Bhakta in view of Shibata and Delaney teaches the limitations of claim 7,
Bhakta teaches additionally,
selecting, by the processor, (¶50-59, Fig. 8 and 9, “controller 830” operates a “digital color correction to correct the color at the output” to spectrally tune the “single frame” depicted in Fig. 9) a hybrid sequence (¶54 and Fig. 9, “percentage weights” of the “subframes” of the total single frame's total time, such as “13.3% white, 43.3% blue and 43.3% yellow”, depicted in fig. 9) for the bit planes (¶54 and Fig. 9, “output color is spectrally tuned by the percentage weights” of the various subframes, “subframe 1”, “subframe 2”, and “subframe 3”, as depicted in fig. 9) of the hybrid headlight frame, (¶52-54, output color of the “single frame” is spectrally tuned by the percentage weights of the ”time weight of the various subframes”) wherein the hybrid sequence comprises the first projection time (¶53-54 and Fig. 9, percentage weights as a “time weight” for “subframe 3” of the single frame’s total time) for the bit plane of the structured light pattern (¶53-54 and Fig. 9, time weight for “Subframe 3, labeled 907, only the laser-phosphor illumination is illuminated and reflected”) and projection times, (¶53-54 and Fig. 9, percentage weights as a “time weight” for “subframe 2” of the single frame’s total time) within the second projection time, (¶53-54 and Fig. 9, percentage weight as a “time weight” for “subframe 2” of single frame 901 depicted in fig. 9) for the bit planes of the headlight pattern. (¶53-54 and Fig. 9, time weight for “Subframe 2, labeled 905, only the blue LED source is illuminated and reflected”)
Regarding claim 9, Bhakta in view of Shibata and Delaney teaches the limitations of claim 8,
Bhakta teaches additionally,
selecting (¶54, “output color is spectrally tuned”) further comprises selecting from a plurality of hybrid sequences, (¶54 and Fig. 9, “subframes can be modulated using a duty cycle approach” where the “output color is spectrally tuned by the percentage weights for the “single frame’s total time”) wherein each hybrid sequence (¶54 and Fig. 9, “percentage weights” of the “subframes” of the total single frame's total time, such as “13.3% white, 43.3% blue and 43.3% yellow”, depicted in fig. 9) of the plurality of hybrid sequences (¶54 and Fig. 9, modulated “subframes” using a duty cycle approach for the “single frame’s total time”) comprises a first projection time (¶54,53 and Fig. 9, specific “percentage weight” for subframe 3 such as “43.3% yellow” of the single frame’s total time when “only the laser-phosphor illumination is illuminated and reflected” which is separate from the “spectrally tuned” that is separate from the “43.3% blue” when “only the blue LED source is illuminated and reflected”) for the bit plane of the structured light pattern. (¶54,53 and Fig. 9, percentage weight such as “43.3% yellow”)
Shibata teaches additionally,
selecting from a plurality of hybrid sequences (¶76-77, and Fig. 5A, “emit light with determined illuminance S106 based on when “illuminance setting unit 42 sets the illuminance value of light emitted to each individual region R” at S105 as disclosed in fig. 5A) based on an amount of ambient light, (¶76-77, and Fig. 5A, “illuminance setting unit 42 then sets the illuminance value of light emitted to each individual region R (S105)” after the “luminance analyzer 14 detects the luminance of each individual region R (S102)” as disclosed in fig. 5A)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta and the illumination pattern generation of Delaney with the control device of Shibata, which has a separate light source controller within the control device in the lamp body. This teaching provides the arrangement with a technology for increasing the types of producible light distribution patterns.
Delaney teaches additionally,
hybrid sequences comprises a different first projection time for the bit plane of the structured light pattern. (¶54 and fig. 4, pixel columns that are activated to provide the respective subdivision pattern portions P1-P4 during respective subdivision times T1-T4 where the “times T1-T4 are binary subdivisions, that is T3=2*T4, T2=2*T3, and T1=2*T2” depicted in fig. 4)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta and the control device of Shibata with the illumination pattern generation of Delaney, which implements pattern generation and image exposure synchronization. This allows for the use of various analysis modes, such as a contrast-based analysis mode and a SIM-based analysis mode, that make use of images lighted with specific structured illumination patterns in a way that can improve accuracy.
Regarding claim 10, Bhakta in view of Shibata and Delaney teaches the limitations of claim 7,
Bhakta teaches additionally,
a camera trigger packet, (¶52-59 and Fig. 9, “time weight of various subframes” with associated tuned “percentage weights” of “single frame’s total time” where “color correction is performed”) wherein the camera trigger packet (¶52-59 and Fig. 9, “time weight” of the portion of the frame “of the various subframes” for the “single frame’s total time” depicted in fig. 9) indicates a current time of the first processor, (¶52-54 and Fig. 9, “single frame” with a “single frame’s total time”) a time delta until the bit plane of the structured light pattern is projected, (¶52-54 and Fig. 9, “single frame” with subframe 2 where “only the blue LED source is illuminated and reflected, so that blue light is projected at the output”) and the first projection time for the structured light pattern. (¶52-54 and Fig. 9, “single frame” with subframe “only the laser-phosphor illumination is illuminated and reflected”)
However, Bhakta does not explicitly teach,
transmitting, by the first processor, a camera trigger packet to a second processor coupled to a camera,
However, Shibata teaches additionally,
transmitting, by the first processor, (¶69,72, and Fig. 1, “lamp controller 18” with an illuminance setting unit 42 depicted in fig. 1) a camera trigger packet (¶65 and Fig. 1, light source controller 20 sets “illuminance value every .1-5 ms” based on illuminance values determined at “illuminance setting unit 42”) to a second processor (¶65,53, and Fig. 1, “illuminance setting unit 42”, depicted in fig. 1 as included in the lamp controller 18, transmits a signal indicating the illuminance value to the “light source controller 20”) coupled to a camera, (¶50-51 and Fig. 1, “control device 50” receives image data acquired by the “imager 12”, that includes “high speed camera 36”, transmitted to “luminance analyzer 14”. The control device 50, that includes “light source controller 20”, is coupled to imager 12 as depicted in fig. 1)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta and the illumination pattern generation of Delaney with the control device of Shibata, which has a separate light source controller within the control device in the lamp body. This teaching provides the arrangement with a technology for increasing the types of producible light distribution patterns.
Regarding claim 19, Bhakta teaches,
A method (¶18, automotive headlamp providing “spatially adaptable and spectrally tunable light sources” using dichromatic illumination) comprising:
transmitting, by a first processor, (¶48-52 and Fig. 8, “controller 830”) a headlight frame (¶48-52 and Fig. 8, “controller 830”, depicted in fig. 8, modulated “amount of blue light outputted by the blue lamp 86”) and a bit plane of a structured light pattern (¶48-52 and Fig. 8, “modulating the power of the blue laser source 81 to modulate the amount of yellow light emitted by phosphor 85”) to a controller, (¶48-52 and Fig. 8, “controller 830” presents “electronic image data”) the headlight frame comprising a headlight pattern; (¶48-52 and Fig. 8, “controller 830” presents “electronic image data to the DMD 810 to implement a desired pattern” based on the modulated blue light output by “blue lamp 86” and modulated amount of yellow light from “phosphor 85” by modulated power of “blue laser source 81”)
a camera trigger packet, (¶52-59 and Fig. 9, “time weight of various subframes” with associated tuned “percentage weights” of “single frame’s total time” where “color correction is performed”) wherein the camera trigger packet (¶52-59 and Fig. 9, “time weight” of the portion of the frame “of the various subframes” for the “single frame’s total time” depicted in fig. 9) indicates a current time of the first processor, (¶45,52-54 and Fig. 9, “duty cycle” of single frame with a “single frame’s total time”) a time delta until the bit plane of the structured light pattern is projected, (¶45,52-54 and Fig. 9, duty cycle depicted in fig. 9 for single frame 901 with “subframe 3, labeled 907” where “only the laser-phosphor illumination is illuminated and reflected, so the projected light is yellow for Subframe 3” that occurs for “43.3% yellow” percentage weight of the “duty cycle”) and a projection time for the structured light pattern; (¶45,52-54 and Fig. 9, duty cycle depicted in fig. 9 for single frame 901 with “subframe 3, labeled 907” where “only the laser-phosphor illumination is illuminated and reflected, so the projected light is yellow for Subframe 3” that occurs after subframe 1 and subframe 2 of the “duty cycle” of the single frame) and
generating, by the controller, (¶48-53, Fig. 8 and 9, “controller 830” performs functions operates “digital color correction to correct the color at the output”) bit planes of a hybrid headlight frame, (¶52-59 and Fig. 9, color correction performed on “single frame” with “Subframe 1”, “Subframe 2”, and “Subframe 3” as depicted in fig. 9) wherein the bit planes (¶52-59 and Fig. 9, subframes 1, 2, and 3 reflect illumination from light output from “yellow and blue light sources” as depicted in fig. 9) comprise the bit plane of the structured light pattern (¶53,49, Fig. 9 and 8, “only the laser-phosphor illumination is illuminated and reflected” that modulates “amount of yellow light emitted by phosphor 85”) and bit planes of the headlight pattern; (¶53,49, and Fig. 9 and 8, “blue LED source is illuminated and reflected” from the modulated “amount of blue light outputted by the blue lamp 86”)
The prior art Bhakta teaches a processor that performs the particular functions of the claimed method, but is not explicit as to separate control units and controllers as claimed; rather, Bhakta teaches a single controller that performs the claimed functions. Specifically, Bhakta does not explicitly teach,
transmitting, by the first processor to a second processor, a camera trigger packet,
wherein the camera trigger packet is configured to indicate a start and a stop of image capture to occur during the projection of each structure light pattern of each hybrid headlight pattern, the stop of the image capture occurring before projection of the bit planes of the headlight pattern in the hybrid headlight frame.
However, Shibata teaches additionally,
transmitting, by a first processor, (¶69,72, and Fig. 1, “lamp controller 18” that includes an illuminance setting unit 42 depicted in fig. 1) a headlight frame (¶72 and Fig. 1, “detection result from the luminance analyzer 14”) and a bit plane of a structured light pattern (¶72,69, and Fig. 1, “detection result from the tracking unit 40” determined from “a specific target among the targets detected by the condition analyzer 16”) to a controller (¶72 and Fig. 1, “illuminance setting unit 42” of the lamp controller 18, depicted in fig. 1, determines the illuminance value of light emitted based on “the detection result from the luminance analyzer 14 and the detection result from the tracking unit 40”)
transmitting, by the first processor to a second processor, (¶65,53, and Fig. 1, “illuminance setting unit 42”, depicted in fig. 1 as included in the lamp controller 18, transmits a signal indicating the illuminance value to the “light source controller 20”) a camera trigger packet, (¶65 and Fig. 1, light source controller 20 sets “illuminance value every .1-5 ms” based on illuminance values determined at “illuminance setting unit 42”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta with the control device of Shibata, which has a separate light source controller within the control device in the lamp body. This teaching provides the arrangement with a technology for increasing the types of producible light distribution patterns.
Delaney teaches additionally,
wherein the camera trigger packet (¶51-54 and fig. 4, “timing diagram 440” of fig. 4 depicting “total time to generate the gray level strip pattern PatIter_k is TIter_k”) is configured to indicate a start and a stop of image capture (¶51-54 and fig. 4, “TSP 336 may thus control an overall image exposure achieved by the multiple incremental grayscale patterns because it may control the number of incremental requests within an image integration period of the camera 260” to correspond with activating “respective subdivision pattern portions P1-P4 during respective subdivision times T1-T4” depicted in fig. 4) to occur during the projection of each structure light pattern of each hybrid headlight pattern, (¶51-54 and fig. 4, “time between the time periods T1-T4” depicted in fig. 4 where pattern portions P1-P4 indicate activated "pixel columns across light stripe” correspond to express time periods T1-T4 with express subdivision times) the stop of the image capture (¶51-54, 72-73, and fig. 8, relatively consistent “discrete latency period (Tsi-Tpta)” occurring before beginning “image exposure” of structured illumination iterations during “camera integration duration 825” depicted in fig. 8) occurring before projection of the bit planes of the headlight pattern (¶51-54, 72-73, and fig. 8, “discrete latency period (Tsi-Tpta)” which corresponds to before the beginning of “illumination pattern trigger signals 835” corresponding to gray level pattern generation sequence(s) represented by signals 845 on the timeline 840 and “camera integration duration 825” depicted in fig. 8) in the hybrid headlight frame. (¶72-73 and fig. 8, image exposure occurring during “camera integration duration 825” bounded by “start at integration begin time Tsi and end at integration end time Tei” depicted in fig. 8)
The prior art Delaney presents a total time to generate images of a timed sequence of binary patterns. The pattern for each period sequences and synchronizes image exposure to a sequence of generated binary patterns as controlled by a spatial light modulator (SLM) 350 and a timing and synchronization portion (TSP) 336, responsive to the SLM 350, that sequences the superimposing of binary patterns over time. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta and the control device of Shibata with the illumination pattern generation of Delaney, which implements pattern generation and image exposure synchronization with a latency period that is relatively consistent and predictable. This allows for the use of various analysis modes, such as a contrast-based analysis mode and a SIM-based analysis mode, that make use of images lighted with specific structured illumination patterns in a way that can improve accuracy.
Regarding claim 20, Bhakta in view of Shibata and Delaney teaches the limitations of claim 19,
Bhakta teaches additionally,
generating, by the first processor, (¶48-52 and Fig. 8, “controller 830” performs modulating) the headlight pattern; (¶48-52 and Fig. 8, “modulating the amount of blue light outputted by the blue lamp 86”) and
generating, by the first processor, (¶48-52 and Fig. 8, “controller 830” performs modulating) the structured light pattern. (¶48-52 and Fig. 8, “modulating the power of the blue laser source 81 to modulate the amount of yellow light emitted by phosphor 85”)
Regarding claim 24, Bhakta in view of Shibata and Delaney teaches the limitations of claim 19,
Bhakta teaches additionally,
transmitting, by the controller, (¶49 and Fig. 8, “controller 830” performs) the bit planes of the hybrid headlight frame (¶49 and Fig. 8, controller 830 presents “electronic image data to the DMD 810”) to a digital micromirror device (DMD); (¶49 and Fig. 8, “DMD” with thousands of individually selectable micro-mirrors)
projecting, by the DMD, (¶49 and Fig. 8, “DMD” with thousands of individually selectable micro-mirrors) the bit planes of the hybrid headlight frame, (¶49 and Fig. 8, DMD 810 implements a desired pattern “resulting in a patterned, spectrally adjustable and/or spatially adaptable white light beam”) to produce a headlight projection and a structured light projection; (¶49 and Fig. 8, “patterned, spectrally adjustable and/or spatially adaptable white light beam” based on “amount of blue light outputted by the blue lamp 86” and “power of the blue laser source 81 to modulate the amount of yellow light emitted by phosphor 85”) and
capturing, by a camera, (¶31, “sensor such as a forward looking camera”) an image of a reflection of the structured light projection, (¶31, forward looking camera senses “spectrum of white color that is being output” of the yellow and blue light sources) based on the projection time for the structured light pattern. (¶31, sensor such as a forward looking camera senses changes “in the spectrum of white color that is being output” to control respective “duty cycles” of the light)
Regarding claim 31, it is a broader method claim, without a digital micromirror device (DMD), similar to method claim 7. Refer to the rejection of claim 7, which teaches the limitations of claim 31.
Regarding claim 32, dependent on claim 31, it is a method claim similar to method claim 8, dependent on claim 7. Refer to the rejection of claim 8, which teaches the limitations of claim 32.
Regarding claim 33, dependent on claim 32, it is a method claim similar to method claim 9, dependent on claim 8. Refer to the rejection of claim 9, which teaches the limitations of claim 33.
Regarding claim 34, dependent on claim 31, it is a method claim similar to method claim 10, dependent on claim 7. Refer to the rejection of claim 10, which teaches the limitations of claim 34.
Claims 11 and 35 are rejected under 35 U.S.C. 103 as being unpatentable over Bhakta; Vikrant R. (US 20160377252 A1) in view of SHIBATA; Yoshinori et al. (US 20200139879 A1), in view of Delaney; Mark Lawrence et al. (US 20140362203 A1), and further in view of WATANO; Yuichi et al. (US 20210402915 A1).
Regarding claim 11, Bhakta in view of Shibata and Delaney teaches the limitations of claim 10,
Bhakta teaches additionally,
projecting the bit plane (¶53 and Fig. 9, single frame of “many frames projected per second from the DMD” depicted in fig. 9) of the structured light pattern (¶53 and Fig. 9, “Subframe 3, labeled 907, only the laser-phosphor illumination is illuminated and reflected”) by a DMD (¶53,48, Fig. 8 and 9, single frame depicted fig. 9 of many frames projected per second from “DMD 810” depicted in fig. 8) at the time delta (¶45,52-54 and Fig. 9, duty cycle depicted in fig. 9 for single frame 901 with “subframe 3, labeled 907” where “only the laser-phosphor illumination is illuminated and reflected, so the projected light is yellow for Subframe 3” that occurs for “43.3% yellow” percentage weight of the “duty cycle”) and for the first projection time; (¶45,52-54 and Fig. 9, duty cycle depicted in fig. 9 for single frame 901 with “subframe 3, labeled 907” where “only the laser-phosphor illumination is illuminated and reflected, so the projected light is yellow for Subframe 3” that occurs after subframe 1 and subframe 2 of the “duty cycle” of the single frame) and
capturing an image (¶31, “sensor such as a forward looking camera”) when the structured light pattern is projected, (¶31, sensor such as a forward looking camera senses changes “in the spectrum of white color that is being output” to control respective “duty cycles” of the light)
Shibata teaches additionally,
projecting the bit plane of the structured light pattern (¶43 and Fig. 1, optical reflection device 26 configured to “selectively reflect the light emitted from the light source 22”) by a DMD (¶43 and Fig. 1, “optical deflection device 26” constituted by “digital mirror device (DMD)”) coupled to the DMD controller (¶43,65, and Fig. 1, “light source controller 20” controls the light source unit 10 by transmitting drive signal to the light source 22 and “optical deflection device 26” as depicted in fig. 1)
but the combination does not explicitly teach,
capturing, by the camera responsive to the camera trigger packet, an image, wherein a camera exposure time based on the projection time is used.
However, Watano teaches additionally,
capturing, by the camera (¶149, “vehicle camera 6”) responsive to the camera trigger packet, (¶149, change the light distribution pattern such that a “light distribution pattern when the vehicle camera 6 captures an image of the nth frame and a light distribution pattern when the vehicle camera 6 captures an image of the mth (m≠n) frame are different from each other”) an image (¶149, “vehicle camera 6 captures an image”) when the structured light pattern is projected, (¶149, vehicle camera 6 captures an image of the nth of a “light distribution pattern”) wherein a camera exposure time (¶149, “exposure time (shutter speed)”) is based on the first projection time. (¶149, “time for maintaining one light distribution pattern” is changed stepwise with respect to “exposure time (shutter speed) of the vehicle camera 6”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta with the control device of Shibata, the illumination pattern generation of Delaney, and the vehicle lamp control of Watano, which relates exposure time to changes in the light distribution pattern. This allows the camera's exposure time to be matched to the amount of light received while the light distribution patterns change, which can help improve accuracy.
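The exposure-timing relationship relied on in this combination, a camera exposure window sized and placed to cover a projected subframe, can be sketched as follows. This is an illustrative sketch only; the function name, timing values, and safety margin are hypothetical assumptions and are not taken from Bhakta, Shibata, Delaney, or Watano.

```python
# Illustrative sketch: align a camera exposure window with a projected
# structured-light subframe. All names and numbers are hypothetical.

def camera_trigger(projection_start_us: int, projection_time_us: int,
                   margin_us: int = 50) -> tuple[int, int]:
    """Return (trigger_time_us, exposure_us) such that the exposure
    window covers the projection interval with a small safety margin
    on each side."""
    trigger_time_us = projection_start_us - margin_us
    exposure_us = projection_time_us + 2 * margin_us
    return trigger_time_us, exposure_us

# Example: a subframe projected at t = 10,000 us lasting 4,330 us
trigger, exposure = camera_trigger(10_000, 4_330)
```

The point of the sketch is only that the exposure time is derived from the projection time, as the claim requires, rather than being fixed independently of it.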
Regarding claim 35, dependent on claim 34, it is a method claim similar to method claim 11, dependent on claim 10. Refer to the rejection of claim 11 for the limitations of claim 35.
Claims 12, 21-22, and 36 are rejected under 35 U.S.C. 103 as being unpatentable over Bhakta; Vikrant R. (US 20160377252 A1) in view of SHIBATA; Yoshinori et al. (US 20200139879 A1), further in view of Delaney; Mark Lawrence et al. (US 20140362203 A1), and further in view of Krishnamurthy; Sailesh Bharathwaaj et al. (US 10885642 B1).
Regarding claim 12, Bhakta with Shibata with Delaney teaches the limitations of claim 10, but does not teach the additional limitations of claim 12.
However, Krishnamurthy teaches additionally,
synchronizing (18:6-25, “synchronize their internal clocks 304”) a clock of the first processor (18:6-25, “camera clients 220” with an internal clock) and a clock of the second processor (18:6-25, “camera server 225” with an internal clock) using a time synchronization protocol of a networking protocol. (18:6-25, “camera clients 220 and camera server 225 may synchronize their internal clocks 304 using a synchronization protocol”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta with the control device of Shibata and the illumination pattern generation of Delaney with the clocks of Krishnamurthy, which can be synchronized using a synchronization protocol. This allows communication issues, such as latency between the respective computers, to be addressed.
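The kind of clock synchronization Krishnamurthy describes can be illustrated by the two-way time-transfer exchange underlying protocols such as PTP. This is a minimal sketch under the assumption of a symmetric network delay; the timestamps and function name are hypothetical, not taken from the reference.

```python
# Illustrative sketch of two-way time transfer, the principle behind
# synchronization protocols such as PTP. Timestamps are hypothetical.

def estimate_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """t1: master send time, t2: slave receive time,
    t3: slave send time, t4: master receive time.
    Returns the estimated slave-clock offset, assuming the network
    delay is the same in both directions."""
    return ((t2 - t1) - (t4 - t3)) / 2.0

# Example: slave clock 5 units ahead of master, one-way delay of 2 units
offset = estimate_offset(100.0, 107.0, 110.0, 107.0)
```

With symmetric delay, the delay term cancels and only the clock offset remains, which is what allows the internal clocks of the camera clients and camera server to be brought into agreement.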
Regarding claim 21, Bhakta with Shibata with Delaney teaches the limitations of claim 20,
Bhakta teaches additionally,
determining a time to trigger a camera (¶31, "sensor such as a forward looking camera") based on the projection time for the structured light pattern (¶31, sensor such as a forward looking camera senses changes "in the spectrum of white color that is being output" to control respective "duty cycles" of the light)
But does not teach the additional limitations of claim 21,
determining an offset between the first processor and the second processor by synchronizing a first clock of the first processor and a second clock of the second processor; and
determining, by the second processor, a time to trigger based on the offset,
However, Krishnamurthy teaches additionally,
determining an offset (27:25-48, “camera server 225 determines a delay for each camera 305” accounting for desynchronization) between the first processor (27:25-48, accounting for desynchronization between camera server 225 and “camera 305”) and the second processor (27:25-48, accounting for desynchronization of “camera server 225” and camera 305) by synchronizing (18:6-25, “synchronize their internal clocks 304”) a first clock of the first processor (18:6-25, “camera clients 220” with an internal clock) and a second clock of the second processor; (18:6-25, “camera server 225” with an internal clock) and
determining, by the second processor, (27:25-48, “camera server 225 determines a delay”) a time to trigger (27:25-48, “Camera server 225 then adjusts timestamps 324 for frame data 330 for frames 320 from that camera 305 by the determined delay”) based on the offset, (27:25-48, “determined delay”) and the current time of the first processor. (27:25-48, “timestamps 324 for frame data 330 for frames 320 from that camera 305”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta with the control device of Shibata and the illumination pattern generation of Delaney with the clocks of Krishnamurthy, which can be synchronized and can account for desynchronization. This allows communication issues, such as latency between the respective computers, to be addressed.
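The combination above amounts to translating the first processor's current time into the second processor's clock using the determined offset, then scheduling the trigger. A minimal sketch, with hypothetical names and values not drawn from the references:

```python
# Illustrative sketch: schedule a trigger on the second processor's
# clock, given the first processor's current time and the offset
# determined by clock synchronization. All names are hypothetical.

def adjusted_trigger_time(first_proc_time: float, offset: float,
                          lead_time: float) -> float:
    """Translate the first processor's current time into the second
    processor's timebase (subtracting the determined offset) and
    schedule the trigger lead_time later."""
    return first_proc_time - offset + lead_time
```

This mirrors the step in Krishnamurthy of adjusting timestamps by a determined delay: the trigger time depends on both the offset and the first processor's current time.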
Regarding claim 22, Bhakta with Shibata with Delaney with Krishnamurthy teaches the limitations of claim 21,
Krishnamurthy teaches additionally,
synchronizing (18:6-25, "synchronize their internal clocks 304") the first clock of the first processor (18:6-25, "camera clients 220" with an internal clock) and the second clock of the second processor (18:6-25, "camera server 225" with an internal clock) is performed using an Ethernet precision time protocol (PTP) or a controller area network (CAN) time synchronization protocol. (18:6-25, "camera clients 220 and camera server 225 may synchronize their internal clocks 304 using a synchronization protocol" such as "Precision Time Protocol (PTP)")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta with the control device of Shibata and the illumination pattern generation of Delaney with the clocks of Krishnamurthy, which can be synchronized using a synchronization protocol. This allows communication issues, such as latency between the respective computers, to be addressed.
Regarding claim 36, dependent on claim 35, it is a method claim similar to method claim 12, dependent on claim 10. Refer to the rejection of claim 12 for the limitations of claim 36.
Claims 13 and 37 are rejected under 35 U.S.C. 103 as being unpatentable over Bhakta; Vikrant R. (US 20160377252 A1) in view of SHIBATA; Yoshinori et al. (US 20200139879 A1), further in view of Delaney; Mark Lawrence et al. (US 20140362203 A1), and further in view of VISWANATHAN; Anirudh (US 20200184231 A1).
Regarding claim 13, Bhakta with Shibata with Delaney teaches the limitations of claim 10, but does not teach the additional limitations of claim 13.
However, Viswanathan teaches additionally,
the second processor (¶60, "road control") is in an advanced driver assistance systems (ADAS) (¶60, road control provided with "HD ap policies" associated with "assisted driving" assistance) electronic control unit (ECU). (¶60, road control provided via the CAN (controller area network) to the "electronic control unit (ECU)")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta with the control device of Shibata and the illumination pattern generation of Delaney with the processor of Viswanathan, which is an ECU for driver assistance. This allows for improving the vehicle's detection reliability.
Regarding claim 37, dependent on claim 35, it is a method claim similar to method claim 13, dependent on claim 10. Refer to the rejection of claim 13 for the limitations of claim 37.
Claim 23 is rejected under 35 U.S.C. 103 as being unpatentable over Bhakta; Vikrant R. (US 20160377252 A1) in view of SHIBATA; Yoshinori et al. (US 20200139879 A1), further in view of Delaney; Mark Lawrence et al. (US 20140362203 A1), and further in view of TSURU; Daisuke et al. (US 20220021798 A1).
Regarding claim 23, Bhakta with Shibata with Delaney teaches the limitations of claim 19,
Shibata teaches additionally,
determining, by the second processor, (¶65, “light source controller 20”) a projection time indicator (¶65, light source controller 20 “adjusts the on-time ratio” of each mirror element 30) based on an amount of ambient light; (¶76-77, and Fig. 5A, “illuminance setting unit 42 then sets the illuminance value of light emitted to each individual region R (S105)” after the “luminance analyzer 14 detects the luminance of each individual region R (S102)” as disclosed in fig. 5A)
But does not explicitly teach,
transmitting, by the second processor to the first processor, the projection time indicator.
However, Tsuru teaches additionally,
transmitting, by the second processor (¶71, “control information transmitting section 142”) to the first processor, (¶71, “control information transmitting section 142 transmits, to the light-emitting device 14, information”) the projection time indicator. (¶71, control information transmitting section 142 “transmits light emission pattern specification, light emission time point correction requests”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the light modulation of Bhakta with the control device of Shibata and the illumination pattern generation of Delaney with the information processing of Tsuru, which transmits light emission information. Transmitting this information is essential for synchronous processing and allows synchronization to be achieved.
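The transmission taught by Tsuru, a second processor sending a projection time indicator to a first processor, can be sketched as a simple message encoding. The field layout, scaling, and function names here are hypothetical assumptions for illustration; they are not taken from Tsuru or the other cited references.

```python
import struct

# Illustrative sketch: pack a projection-time indicator (here, an
# on-time ratio scaled to a 16-bit integer) for transmission from the
# second processor to the first. Layout and scaling are hypothetical.

def pack_projection_time_indicator(on_time_ratio: float) -> bytes:
    """Encode an on-time ratio in [0.0, 1.0] as a 2-byte
    big-endian unsigned field."""
    scaled = round(on_time_ratio * 0xFFFF)
    return struct.pack(">H", scaled)

def unpack_projection_time_indicator(payload: bytes) -> float:
    """Decode the 2-byte field back to an on-time ratio."""
    (scaled,) = struct.unpack(">H", payload)
    return scaled / 0xFFFF
```

The round trip preserves the indicator to within the 16-bit quantization step, which is the only property the sketch is meant to show.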
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JIMMY S LEE whose telephone number is (571)270-7322. The examiner can normally be reached Monday through Friday, 10AM-8PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Joseph G. Ustaris can be reached at (571) 272-7383. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOSEPH G USTARIS/Supervisory Patent Examiner, Art Unit 2483
/JIMMY S LEE/Examiner, Art Unit 2483