Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
DETAILED ACTION
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 1/16/26 has been entered.
Response to Arguments
Applicant’s amendments with respect to claims 3-4 and 20-21 to address the 112 rejections are persuasive and those rejections are withdrawn.
With respect to the art, applicant argues that the art does not disclose “a level of ambiguity associated with the first distance measurement determined using the TOF sensor, a search space within the primitive.”
The examiner disagrees. Any ToF sensor (or any sensor, for that matter) has an error margin, i.e., a level of ambiguity, associated with whatever it is measuring, in this case the first distance measurement. Because every sensor has an error margin, the "based at least in part on" language does not significantly limit the claims: a determination by the sensor that an object is, for example, 1 meter away +/- 2 cm is based in part on the level of ambiguity. Applicant's specification appears to recognize this reality in ¶110. The examiner is unsure whether applicant has support in the disclosure, but it may be useful to differentiate the inherent error margin of a sensor from the level of ambiguity applicant intends, if different.
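The examiner's point, that a sensor's error margin defines an interval of possible distances, can be sketched as follows (a hypothetical illustration; the function name and values are not drawn from either reference or the application):

```python
# Hypothetical sketch: a reading with an error margin defines an
# interval (a "level of ambiguity") of distances consistent with it.
def ambiguity_interval(measured_m, error_margin_m):
    """Return the (min, max) distances consistent with a reading."""
    return (measured_m - error_margin_m, measured_m + error_margin_m)

# A reading of 1 meter +/- 2 cm is consistent with roughly 0.98-1.02 m.
low, high = ambiguity_interval(1.00, 0.02)
```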
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-32 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent No. 11,509,803 to Li et al. (hereafter Li) in view of U.S. Patent No. 9,325,973 to Hazeghi et al. (hereafter Hazeghi).
1. An apparatus for generating one or more depth maps, comprising:
a structured light source configured to emit a pattern of light based on a primitive, the primitive including a set of uniquely identifiable features; (Li 7:10-20: the light source emits different patterns of light; Spec ¶43 describes the primitive as a pattern)
a time-of-flight (ToF) sensor; (Li 3:64-4:20: time-of-flight techniques necessarily require a ToF sensor; see also 13:25-29:40 describing the use of ToF)
at least one memory; and (Li 9:35-37 memory)
one or more processors coupled to the at least one memory, the one or more processors configured to: (Li 40:17-31 processor)
obtain a frame including a reflected pattern of light generated based on the pattern of light emitted by the structured light source; (Li 20:17-03 capture intervals for images)
determine, using the ToF sensor, a first distance measurement associated with a pixel of the frame; (Li 20:31-21:12 distance)
Li does not disclose
determine, based at least in part on a level of ambiguity associated with the first distance measurement determined using the ToF sensor, a search space within the primitive, the search space including a subset of features from the set of uniquely identifiable features of the primitive;
determine, based on searching the search space within the primitive, a feature of the primitive corresponding to a region around the pixel of the frame;
determine a second distance measurement associated with the pixel of the frame based at least in part on determining the feature of the primitive from the search space within the primitive; and
generate a depth map based at least in part on the second distance measurement.
Hazeghi discloses:
determine, based at least in part on a level of ambiguity associated with the first distance measurement determined using the ToF sensor, a search space within the primitive, the search space including a subset of features from the set of uniquely identifiable features of the primitive;
determine, based on searching the search space within the primitive, a feature of the primitive corresponding to a region around the pixel of the frame;
determine a second distance measurement associated with the pixel of the frame based at least in part on determining the feature of the primitive from the search space within the primitive; and
generate a depth map based at least in part on the second distance measurement. (Hazeghi 7:47-8:57 discloses a system using the pattern to find features around the pattern and the depth of things in the image; note that every sensor has an error margin and that would be a level of ambiguity associated with the distance measurement determined using the sensor)
It would have been obvious to modify the system of Li to utilize the pattern measurements of Hazeghi for the purposes of robust and efficient detection and rapid and precise depth detection, as taught by Hazeghi (4:1-5).
2. The apparatus of claim 1, wherein the one or more processors are configured to:
obtain a first exposure of the frame associated with a first level of illumination;
obtain a second exposure of the frame associated with a second level of illumination that is different than the first level of illumination; and
determine the first distance measurement associated with the pixel of the frame based at least in part on a comparison between a first light amplitude associated with the pixel in the first exposure and a second light amplitude associated with the pixel in the second exposure. (Li 26:26-40 brightness value (amplitude of light) in different exposures for determining distance)
3. The apparatus of claim 1, wherein: the first distance measurement is determined to be within a range of distance measurements with an error margin; and the one or more processors are configured to determine a size of the search space within the primitive based at least in part on the range of distance measurements, wherein the range of distance measurements is positively correlated to the size of the search space. (Li at 20:31-21:12 measures the distance; note that all measurements have an error margin of some sort; Hazeghi 7:47-8:57 discloses a system using the pattern to find features around the pattern and the depth of things in the image; note that the “wherein the range …” clause is treated as reciting a definition and is without effect given the rest of the language)
4. The apparatus of claim 3, wherein the one or more processors are configured to determine the range of distance measurements based at least in part on the level of ambiguity associated with the first distance measurement, wherein the level of ambiguity is positively correlated to the range of distance measurements. (Li 20:31-21:12 determines a distance; note that the wherein clause regarding ambiguity appears to be a definition and has no effect on the claim; it is further noted that the wherein clause is reciting an inherent property, that is, if there is a high level of uncertainty in the measurement, there are a lot of possibilities of distances the actual distance could be)
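The inherent property noted above, that more ambiguity yields a wider range of candidate distances and hence a larger search space, can be illustrated as follows (a hypothetical sketch; the pixels-per-meter scaling is an assumed example, not taken from either reference):

```python
def search_space_size(ambiguity_m, pixels_per_meter=100):
    """Hypothetical: candidate distances span +/- ambiguity around the
    reading, so the range (and the search space derived from it) grows
    with the level of ambiguity."""
    distance_range_m = 2 * ambiguity_m
    return distance_range_m * pixels_per_meter

# Positively correlated: a less certain reading searches more of the primitive.
assert search_space_size(0.05) > search_space_size(0.02)
```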
5. The apparatus of claim 1, wherein the one or more processors are configured to: determine, based at least in part on the first distance measurement, an offset between a first location of the pixel of the frame and a second location of the feature of the primitive, wherein the offset is inversely proportional to the first distance measurement; and determine the search space within the primitive based at least in part on the offset. (It is noted that, as a matter of physics, a projected pattern grows in proportion to the distance from the projector; the offset, which varies inversely with the projected pattern size, is therefore inversely proportional to the distance; Li 20:31-21:12 determines a distance; see the calculation discussion from about 14:65-23:33)
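The inverse-proportionality observation follows from standard triangulation, where the pixel offset (disparity) equals focal length times baseline divided by distance; a minimal sketch with assumed example values (the baseline and focal length are hypothetical, not from either reference):

```python
def offset_px(distance_m, baseline_m=0.05, focal_px=500.0):
    """Standard triangulation: offset = focal * baseline / distance.
    Baseline and focal length here are assumed example values."""
    return focal_px * baseline_m / distance_m

# Doubling the distance halves the offset: inverse proportionality.
assert offset_px(2.0) == offset_px(1.0) / 2
```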
6. The apparatus of claim 5, wherein the one or more processors are configured to set a central axis of the search space within the primitive as the second location of the feature of the primitive. (Hazeghi 19:16-19:30 plane of each DOE is normal to optical axis, which is a central axis)
7. The apparatus of claim 1, wherein the region around the pixel of the frame has a predetermined size, (the region around the pixel is predetermined, e.g., the size of the sensor or the 8 pixels around the pixel; note that applicant does not provide any size)
and wherein the one or more processors are configured to: determine a first region of the search space, the first region of the search space having the predetermined size; and (Li’s sensor provides the processor the data for the region that it is capturing image data for)
determine whether image data within the region around the pixel of the frame corresponds to image data within the first region of the search space. (The processor processes the pixels for images and therefore determines that image data corresponds to image data in the region)
8. The apparatus of claim 7, wherein the one or more processors are configured to:
determine the image data within the region around the pixel of the frame corresponds to the image data within the first region of the search space; and (Li’s sensor provides the processor the data for the region for which it is capturing image data)
determine the second distance measurement based at least in part on determining a distance between the pixel of the frame and a corresponding feature of the first region of the search space. (Li 26:24-59 determines distance between the pixel and the object in the image)
9. The apparatus of claim 7, wherein the one or more processors are configured to: determine the image data within the region around the pixel of the frame does not correspond to the image data within the first region of the search space within the primitive; (Hazeghi 15:61-16:24 relies on the patterns to identify distance so if it doesn’t find the pattern in the image data region, it would search other regions until the pattern is found)
determine a second region of the search space, the second region of the search space having the predetermined size; and
determine whether image data within the region around the pixel of the frame corresponds to image data within the second region of the search space. (Hazeghi 15:61-16:24 relies on the patterns to identify distance so if it doesn’t find the pattern in the image data region, it would search other regions until the pattern is found)
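The region-by-region search described for claims 7-9, comparing the pixel's region against successive fixed-size regions of the search space until a match is found, can be sketched as follows (hypothetical names and data, 1-D for simplicity; not drawn from either reference):

```python
def find_matching_region(pixel_region, search_space, size):
    """Slide a fixed-size window over the search space; return the
    start index of the first region matching the pixel's region."""
    for start in range(0, len(search_space) - size + 1):
        if search_space[start:start + size] == pixel_region:
            return start
    return None  # no region matched

space = [3, 1, 4, 1, 5, 9, 2, 6]
assert find_matching_region([1, 5, 9], space, 3) == 3
```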
10. The apparatus of claim 1, wherein: the pattern of light emitted by the structured light source includes a plurality of light points; and (Li 7:15-20 pattern of lights; note a pattern would require more than one light point)
a feature within the set of uniquely identifiable features of the primitive includes two or more light points of the plurality of light points. (Li 7:15-20 pattern of lights; note a pattern would require more than one light point)
11. The apparatus of claim 10, wherein a light point of the feature corresponds to two or more pixels of the frame. (Note that a pixel is considered the smallest portion of an image; thus any feature in a picture, other than a speck of dust, would correspond to more than a single pixel)
12. The apparatus of claim 1, wherein the structured light source is configured to emit the pattern of light using a diffractive optical element that simultaneously projects a plurality of patterns of light corresponding to the primitive. (Li 7:15-20 diffuse flash)
13. The apparatus of claim 1, wherein the one or more processors are configured to: obtain an additional frame while the structured light source is not emitting the pattern of light based on the primitive; (Li 16:35-17:16 discusses an example where frames are captured where some images are captured with the light source on and some with the light source off)
determine ambient light signals based at least in part on the additional frame; and (LI 16:35-17:16 determines intensities of light)
subtract the ambient light signals from the frame before determining the first distance measurement associated with the pixel of the frame. (Li 12:10-16 discusses combining corresponding pixel values of images 17:44-55 determines depth information after processing all the frames; although Li does not explicitly disclose subtracting ambient light signals, it would have been obvious to subtract ambient light in order to only utilize the structured light source as the data for determining distance for the purposes of removing the influence of ambient light)
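The ambient-subtraction rationale, capturing a frame with the source off and subtracting it pixel-wise from the illuminated frame, can be sketched as follows (hypothetical intensity values, not from Li):

```python
illuminated = [120, 130, 125, 140]  # pixel intensities, light source on
ambient     = [20, 25, 22, 30]      # pixel intensities, light source off

# Pixel-wise subtraction leaves only the structured-light contribution,
# clamped at zero so noise cannot produce negative intensities.
corrected = [max(on - off, 0) for on, off in zip(illuminated, ambient)]
```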
14. The apparatus of claim 13, wherein the one or more processors are configured to: determine light signals corresponding to multi-path interference using the frame after subtracting the ambient light signals from the frame; and subtract the light signals corresponding to multi-path interference from the frame before determining the first distance measurement associated with the pixel of the frame. (Li 17:44-55 determines depth information; although Li does not explicitly disclose subtracting multi-path interference signals, it would have been obvious to subtract multi-path light in order to only utilize the structured light source as the data for determining distance for the purposes of removing the influence of multi-path interference)
15. The apparatus of claim 1, wherein the one or more processors are configured to fit a function to light signals corresponding to the pixel of the frame before determining the first distance measurement associated with the pixel of the frame. (Li 17:44-55 processes the data of light signals and that processing fits a function to the light signals)
16. The apparatus of claim 1, wherein the apparatus includes a mobile device. (Li Fig 1 shows glasses; glasses are mobile)
17. The apparatus of claim 1, further comprising a display. (Li Fig 2 shows a display)
Claims 18-32 are method claims corresponding to apparatus claims 1-15 and are rejected under a similar rationale.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Ming Shui whose telephone number is 303-297-4247. The examiner can normally be reached on M-TH 7-5 PT.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Greg Morse can be reached on 571-272-3838. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Ming Shui/
Primary Examiner, Art Unit 2663