DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Examiner acknowledges the reply filed on 11/21/2025, in which claims 1-2, 13, 15, 17, and 19-20 have been amended, claim 16 has been canceled, and claim 21 has been added. Claims 1-15 and 17-21 are currently pending in this application.
In response to this amendment:
The drawing objections are withdrawn.
The rejection under 35 U.S.C. § 112 is withdrawn.
The prior art rejections under 35 U.S.C. § 102 are withdrawn.
Response to Arguments
Applicant's arguments filed 11/21/2025 have been fully considered but they are not persuasive. On page 11 of the response, the applicant asserts that neither Carr nor Druml teaches “a first actuating device coupled to the first MEMS mirror along the first axis, the first actuating device having a first portion that protrudes from an edge of the first MEMS mirror.” While the examiner agrees that Carr does not teach this limitation, a conventional MEMS mirror, as contemplated by Druml, does teach this limitation.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-5, 9, 15, and 17-21 are rejected under 35 U.S.C. 103 as being unpatentable over Carr et al. (US 20230358891 A1), hereinafter Carr, in view of Druml (US 20220065995 A1), as evidenced by Man (US 20210302717 A1).
Regarding claim 1, Carr teaches:
An apparatus ([0092] “FIG. 1a illustrates a time-of-flight sensor system 10.”), comprising:
a laser light source configured to transmit a beam of light pulses towards a target, projecting at least one corresponding beam spot on the target ([0092] “The system 10 comprises an illumination source which in this example is dot projector formed by a vertical cavity surface emitting laser (VCSEL) array 11… The diffraction grating diffracts light from the lens component 15 and, in particular, focused light from the lens component 15 to provide more spots on the subject 19”);
a beam steering arrangement ([0019] “The optical system may move the spot illumination in the scanning pattern using the at least one actuator.”)
[…]; and
an array of sensors arranged according to a grid, a sensor in the array of sensors configured to sense a light pulse incident on the sensor in response to reflection of at least one light pulse of the beam of light pulses from a field of view (FOV) region in the target (FIGS. 9-10 show a ‘grid’ arrangement. [0030] “The sensor may comprise a plurality of pixels”; [0136] “As shown in FIG. 9, each spot 201 that reflects off the subject, and is received by sensor 108, illuminates a corresponding area on the surface 302 of the sensor that detects light—referred to as the sensor surface from here on out.”),
the sensor in the array of sensors further configured to provide a signal indicative of a time of incidence of the light pulse on the sensor ([0143] “The information is used by the processor 125 to form an output frame—in particular, a depth map—that provides time of flight information (e.g. depth information) regarding the subject in the scene.”), wherein:
the FOV region is partitioned into grid cells according to the grid ([0136] “The sensor surface 302 represents that sensor's field of view, and is thus a representation of the scene from the point of view of the sensor 108. Thus, the position of the spot 201 on the sensor surface 302 corresponds to a position of the spot in the scene, or in other words, a position of the spot on the subject 104.”),
each sensor in the array of sensors is configured to sense at least one echo light pulse reflected from a respective grid cell in the FOV region ([0139] “Further, each spot is contained within the bounds of a pixel 304 and the number of spots in the grid of spots corresponds to the number of pixels 304 making up the sensor surface 302. In other words, the grid of spots is aligned with the array of pixels such that each pixel contains a spot, as shown in FIG. 9.”),
the beam steering arrangement is configured to vary a direction of transmission of the beam of light pulses, projecting at least one beam spot per grid cell in the FOV region ([0054] “Further, the above set up allows for a spot illumination to illuminate an area of the sensor that is within the bounds of a pixel and to be moved to different locations within the bounds of that single pixel.”).
Carr further contemplates a range of beam-steering options ([0019] “The optical system may move the spot illumination in the scanning pattern using the at least one actuator. The at least one actuator may be any suitable actuation mechanism for incorporation into the sensor system… Additionally or alternatively, the at least one actuator may comprise a voice coil motor (VCM), an optical image stabilisation actuator, an auto-focus actuator, a 4 wires SMA actuator or 8 wires SMA actuator, a module tilt actuator, or an adaptive beam-steering mechanism for steering the non-uniform illumination (spot illumination). The at least one actuator may be arranged to move the emitted non-uniform illumination by moving any one of the following components which may be comprised in an apparatus present the time-of-flight sensor system: a lens, a prism, a mirror, a dot projector, and a light source. The at least one actuator may provide at least one degree of freedom to provide the scanning pattern, for example two degrees of freedom.”). However, Carr does not explicitly teach:
a beam steering arrangement, including: a first microelectromechanical (MEMS) mirror configured to oscillate around a first axis with a first oscillating angle;
a second MEMS mirror configured to oscillate around a second axis with a second oscillating angle;
a first actuating device coupled to the first MEMS mirror along the first axis, the first actuating device having a first portion that protrudes from an edge of the first MEMS mirror; and
a second actuating device coupled to the second MEMS mirror along the second axis;
Druml, in the same field of endeavor, teaches an arrangement of two MEMS scanning mirrors, as is conventional in the art:
a beam steering arrangement, including: a first microelectromechanical (MEMS) mirror configured to oscillate around a first axis with a first oscillating angle ([0034] “It will be further appreciated that a LIDAR scanning system may include multiple scanning mirrors 12 in a Lissajous scanning system (i.e., a 2×1D system), where a first 1D MEMS mirror has a single scanning axis for steering a light beam in a horizontal scanning direction and a second 1D MEMS mirror has a single scanning axis for steering the light beam in a vertical scanning direction.”); and
a second MEMS mirror configured to oscillate around a second axis with a second oscillating angle ([0034] “It will be further appreciated that a LIDAR scanning system may include multiple scanning mirrors 12 in a Lissajous scanning system (i.e., a 2×1D system), where a first 1D MEMS mirror has a single scanning axis for steering a light beam in a horizontal scanning direction and a second 1D MEMS mirror has a single scanning axis for steering the light beam in a vertical scanning direction.”),
a first actuating device coupled to the first MEMS mirror along the first axis, the first actuating device having a first portion that protrudes from an edge of the first MEMS mirror (FIG. 1, MEMS Mirror 12; Note that conventional MEMS Mirrors utilize torsion beams protruding from the mirror edge, as further evidenced below with regards to Man.); and
a second actuating device coupled to the second MEMS mirror along the second axis (The second MEMS mirror is reasonably considered to have a structure similar to that of the first.).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have used the conventional MEMS beam-steering system, as highlighted in Druml, for the beam-steering mechanism of Carr, as one of the known and predictable choices in the art.
For increased clarity regarding the structure of a conventional MEMS mirror, see, for example, Man ([0003] “FIG. 1 illustrates a conventional one axis electromagnetic driven MEMS micro-mirror device comprising a planar movable plate 5 having a reflective mirror surface 8. The movable plate is supported axially by a torsional beam 6 to a silicon substrate 2. A coil 7 is placed around the movable plate such that when a current is applied to coil 7 under a magnetic field perpendicular to torsional beam 6 lengthwise, a Lorentz force is generated and acted on the movable plate such that the movable plate rotates about the torsional beam 6.”).
Regarding claim 2, Carr in view of Druml teaches the apparatus of claim 1, as described above, and further teaches:
wherein each respective actuating device is configured to drive an oscillating movement of the respective MEMS mirror (Druml: [0034] “It will be further appreciated that a LIDAR scanning system may include multiple scanning mirrors 12 in a Lissajous scanning system (i.e., a 2×1D system), where a first 1D MEMS mirror has a single scanning axis for steering a light beam in a horizontal scanning direction and a second 1D MEMS mirror has a single scanning axis for steering the light beam in a vertical scanning direction.” Note that a Lissajous pattern is the result of two orthogonal sinusoidal oscillations.).
Regarding claim 3, Carr in view of Druml teaches the apparatus of claim 2, as described above, and further teaches:
wherein the first axis of oscillation of the first MEMS mirror and the second axis of oscillation of the second MEMS mirror transverse one another (Druml: [0034] “first 1D MEMS mirror has a single scanning axis for steering a light beam in a horizontal scanning direction and a second 1D MEMS mirror has a single scanning axis for steering the light beam in a vertical scanning direction.” The horizontal and vertical directions are transverse to one another.).
Regarding claim 4, Carr in view of Druml teaches the apparatus of claim 2, as described above, and further teaches:
wherein the beam steering arrangement comprises a MEMS lens coupled to at least one of the first and second MEMS mirrors, the MEMS lens configured to vary the direction of transmission of the light pulses within each grid cell in the FOV (Carr: [0128] “The actuator 122, in this example, is a shape memory alloy (SMA) actuator comprising one or more SMA components for driving movement in the lens 110. Another type of actuator may be used, such as voice coil motor (VCM) or voice coil actuator, or a microelectromechanical systems (MEMS) magnetic actuator.”; [0054] “Further, the above set up allows for a spot illumination to illuminate an area of the sensor that is within the bounds of a pixel and to be moved to different locations within the bounds of that single pixel.” Carr contemplates a MEMS-actuated lens for beam-steering within a pixel grid cell.).
Regarding claim 5, Carr in view of Druml teaches the apparatus of claim 1, as described above, and further teaches:
comprising a diffractive optical element (DOE) arranged between the laser source and the beam steering arrangement, the DOE configured to split the beam of light pulses to produce a plurality of beams of light pulses to the beam steering arrangement (Carr: [0092] “FIG. 1a illustrates a time-of-flight sensor system 10… The optical system comprises a diffraction grating (not shown) located between the lens element 15 and the subject 19. The diffraction grating diffracts light from the lens component 15 and, in particular, focused light from the lens component 15 to provide more spots on the subject 19 when in the illumination source is in a spot illumination mode.”).
Regarding claim 9, Carr in view of Druml teaches the apparatus of claim 1, as described above, and further teaches:
wherein the beam steering arrangement is configured to cyclically vary the direction of transmission of the beam of light pulses according to a pattern selected among a raster scan pattern or a Lissajous pattern ([0016] “The optical system may be configured to move the spot illumination in a scanning pattern across at least part of the sensor surface… The scanning pattern may be a raster scanning pattern.”).
Regarding claim 15, the claimed system is encompassed in scope by the apparatus of claim 1 and is rejected for the same reasons as described above.
Regarding claim 17, Carr in view of Druml teaches the system of claim 15, as described above, and further teaches:
wherein the first mirror is configured to rotate along the first axis with a first frequency, and the second mirror is configured to rotate along the second axis with a second frequency (Druml: [0034] “It will be further appreciated that a LIDAR scanning system may include multiple scanning mirrors 12 in a Lissajous scanning system (i.e., a 2×1D system), where a first 1D MEMS mirror has a single scanning axis for steering a light beam in a horizontal scanning direction and a second 1D MEMS mirror has a single scanning axis for steering the light beam in a vertical scanning direction.” The nature of a Lissajous scanner implies that each scanning direction is rotated at a frequency.).
Regarding claim 18, Carr in view of Druml teaches the system of claim 17, as described above, and further teaches:
wherein the first axis and the second axis transverse one another (Druml: [0034] “It will be further appreciated that a LIDAR scanning system may include multiple scanning mirrors 12 in a Lissajous scanning system (i.e., a 2×1D system), where a first 1D MEMS mirror has a single scanning axis for steering a light beam in a horizontal scanning direction and a second 1D MEMS mirror has a single scanning axis for steering the light beam in a vertical scanning direction.” The horizontal and vertical directions are transverse to one another.).
Regarding claim 19, Carr in view of Druml teaches the system of claim 15, as described above, and further teaches:
wherein the first mirror is configured to steer the beam of light pulses along the first oscillating angle, and the second mirror is configured to steer the beam of light pulses along the second oscillating angle having a different angular range from the first oscillating angle (It is a simple matter of design choice to choose and/or control two different MEMS mirrors such that they oscillate through different ranges of angles.).
Regarding claim 20, Carr in view of Druml teaches the system of claim 15, as described above, and further teaches:
wherein the plurality of optical components include a biaxial MEMS mirror suitable to rotate along two orthogonal axes (Druml: [0033] “The MEMS mirror 12 is a mechanical moving mirror (i.e., a MEMS micro-mirror) integrated on a semiconductor chip (not shown). The MEMS mirror 12 according to this embodiment is configured to rotate about either a single scanning axis (i.e., a 1D MEMS mirror) or two scanning axes (i.e., a 2D MEMS mirror) that are typically orthogonal to each other. As a 2D MEMS mirror, the MEMS mirror 12 may be a Lissajous scanner that is configured to control the steering of the laser beams in two dimensions (e.g., in horizontal and vertical directions).”).
Examiner’s note on claim interpretation: the optical arrangement of the amended claims would involve at least two mirrors, as dictated by claim 15, each configured to oscillate in at least one dimension, together with either a third, biaxially oscillating mirror or a requirement that at least one of the two mirrors of claim 15 be a biaxially oscillating mirror. While the examiner suspects that the applicant intended claim 20 to involve a biaxial MEMS mirror instead of the two separate mirrors of claim 15, there is nothing strictly unclear or inconsistent with the current language.
While Druml contemplates either using two 1D MEMS mirrors or using a single 2D MEMS mirror, as equivalent options, a combination of both two 1D MEMS mirrors and a single 2D MEMS mirror, as would meet the claimed limitations, is a simple duplication of equivalent parts performing a predictable and expectable result. See MPEP 2144.04 (VI) (B).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the system of Carr in view of Druml with an additional 2D MEMS mirror as a simple duplication of equivalent parts with a predictable and expected result.
Regarding claim 21, Carr in view of Druml teaches the system of claim 15, as described above, and further teaches:
wherein the first and second mirrors are microelectromechanical (MEMS) mirrors (Druml: [0034] “It will be further appreciated that a LIDAR scanning system may include multiple scanning mirrors 12 in a Lissajous scanning system (i.e., a 2×1D system), where a first 1D MEMS mirror has a single scanning axis for steering a light beam in a horizontal scanning direction and a second 1D MEMS mirror has a single scanning axis for steering the light beam in a vertical scanning direction.”).
Claims 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Carr in view of Druml.
Regarding claim 11, Carr teaches:
A method comprising: providing an array of sensors arranged according to a grid, a sensor in the array of sensors configured to sense a light pulse incident on the sensor in response to reflection of at least one light pulse of a beam of light pulses from a field of view (FOV) region in a target (FIGS. 9-10 show a ‘grid’ arrangement. [0030] “The sensor may comprise a plurality of pixels”; [0136] “As shown in FIG. 9, each spot 201 that reflects off the subject, and is received by sensor 108, illuminates a corresponding area on the surface 302 of the sensor that detects light—referred to as the sensor surface from here on out.”);
partitioning the FOV region into grid cells according to the grid ([0136] “The sensor surface 302 represents that sensor's field of view, and is thus a representation of the scene from the point of view of the sensor 108. Thus, the position of the spot 201 on the sensor surface 302 corresponds to a position of the spot in the scene, or in other words, a position of the spot on the subject 104.”);
projecting at least one beam spot per grid cell in the FOV region by driving a beam steering arrangement to vary a direction of transmission of light pulses […] ([0054] “Further, the above set up allows for a spot illumination to illuminate an area of the sensor that is within the bounds of a pixel and to be moved to different locations within the bounds of that single pixel.”); and
receiving a signal from each sensor in the array of sensors indicative of at least one echo light pulse reflected from a respective grid cell in the FOV region ([0142] “When the spot pattern is incident on the sensor surface 302, the data for each pixel 304 is read out by the sensor 108 and provided to the processor 125 for image processing. The data for each pixel 304 provides information on the illumination received by that pixel.”).
Carr further contemplates a range of beam-steering options ([0019] “The optical system may move the spot illumination in the scanning pattern using the at least one actuator. The at least one actuator may be any suitable actuation mechanism for incorporation into the sensor system… Additionally or alternatively, the at least one actuator may comprise a voice coil motor (VCM), an optical image stabilisation actuator, an auto-focus actuator, a 4 wires SMA actuator or 8 wires SMA actuator, a module tilt actuator, or an adaptive beam-steering mechanism for steering the non-uniform illumination (spot illumination). The at least one actuator may be arranged to move the emitted non-uniform illumination by moving any one of the following components which may be comprised in an apparatus present the time-of-flight sensor system: a lens, a prism, a mirror, a dot projector, and a light source. The at least one actuator may provide at least one degree of freedom to provide the scanning pattern, for example two degrees of freedom.”). However, Carr does not explicitly teach:
projecting at least one beam spot per grid cell in the FOV region by driving a beam steering arrangement to vary a direction of transmission of light pulses in a Lissajous pattern;
Druml, in the same field of endeavor, teaches an arrangement of two MEMS scanning mirrors, as is conventional in the art:
projecting at least one beam spot per grid cell in the FOV region by driving a beam steering arrangement to vary a direction of transmission of light pulses in a Lissajous pattern ([0034] “It will be further appreciated that a LIDAR scanning system may include multiple scanning mirrors 12 in a Lissajous scanning system (i.e., a 2×1D system), where a first 1D MEMS mirror has a single scanning axis for steering a light beam in a horizontal scanning direction and a second 1D MEMS mirror has a single scanning axis for steering the light beam in a vertical scanning direction.”);
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have used the conventional MEMS beam-steering system, as highlighted in Druml, for the beam-steering mechanism of Carr, as one of the known and predictable choices in the art.
Regarding claim 12, Carr in view of Druml teaches the method of claim 11, as described above, and further teaches:
wherein the driving the beam steering arrangement to vary the direction of transmission of the light pulses includes driving the beam steering arrangement to cyclically vary the direction of transmission of the light pulses within each grid cell in the FOV (Carr: [0054] “Further, the above set up allows for a spot illumination to illuminate an area of the sensor that is within the bounds of a pixel and to be moved to different locations within the bounds of that single pixel.”).
Regarding claim 13, Carr in view of Druml teaches the method of claim 11, as described above, and further teaches:
comprising: driving the beam steering arrangement to cyclically vary the direction of transmission of the light pulses according to the Lissajous pattern (Druml: [0034] “It will be further appreciated that a LIDAR scanning system may include multiple scanning mirrors 12 in a Lissajous scanning system (i.e., a 2×1D system), where a first 1D MEMS mirror has a single scanning axis for steering a light beam in a horizontal scanning direction and a second 1D MEMS mirror has a single scanning axis for steering the light beam in a vertical scanning direction.”).
Regarding claim 14, Carr in view of Druml teaches the method of claim 11, as described above, and further teaches:
comprising: calculating a measurement of a distance of the target from the array of sensors based on the signals received (Carr: [0143] “The information is used by the processor 125 to form an output frame—in particular, a depth map—that provides time of flight information (e.g. depth information) regarding the subject in the scene. The output frame comprises frame pixels, each frame pixel reflecting the illumination data captured by the corresponding pixel on the sensor surface 302.”).
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Carr in view of Druml, as evidenced by Man, and further in view of Gao et al. (WO 2021136105 A1), hereinafter Gao.
Regarding claim 6, Carr in view of Druml teaches the apparatus of claim 1, as described above. Carr further contemplates a range of beam-steering options ([0019] “The optical system may move the spot illumination in the scanning pattern using the at least one actuator. The at least one actuator may be any suitable actuation mechanism for incorporation into the sensor system… Additionally or alternatively, the at least one actuator may comprise a voice coil motor (VCM), an optical image stabilisation actuator, an auto-focus actuator, a 4 wires SMA actuator or 8 wires SMA actuator, a module tilt actuator, or an adaptive beam-steering mechanism for steering the non-uniform illumination (spot illumination). The at least one actuator may be arranged to move the emitted non-uniform illumination by moving any one of the following components which may be comprised in an apparatus present the time-of-flight sensor system: a lens, a prism, a mirror, a dot projector, and a light source. The at least one actuator may provide at least one degree of freedom to provide the scanning pattern, for example two degrees of freedom.”).
Carr does not explicitly teach an optical phased array:
wherein the beam steering arrangement comprises an optical phased array.
Gao, in the same field of endeavor, teaches the use of an optical phased array for lidar beam-steering ([0639] “As shown in FIG 49, the optical element 230 is an OPA device, which can deflect the incident light beam to obtain an outgoing light beam whose scanning direction matches the control signal.”).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have used the optical phased array of Gao for the steering mechanism of Carr, as one of the known and predictable options for beam-steering.
Claims 7 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Carr in view of Druml, as evidenced by Man, and further in view of Pacala et al. (US 20180152691 A1), hereinafter Pacala.
Regarding claim 7, Carr in view of Druml teaches the apparatus of claim 1, as described above, but does not explicitly teach:
wherein the sensors comprise an avalanche photodiode (APD) or a single photon avalanche photodiode (SPAD).
Pacala, in the same field of endeavor, teaches that the pixels of a detector may be an array of SPADs ([0039] “In one implementation, a first pixel 171 in the set of pixels 170 includes an array of single-photon avalanche diode detectors (hereinafter “SPADs”)”).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have used SPAD arrays of Pacala as the pixels for the sensor of Carr, as one of the known and predictable choices of detector.
Regarding claim 8, Carr in view of Druml teaches the apparatus of claim 1, as described above, but does not teach:
comprising: a diffusive optical element coupled to the array of sensors, the diffusive optical element arranged between the target and the array of sensors;
SPAD sensors in the array of sensors configured to provide a joint signal indicative of a time of incidence of at least one light pulse in a joint area of respective grid cells,
wherein the diffusive optical element is configured to split the light pulse incident on the diffusive optical element into photons and to direct the photons towards respective SPAD sensors.
Pacala, in the same field of endeavor, teaches:
comprising: a diffusive optical element coupled to the array of sensors, the diffusive optical element arranged between the target and the array of sensors ([0039] “In one implementation, a first pixel 171 in the set of pixels 170 includes an array of single-photon avalanche diode detectors (hereinafter “SPADs”), and the diffuser 180 spreads lights rays—previously passed by a corresponding first aperture 141, collimated by a corresponding first lens 151, and passed by the optical filter 160—across the area of the first pixel 171, as shown in FIGS. 3, 5, and 6.”);
SPAD sensors in the array of sensors configured to provide a joint signal indicative of a time of incidence of at least one light pulse in a joint area of respective grid cells ([0023] “In one example, the aperture layer 140 can define apertures of diameters matched to a power output of illumination sources in the system and to a number and photon detection capacity of subpixel photodetectors in each pixel in the set of pixels 170 to achieve a target number of photons incident on each pixel within each sampling period.”),
wherein the diffusive optical element is configured to split the light pulse incident on the diffusive optical element into photons and to direct the photons towards respective SPAD sensors ([0039] “In one implementation, a first pixel 171 in the set of pixels 170 includes an array of single-photon avalanche diode detectors (hereinafter “SPADs”), and the diffuser 180 spreads lights rays… across the area of the first pixel 171, as shown in FIGS. 3, 5, and 6.”).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have used the SPAD arrays of Pacala as the pixels for the sensor of Carr, as one of the known and predictable choices of detector, and to have modified the system with the diffuser of Pacala to distribute the received light evenly across the subpixels.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Carr in view of Druml, as evidenced by Man, and further in view of Smith et al. (US 9285477 B1), hereinafter Smith.
Regarding claim 10, Carr in view of Druml teaches the apparatus of claim 1, as described above, but does not teach:
comprising at least one of:
a first optical element coupled to the beam steering arrangement, the first optical element interposed between the beam steering arrangement and the target, or
a second optical element coupled to the array of sensors, the second optical element interposed between the target and the array of sensors,
wherein the at least one of the first optical element or the second optical element is each configured to counter a Keystone-Pincushion deformation during projecting the at least one beam spot per grid cell in the FOV region.
However, the use of optical elements to correct for pincushion deformation is well-known in the art of light projection, see for example, Smith:
comprising at least one of:
a first optical element coupled to the beam steering arrangement, the first optical element interposed between the beam steering arrangement and the target, or a second optical element coupled to the array of sensors, the second optical element interposed between the target and the array of sensors ((Col. 5, Line 55 – Col. 6, Line 8) “A second potential distortion effect may be the distortion of the 2D field of view, into one having a pincushion distortion pattern… In accordance with an embodiment of the invention, a compensation mirror (e.g., the fixed mirror 8) may be inserted into the path of the outgoing beam, the incoming beam, or both, where this mirror is shaped appropriately in order to compensate or correct for both of the above-described forms of distortion.”),
wherein the at least one of the first optical element or the second optical element is each configured to counter a Keystone-Pincushion deformation during projecting the at least one beam spot per grid cell in the FOV region ((Col. 5, Line 55 – Col. 6, Line 8) “A second potential distortion effect may be the distortion of the 2D field of view, into one having a pincushion distortion pattern… In accordance with an embodiment of the invention, a compensation mirror (e.g., the fixed mirror 8) may be inserted into the path of the outgoing beam, the incoming beam, or both, where this mirror is shaped appropriately in order to compensate or correct for both of the above-described forms of distortion.”).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the time-of-flight system of Carr with the pincushion distortion correction of Smith to emit a more uniform grid.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SEAN C. GRANT whose telephone number is (571)272-0402. The examiner can normally be reached Monday - Friday, 9:30 am - 6:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Yuqing Xiao can be reached at (571)270-3603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SEAN C. GRANT/Examiner, Art Unit 3645
/YUQING XIAO/Supervisory Patent Examiner, Art Unit 3645