Prosecution Insights
Last updated: April 19, 2026
Application No. 17/446,844

METHOD FOR ASCERTAINING AN OPERATING PARAMETER FOR OPERATING A SURROUNDINGS DETECTION SYSTEM FOR A VEHICLE, AND SURROUNDINGS DETECTION SYSTEM

Non-Final OA (§102, §103)
Filed
Sep 03, 2021
Examiner
CLOUSER, BENJAMIN WADE
Art Unit
3645
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Robert Bosch GmbH
OA Round
3 (Non-Final)
36%
Grant Probability
At Risk
3-4
OA Rounds
4y 0m
To Grant
99%
With Interview

Examiner Intelligence

Grants only 36% of cases
36%
Career Allow Rate
5 granted / 14 resolved
-16.3% vs TC avg
Strong +75% interview lift
+75.0%
Interview Lift
resolved cases with vs. without interview
Typical timeline
4y 0m
Avg Prosecution
39 currently pending
Career history
53
Total Applications
across all art units
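
For orientation, the headline numbers on these cards are simple ratios. Below is a minimal sketch of the arithmetic in Python, assuming the allow rate is grants divided by resolved cases and the interview lift is a relative change in that rate; both formulas are assumptions, since the page does not publish its methodology.

    # Sketch of the dashboard arithmetic (assumed formulas, not the vendor's).
    granted, resolved = 5, 14
    allow_rate = granted / resolved              # 0.357 -> displayed as 36%

    lift = 0.75                                  # "+75.0% Interview Lift"
    with_interview = allow_rate * (1 + lift)     # ~0.625 under a relative-lift reading
    print(f"career allow rate: {allow_rate:.1%}")
    print(f"with interview (relative-lift reading): {with_interview:.1%}")

Note that a simple relative lift on the 36% base yields roughly 63%, not the 99% "With Interview" figure shown elsewhere on this page, so the tool's interview-adjusted probability evidently folds in more than the raw career rate.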

Statute-Specific Performance

§101: 0.9% (-39.1% vs TC avg)
§103: 58.5% (+18.5% vs TC avg)
§102: 27.1% (-12.9% vs TC avg)
§112: 13.6% (-26.4% vs TC avg)
Tech Center averages are estimates • Based on career data from 14 resolved cases

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claim 9 is objected to because of the following informalities: “a predetermined having predefined geometric properties light pattern” should read “a predetermined light pattern having predefined geometric properties”. Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-5 and 7-11 are rejected under 35 U.S.C. 103 as being unpatentable over Kageyama (US 2020/0314347) in view of O’Keefe (US 2018/0059248 A1).

Regarding Claim 1, Kageyama discloses a method for ascertaining an operating parameter for operating a surroundings detection system (Figure 13 shows the method for processing the measured phase difference, which is ultimately used to control the lens apparatus of the device) for a vehicle ([0001] discloses operation on a vehicle; Figure 41 shows potential mounting points for the sensor apparatus on the vehicle), the surroundings detection system including a projection unit and an image recording unit ([0412]: “The imaging section 2410 includes at least one of a time of flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, or other cameras.” ToF cameras are well-known in the art, and typically include an emitter which emits short pulses in a pattern over the field of regard, and a camera which receives the background and reflected light.), the method comprising the following steps: providing a projection signal to an interface to the projection unit, the projection signal including a control parameter for projecting a light pattern into a surrounding area of the vehicle (Figure 40 shows a block diagram of the vehicle illustrating an example of a schematic configuration of a vehicle control system. In it, the communication network (2010) connects the integrated control unit (2600) to the vehicle outside information detecting unit (2400) and the imaging section (2410), which is the ToF camera in the embodiment here. Thus, the control unit can provide a control parameter to the ToF camera in the device of Kageyama); reading in image data via an interface to the image recording unit, the image data including the light pattern projected into the surrounding area (Figure 11 shows the imaging control unit 301, which accepts light from the surroundings through lens 311, reads in the incident light with the image sensor 41, processes it, and eventually records it into the recording unit 314. Note that [0432] indicates that the entire electronic apparatus 300 of Figure 11 can be implemented within the integrated control unit 2600 of Figure 40); and processing the image data, using a processing specification, to ascertain the operating parameter (Figure 13 shows the method for processing the measured phase difference, which is ultimately used to control the lens apparatus of the device. The steps of Figure 13 are described in [0182]-[0198], and include S21, in which “the phase difference detection unit 326 calculates the focus deviation amount.”), wherein the operating parameter is representative of a calibration state or operating impairment of the surroundings detection system (Figure 13, step S21 “CALCULATE FOCUS DEVIATION AMOUNT” is described in [0197]: “a final focus deviation amount is calculated on the basis of the phase difference obtained from the pair of image signals determined of which the reliability is high in step S20.” Under the broadest reasonable interpretation, the examiner identifies the ‘final focus deviation amount’ of Kageyama with an operating impairment.).

Kageyama does not teach, and O’Keefe does teach, wherein the projection signal ([0004]: “Embodiments of the present disclosure provide a laser range finder (e.g. a LIDAR) comprising one or more steerable lasers that non-uniformly scans a FOV based on laser steering parameters.”) comprises a predetermined light pattern having predefined geometric properties ([0004]: “Laser steering parameters (e.g. instructions) are formulated using sensor data from the local environment. The laser steering parameters can function to configure or instruct a steerable laser in a LIDAR to scan a FOV, with a dynamic angular velocity, thereby creating a complex shaped region of increased or non-uniform laser pulse density during the scan.”; Under the broadest reasonable interpretation, the ‘instructions or laser steering parameters’ of O’Keefe amount to a predetermination of the light pattern. Scanning a given field of view inherently gives the light pattern geometric properties, in this case a boundary, and changing the density over the field of view imparts further geometric properties).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Kageyama with the teaching of O’Keefe to implement a projection signal with a predetermined light pattern with predefined geometric properties. O’Keefe notes in [0020] that “the disclosed techniques can improve the cost effectiveness of a laser range finder,” and that “the disclosed techniques enable a laser range finder with a smaller total number of laser pulses per second to distribute those laser pulses in a dynamic and intelligently selected manner.” The lower power requirements can translate into lower cost, which is highly desirable to end users.
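
Stripped of its citations, claim 1 recites a three-step loop: provide a projection signal, read back image data, and process that data into an operating parameter. The sketch below illustrates that control flow only; every name in it is hypothetical, and none of it is code from the application or from either cited reference.

    # Hypothetical sketch of the claimed three-step method (Claim 1).
    from dataclasses import dataclass

    @dataclass
    class ProjectionSignal:
        pattern: str       # the predetermined light pattern, e.g. a dot grid
        intensity: float   # a control parameter for the projection unit

    def ascertain_operating_parameter(projector, camera, processing_spec):
        # Step 1: provide a projection signal to the projection-unit interface.
        projector.send(ProjectionSignal(pattern="dot_grid", intensity=0.8))
        # Step 2: read in image data containing the projected pattern.
        image = camera.read()
        # Step 3: apply the processing specification; the result represents a
        # calibration state or an operating impairment of the system.
        return processing_spec(image)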
Regarding Claim 2, Kageyama in view of O’Keefe discloses all the limitations of Claim 1 as discussed above, and Kageyama further discloses wherein, in the processing step, the operating parameter is ascertained (Figure 13, step S21 “CALCULATE FOCUS DEVIATION AMOUNT” is described in [0197]: “a final focus deviation amount is calculated on the basis of the phase difference obtained from the pair of image signals determined of which the reliability is high in step S20.”), which is configured to effectuate a refocusing of the image recording unit ([0198]: “In step S22, the lens control unit 327 controls the driving of the lens 311 on the basis of the focus determination result from the phase difference detection unit 326.”; [0175]: “Specifically, the lens control unit 327 calculates a drive amount of the lens 311 on the basis of the focus determination result supplied from the phase difference detection unit 326 and moves the lens 311 according to the calculated drive amount.”; The defocus amount is therefore used to improve the focusing of the imager.).

Regarding Claim 3, Kageyama in view of O’Keefe discloses all the limitations of Claim 1 as discussed above, and Kageyama further discloses wherein, in the providing step, the projection signal is provided to project the light pattern, which represents a light point structure and/or a light strip structure and/or a light point cloud and/or another geometric structure ([0412] discloses that the imager may be a time of flight camera. These are well-known in the art to utilize light point clouds spread over the imager’s field of regard to generate range or depth maps of their surroundings).

Regarding Claim 4, Kageyama in view of O’Keefe discloses all the limitations of Claim 1 as discussed above, and Kageyama further discloses wherein, in the processing step, the processing specification effectuates a comparison of at least one image parameter of the image data to a stored empirical value to obtain a comparison result ([0174]: “The phase difference correction unit 325 corrects the phase difference detected by the phase difference detection unit 326 using the correction parameter stored in the memory 324.” The phase difference is calculated directly from the image pixels (see Figure 13) and constitutes the image parameter of the instant application), the operating parameter being ascertained using the comparison result ([0175]: “Specifically, the lens control unit 327 calculates a drive amount of the lens 311 on the basis of the focus determination result supplied from the phase difference detection unit 326 and moves the lens 311 according to the calculated drive amount.”).

Regarding Claim 5, Kageyama in view of O’Keefe discloses all the limitations of Claim 1 as discussed above, and Kageyama further discloses wherein, in the processing step, the processing specification effectuates a calculation of at least one blur value in an image represented by the image data, the operating parameter being ascertained using the blur value ([0173]: “In addition, in a case in which the focus object is not in focus, the phase difference detection unit 326 calculates the amount of the deviation of the focus (the defocus amount) and supplies information indicating the calculated defocus amount to the lens control unit 327 as the focus determination result.” Here the ‘defocus amount’ of Kageyama is reasonably construed to be the ‘blur value’ of the instant application, which is used to refocus the lens system).
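
Claims 4 and 5 turn on comparing an image parameter against a stored empirical value and on calculating a blur value. The claims do not prescribe a particular metric; variance of the Laplacian is one common blur proxy, so the sketch below adopts it purely as an illustration, with a made-up threshold.

    import numpy as np

    # Variance-of-Laplacian as an assumed blur value (Claim 5), compared
    # against a hypothetical stored empirical value (Claim 4).
    LAPLACIAN = np.array([[0,  1, 0],
                          [1, -4, 1],
                          [0,  1, 0]], dtype=float)

    def blur_value(image: np.ndarray) -> float:
        # Valid-region 2D convolution with the Laplacian kernel.
        h, w = image.shape
        out = np.zeros((h - 2, w - 2))
        for dy in range(3):
            for dx in range(3):
                out += LAPLACIAN[dy, dx] * image[dy:dy + h - 2, dx:dx + w - 2]
        return float(out.var())   # low edge energy -> low variance -> blurrier

    STORED_EMPIRICAL_VALUE = 50.0  # hypothetical in-focus baseline

    def needs_refocus(image: np.ndarray) -> bool:
        return blur_value(image) < STORED_EMPIRICAL_VALUE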
Regarding Claim 7, Kageyama in view of O’Keefe discloses all the limitations of Claim 1 as discussed above, and Kageyama further discloses wherein, in the providing step, the projection signal is provided to project the light pattern into an object space in the surrounding area of the vehicle (Figure 41 shows several possible mounting points on a vehicle for a detection unit and imaging unit, as well as several imaging ranges (a-d) into the vehicle surroundings associated with each mounting position).

Regarding Claim 8, Kageyama in view of O’Keefe discloses all the limitations of Claim 1 as discussed above, and Kageyama further discloses wherein the steps of the method are carried out repeatedly and/or continuously (Referring to the steps S11-S22 outlined in Figure 13, [0199] states that: “The focus control is performed as described above. This processing is repeatedly executed as occasion demands”).

Regarding Claim 9, Kageyama discloses a control unit configured to ascertain an operating parameter for operating a surroundings detection system for a vehicle (Figure 11 shows the imaging control unit 301, which accepts light from the surroundings through lens 311, reads in the incident light with the image sensor 41, processes it, and eventually records it into the recording unit 314. Note that [0432] indicates that the entire electronic apparatus 300 of Figure 11 can be implemented within the integrated control unit 2600 of Figure 40), the surroundings detection system including a projection unit and an image recording unit ([0412]: “The imaging section 2410 includes at least one of a time of flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, or other cameras.” ToF cameras are well-known in the art, and typically include an emitter which emits short pulses in a pattern over the field of regard, and a camera which receives the background and reflected light.), the control unit configured to: provide a projection signal to an interface to the projection unit, the projection signal including a control parameter for projecting a light pattern into a surrounding area of the vehicle (Figure 40 shows a block diagram of the vehicle illustrating an example of a schematic configuration of a vehicle control system. In it, the communication network (2010) connects the integrated control unit (2600) to the vehicle outside information detecting unit (2400) and the imaging section (2410), which is the ToF camera in the embodiment here. Thus, the control unit can provide a control parameter to the ToF camera in the device of Kageyama); read in image data via an interface to the image recording unit, the image data including the light pattern projected into the surrounding area (Figure 11 shows the imaging control unit 301, which accepts light from the surroundings through lens 311, reads in the incident light with the image sensor 41, processes it, and eventually records it into the recording unit 314. Note that [0432] indicates that the entire electronic apparatus 300 of Figure 11 can be implemented within the integrated control unit 2600 of Figure 40); and process the image data, using a processing specification, to ascertain the operating parameter (Figure 13 shows the method for processing the measured phase difference, which is ultimately used to control the lens apparatus of the device. The steps of Figure 13 are described in [0182]-[0198], and include S21, in which “the phase difference detection unit 326 calculates the focus deviation amount.”), wherein the operating parameter is representative of a calibration state or operating impairment of the surroundings detection system (Figure 13, step S21 “CALCULATE FOCUS DEVIATION AMOUNT” is described in [0197]: “a final focus deviation amount is calculated on the basis of the phase difference obtained from the pair of image signals determined of which the reliability is high in step S20.” Under the broadest reasonable interpretation, the examiner identifies the ‘final focus deviation amount’ of Kageyama with an operating impairment.).

Kageyama does not teach, and O’Keefe does teach, wherein the projection signal ([0004]: “Embodiments of the present disclosure provide a laser range finder (e.g. a LIDAR) comprising one or more steerable lasers that non-uniformly scans a FOV based on laser steering parameters.”) comprises a predetermined light pattern having predefined geometric properties ([0004]: “Laser steering parameters (e.g. instructions) are formulated using sensor data from the local environment. The laser steering parameters can function to configure or instruct a steerable laser in a LIDAR to scan a FOV, with a dynamic angular velocity, thereby creating a complex shaped region of increased or non-uniform laser pulse density during the scan.”; Under the broadest reasonable interpretation, the ‘instructions or laser steering parameters’ of O’Keefe amount to a predetermination of the light pattern. Scanning a given field of view inherently gives the light pattern geometric properties, in this case a boundary, and changing the density over the field of view imparts further geometric properties).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Kageyama with the teaching of O’Keefe to implement a projection signal with a predetermined light pattern with predefined geometric properties. O’Keefe notes in [0020] that “the disclosed techniques can improve the cost effectiveness of a laser range finder,” and that “the disclosed techniques enable a laser range finder with a smaller total number of laser pulses per second to distribute those laser pulses in a dynamic and intelligently selected manner.” The lower power requirements can translate into lower cost, which is highly desirable to end users.

Regarding Claim 10, Kageyama discloses a non-transitory machine-readable memory medium on which is stored a computer program for ascertaining an operating parameter for operating a surroundings detection system for a vehicle ([0434]: “Note that a computer program for realizing each function of the electronic apparatus 300 described using FIG. 11 is able to be mounted on any control unit or the like. In addition, it is also possible to provide a computer readable recording medium in which such a computer program is stored.”; [0421]: “The storage section 2690 may include a read only memory (ROM) that stores a variety of programs to be executed by a microcomputer”), the surroundings detection system including a projection unit and an image recording unit, the computer program, when executed by a computer, causing the computer to perform the following steps ([0407]: “Each control unit includes a microcomputer that performs operation processing in accordance with a variety of programs, a storage section that stores the programs, parameters used for the variety of operations, or the like executed by the microcomputer, and a driving circuit that drives devices subjected to various types of control”): providing a projection signal to an interface to the projection unit, the projection signal including a control parameter for projecting a light pattern into a surrounding area of the vehicle (Figure 40 shows a block diagram of the vehicle illustrating an example of a schematic configuration of a vehicle control system. In it, the communication network (2010) connects the integrated control unit (2600) to the vehicle outside information detecting unit (2400) and the imaging section (2410), which is the ToF camera in the embodiment here. Thus, the control unit can provide a control parameter to the ToF camera in the device of Kageyama); reading in image data via an interface to the image recording unit, the image data including the light pattern projected into the surrounding area (Figure 11 shows the imaging control unit 301, which accepts light from the surroundings through lens 311, reads in the incident light with the image sensor 41, processes it, and eventually records it into the recording unit 314. Note that [0432] indicates that the entire electronic apparatus 300 of Figure 11 can be implemented within the integrated control unit 2600 of Figure 40); and processing the image data, using a processing specification, to ascertain the operating parameter (Figure 13 shows the method for processing the measured phase difference, which is ultimately used to control the lens apparatus of the device. The steps of Figure 13 are described in [0182]-[0198], and include S21, in which “the phase difference detection unit 326 calculates the focus deviation amount.”), wherein the operating parameter is representative of a calibration state or operating impairment of the surroundings detection system (Figure 13, step S21 “CALCULATE FOCUS DEVIATION AMOUNT” is described in [0197]: “a final focus deviation amount is calculated on the basis of the phase difference obtained from the pair of image signals determined of which the reliability is high in step S20.” Under the broadest reasonable interpretation, the examiner identifies the ‘final focus deviation amount’ of Kageyama with an operating impairment.).

Kageyama does not teach, and O’Keefe does teach, wherein the projection signal ([0004]: “Embodiments of the present disclosure provide a laser range finder (e.g. a LIDAR) comprising one or more steerable lasers that non-uniformly scans a FOV based on laser steering parameters.”) comprises a predetermined light pattern having predefined geometric properties ([0004]: “Laser steering parameters (e.g. instructions) are formulated using sensor data from the local environment. The laser steering parameters can function to configure or instruct a steerable laser in a LIDAR to scan a FOV, with a dynamic angular velocity, thereby creating a complex shaped region of increased or non-uniform laser pulse density during the scan.”; Under the broadest reasonable interpretation, the ‘instructions or laser steering parameters’ of O’Keefe amount to a predetermination of the light pattern. Scanning a given field of view inherently gives the light pattern geometric properties, in this case a boundary, and changing the density over the field of view imparts further geometric properties).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Kageyama with the teaching of O’Keefe to implement a projection signal with a predetermined light pattern with predefined geometric properties. O’Keefe notes in [0020] that “the disclosed techniques can improve the cost effectiveness of a laser range finder,” and that “the disclosed techniques enable a laser range finder with a smaller total number of laser pulses per second to distribute those laser pulses in a dynamic and intelligently selected manner.” The lower power requirements can translate into lower cost, which is highly desirable to end users.

Regarding Claim 11, Kageyama discloses a surroundings detection system for a vehicle, the surroundings detection system comprising: a projection unit configured to project a light pattern into a surrounding area of the vehicle ([0412]: “The imaging section 2410 includes at least one of a time of flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, or other cameras.” ToF cameras are well-known in the art, and typically include an emitter which emits short pulses in a pattern over the field of regard, and a camera which receives the background and reflected light.); an image recording unit configured to record image data which represent the light pattern projected into the surrounding area (Figure 11 shows the imaging control unit 301, which accepts light from the surroundings through lens 311, reads in the incident light with the image sensor 41, processes it, and eventually records it into the recording unit 314. Note that [0432] indicates that the entire electronic apparatus 300 of Figure 11 can be implemented within the integrated control unit 2600 of Figure 40); and a control unit connected to the projection unit and the image recording unit in a signal transfer-capable manner, the control unit configured to ascertain an operating parameter for operating the surroundings detection system for the vehicle ([0407]: “Each control unit includes a microcomputer that performs operation processing in accordance with a variety of programs, a storage section that stores the programs, parameters used for the variety of operations, or the like executed by the microcomputer, and a driving circuit that drives devices subjected to various types of control”), the control unit configured to: provide a projection signal to an interface to the projection unit, the projection signal including a control parameter for projecting a light pattern into a surrounding area of the vehicle (Figure 40 shows a block diagram of the vehicle illustrating an example of a schematic configuration of a vehicle control system. In it, the communication network (2010) connects the integrated control unit (2600) to the vehicle outside information detecting unit (2400) and the imaging section (2410), which is the ToF camera in the embodiment here. Thus, the control unit can provide a control parameter to the ToF camera in the device of Kageyama); read in image data via an interface to the image recording unit, the image data including the light pattern projected into the surrounding area (Figure 11 shows the imaging control unit 301, which accepts light from the surroundings through lens 311, reads in the incident light with the image sensor 41, processes it, and eventually records it into the recording unit 314. Note that [0432] indicates that the entire electronic apparatus 300 of Figure 11 can be implemented within the integrated control unit 2600 of Figure 40); and process the image data, using a processing specification, to ascertain the operating parameter (Figure 13 shows the method for processing the measured phase difference, which is ultimately used to control the lens apparatus of the device. The steps of Figure 13 are described in [0182]-[0198], and include S21, in which “the phase difference detection unit 326 calculates the focus deviation amount.”), wherein the operating parameter is representative of a calibration state or operating impairment of the surroundings detection system (Figure 13, step S21 “CALCULATE FOCUS DEVIATION AMOUNT” is described in [0197]: “a final focus deviation amount is calculated on the basis of the phase difference obtained from the pair of image signals determined of which the reliability is high in step S20.” Under the broadest reasonable interpretation, the examiner identifies the ‘final focus deviation amount’ of Kageyama with an operating impairment.).

Kageyama does not teach, and O’Keefe does teach, wherein the projection signal ([0004]: “Embodiments of the present disclosure provide a laser range finder (e.g. a LIDAR) comprising one or more steerable lasers that non-uniformly scans a FOV based on laser steering parameters.”) comprises a predetermined light pattern having predefined geometric properties ([0004]: “Laser steering parameters (e.g. instructions) are formulated using sensor data from the local environment. The laser steering parameters can function to configure or instruct a steerable laser in a LIDAR to scan a FOV, with a dynamic angular velocity, thereby creating a complex shaped region of increased or non-uniform laser pulse density during the scan.”; Under the broadest reasonable interpretation, the ‘instructions or laser steering parameters’ of O’Keefe amount to a predetermination of the light pattern. Scanning a given field of view inherently gives the light pattern geometric properties, in this case a boundary, and changing the density over the field of view imparts further geometric properties).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Kageyama with the teaching of O’Keefe to implement a projection signal with a predetermined light pattern with predefined geometric properties. O’Keefe notes in [0020] that “the disclosed techniques can improve the cost effectiveness of a laser range finder,” and that “the disclosed techniques enable a laser range finder with a smaller total number of laser pulses per second to distribute those laser pulses in a dynamic and intelligently selected manner.” The lower power requirements can translate into lower cost, which is highly desirable to end users.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Kageyama in view of O’Keefe as applied to Claim 5, and in view of Nelson (US 2008/0123961).

Regarding Claim 6, Kageyama discloses all the limitations of Claim 5 as discussed in the analysis above. Kageyama does not teach, and Nelson does teach, wherein, in the processing step, the processing specification is configured to ascertain an impulse response function ([0061]: “The point spread function represents the impulse response of the system…” and is determined by the method disclosed in [0061]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Kageyama with the teaching of Nelson to determine the point spread function and impulse response of the system. Nelson notes in [0060], in reference to the determination of the point spread function, that “These procedures, when embedded in software, may ensure improved evaluation accuracy when compared to conventional counting and interpolation approaches, improving the quality of data correlations considerably.” Such procedures would be advantageous in the device of Kageyama, which seeks to properly tune the focal length of the system, and the instant application, which seeks to properly adjust an operating parameter of the device.

Claims 12 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Kageyama in view of O’Keefe as applied to Claim 11, and in view of Chang (US 2022/0229183).

Regarding Claim 12, Kageyama in view of O’Keefe teaches all the limitations of Claim 11 as discussed in the analysis above. Although Kageyama does teach the inclusion of a LiDAR system in the ‘surrounding information detecting sensor’ described in [0413], it does not teach that a LiDAR is contained in the ‘vehicle outside information detecting device’, which is taken to be the projection unit in the analysis of Claim 11. Chang does teach wherein the projection unit is a LiDAR system ([0070]: “FIG. 1 is a side-view schematic of a scene 100 with a full-field laser-illumination LiDAR system 101, according to some embodiments of the present invention”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the projection unit of Kageyama with the teaching of Chang to use a LiDAR device. Chang notes in [0019] that LiDAR devices are “one of the key sensors for autonomous driving.” Chang also notes that LiDAR sensors “emit invisible laser-light beams to scan and detect objects in the near or far vicinity of the sensors and create a three-dimensional (3D) map of the surroundings environment,” and thereby encompass the full functionality provided by the projection unit of Kageyama.

Regarding Claim 13, Kageyama in view of O’Keefe teaches all the limitations of Claim 11 as discussed in the analysis above. Kageyama and O’Keefe do not teach, and Chang does teach, wherein the projection unit is situated adjoining or integrated into a headlight of the vehicle or adjoining the image recording unit (Figure 17B shows a side-view diagram of a combined LiDAR and smart headlight with scanned laser-pumped illumination system 1702 that utilizes a two-dimensional MEMS mirror system 1501). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the projection unit of Kageyama with the teaching of Chang to integrate it into the headlight of a vehicle. Chang notes in [0026] that “There is a need in the art for an improved smart headlight and method, and a combined vehicle smart headlight and LiDAR system and method,” and further notes in [0128] that “In comparison with most existing LiDAR sensors installed on the top of the vehicle in automotive applications, the advantages of the novel LiDAR-embedded laser headlight of the present invention are free of close-range dead angle (data unavailability at close range), prevention of dust collection and water corrosion, and easy set-up of the electrical system in the LiDAR sensors.”

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure. Segawa (US 2010/0054723 A1) discloses a focus adjusting apparatus for adjusting the focus of a camera imaging a moving object. Segawa does not disclose any type of projection apparatus. Huang (US 10726579 B1) discloses a calibration system and method for LiDAR-camera devices. The sensor may be calibrated while mounted on an operational vehicle.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENJAMIN WADE CLOUSER, whose telephone number is (571) 272-0378. The examiner can normally be reached M-F 7:30-5:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ISAM ALSOMIRI, can be reached at (571) 272-6970. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/B.W.C./
Examiner, Art Unit 3645

/ISAM A ALSOMIRI/
Supervisory Patent Examiner, Art Unit 3645
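
As background for the claim 6 rejection above: a point spread function is the imaging system's spatial impulse response, and if the projected test pattern is known, the PSF can in principle be recovered by deconvolution. The toy 1-D sketch below runs on synthetic data only; it illustrates the general idea, not Nelson's disclosed procedure, and a real system would need noise regularization (e.g. Wiener filtering).

    import numpy as np

    # Toy PSF (impulse-response) recovery by FFT deconvolution on synthetic,
    # noiseless 1-D data. Illustrative only; not Nelson's method.
    rng = np.random.default_rng(0)
    n = 64
    pattern = rng.random(n)                               # known projected pattern
    true_psf = np.exp(-0.5 * (np.arange(n) - 3.0) ** 2)   # hypothetical blur kernel
    true_psf /= true_psf.sum()

    # Circular convolution models the blurred observation.
    observed = np.real(np.fft.ifft(np.fft.fft(pattern) * np.fft.fft(true_psf)))

    # Deconvolve: F^-1( F(observed) / F(pattern) ) recovers the PSF exactly
    # in this noiseless setting.
    est_psf = np.real(np.fft.ifft(np.fft.fft(observed) / np.fft.fft(pattern)))
    print("max PSF recovery error:", float(np.abs(est_psf - true_psf).max()))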

Prosecution Timeline

Sep 03, 2021
Application Filed
May 02, 2025
Non-Final Rejection — §102, §103
Aug 12, 2025
Response Filed
Aug 19, 2025
Final Rejection — §102, §103
Dec 02, 2025
Response after Non-Final Action
Dec 15, 2025
Request for Continued Examination
Dec 21, 2025
Response after Non-Final Action
Dec 30, 2025
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12541026
COHERENT LIDAR IMAGING SYSTEM
2y 5m to grant • Granted Feb 03, 2026
Patent 12535581
DISTANCE MEASURING DEVICE AND DISTANCE MEASURING METHOD
2y 5m to grant • Granted Jan 27, 2026
Patent 12504520
APPARATUS, PROCESSING CIRCUITRY AND METHOD FOR MEASURING DISTANCE FROM DIRECT TIME OF FLIGHT SENSOR ARRAY TO AN OBJECT
2y 5m to grant • Granted Dec 23, 2025
Patent 12474568
SYSTEM AND METHOD FOR COHERENT APERTURE OF STEERED EMITTERS
2y 5m to grant • Granted Nov 18, 2025
Study what changed to get past this examiner. Based on 4 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
36%
Grant Probability
99%
With Interview (+75.0%)
4y 0m
Median Time to Grant
High
PTA Risk
Based on 14 resolved cases by this examiner. Grant probability derived from career allow rate.
