Prosecution Insights
Last updated: April 19, 2026
Application No. 18/275,808

CONFIGURATION CONTROL CIRCUITRY AND CONFIGURATION CONTROL METHOD

Non-Final OA: §103, §112
Filed
Aug 04, 2023
Examiner
HELLNER, MARK
Art Unit
3645
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Sony Semiconductor Solutions Corporation
OA Round
1 (Non-Final)
Grant Probability: 91% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 10m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 91% (above average; 1339 granted / 1477 resolved; +38.7% vs TC avg)
Interview Lift: +8.2% (moderate lift, measured across resolved cases with vs. without an interview)
Typical Timeline: 2y 10m avg prosecution; 38 applications currently pending
Career History: 1515 total applications across all art units

Statute-Specific Performance

§101: 2.1% (-37.9% vs TC avg)
§103: 42.2% (+2.2% vs TC avg)
§102: 29.6% (-10.4% vs TC avg)
§112: 13.6% (-26.4% vs TC avg)
Tech Center averages are estimates; based on career data from 1477 resolved cases.
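The "vs TC avg" deltas in this table can be reproduced from the examiner's per-statute rejection rates. A minimal sketch, assuming the deltas are simple differences in percentage points; note the Tech Center baseline used here (40.0%) is back-computed from the listed deltas themselves, not taken from USPTO data:

```python
# Per-statute rejection rates for this examiner (percent), from the table above.
examiner_rates = {"101": 2.1, "103": 42.2, "102": 29.6, "112": 13.6}

# Tech Center average implied by each listed delta (rate minus delta);
# every row backs out to 40.0%, so one baseline reproduces the whole table.
TC_AVG = 40.0

deltas = {s: round(r - TC_AVG, 1) for s, r in examiner_rates.items()}
# {"101": -37.9, "103": 2.2, "102": -10.4, "112": -26.4}
```

That every statute backs out to the same 40.0% baseline suggests the dashboard applies a single Tech Center estimate across all rejection types rather than a per-statute average.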

Office Action

Rejections: §103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statements filed 8/4/2023 and 1/29/2024 have been considered by the examiner.

Drawings

The drawings filed 8/4/2023 are approved by the examiner.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 10 and 20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. The neural network set forth by claims 10 and 20 does not have antecedent basis in parent claims 1 and 11. It is assumed that the neural network is part of the learning algorithm.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-5, 7-15 and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Sano et al (United States Patent Application Publication No. 2018/0054581) in view of Kim et al (United States Patent Application Publication No. 2018/0054581).

With respect to claim 1, Sano et al disclose: A configuration control circuitry for a time-of-flight system, the time-of-flight system comprising an illumination unit configured to emit light to a scene [taught by the infrared light emitting unit in figure 2] and an imaging unit configured to generate image data representing a time-of-flight measurement of light reflected from the scene [taught by the solid state imaging apparatus (20) in figure 2; figure 6; and paragraphs [0086] to [0091]], the configuration control circuitry being configured to: obtain the image data from the imaging unit and depth data representing a depth map of the scene, wherein the depth data is generated based on the image data [the solid state imaging apparatus (20) provides both depth (paragraphs [0086] to [0091]) and image (figures 7 and 8) data]; determine a set of configuration parameters for at least one of the illumination unit and the imaging unit, wherein the set of configuration parameters is determined with a learning algorithm, wherein the learning algorithm is based on a first sub-module and a second sub-module, wherein the first sub-module is configured to estimate, based on the obtained image data and the obtained depth data, a measurement indicator of the depth map, wherein the second sub-module is configured to estimate, based on the estimated measurement indicator, the set of configuration parameters for improving a subsequent time-of-flight measurement.

Sano et al does not teach configuration control circuitry arranged to determine a set of configuration parameters for at least one of the illumination unit and the imaging unit, wherein the set of configuration parameters is determined with a learning algorithm, wherein the learning algorithm is based on a first sub-module and a second sub-module, wherein the first sub-module is configured to estimate, based on the obtained image data and the obtained depth data, a measurement indicator of the depth map, wherein the second sub-module is configured to estimate, based on the estimated measurement indicator, the set of configuration parameters for improving a subsequent time-of-flight measurement.
With respect to this difference, Kim et al disclose: a processor using machine learning configured to determine a set of configuration parameters for at least one of the illumination unit and the imaging unit [the abstract teaches determining optimal values for parameters used in operation of an image processor], wherein the set of configuration parameters is determined with a learning algorithm [the device uses machine learning], wherein the learning algorithm is based on a first sub-module and a second sub-module [note, a first and second module reads on individual process steps performed by a processor], wherein the first sub-module is configured to estimate, based on the obtained image data and the obtained depth data, a measurement indicator of the depth map [the abstract states, “… inputting initial values for the plurality of parameters to a machine learning model having an input layer, corresponding to the plurality of parameters, and an output layer corresponding to a plurality of evaluation items extracted from a result image generated by the image signal processor…”], wherein the second sub-module is configured to estimate, based on the estimated measurement indicator, the set of configuration parameters for improving a subsequent time-of-flight measurement [the abstract states, “… obtaining evaluation scores for the plurality of evaluation items using an output of the machine learning model; adjusting weights, applied to the plurality of parameters, based on the evaluation scores; and determining the optimal values using the adjusted weights…”].

From the above, Kim et al teaches that it was known before the effective filing date of the present application to have used machine learning to extract image data and then to evaluate that data to determine operational parameters of an imaging device.
Therefore, it would have been obvious for a person of ordinary skill in the art to have had a reasonable expectation of success in using machine learning, as taught by Kim et al, to have configured the parameters of the combined image and depth system of Sano et al, when seeking to optimize device performance.

Claim 11 is rejected by the combination of Sano et al and Kim et al, as applied to claim 1.

With regard to claims 2 and 12, paragraph [0047] of Kim et al states, “…The image signal processor 121 may adjust a plurality of parameters associated with the raw data and signal-process the raw data according to the adjusted parameters to generate a result image. The parameters may include two or more of color, blurring, sharpness, noise, a contrast ratio, resolution, and a size…”. Therefore, claims 2 and 12 are met by the combination of Sano et al and Kim et al, as applied to claims 1 and 11.

With regard to claims 3 and 13, output power, illumination pattern and wavelength are all operational parameters that would have affected the image and depth values produced by the device of Sano et al. Therefore, claims 3 and 13 would have been an obvious reasonable expectation produced by the combination of Sano et al and Kim et al because Kim et al taught using machine learning to determine operational parameters of imaging devices.

With regard to claims 4 and 14, figure 6 of Sano et al teaches using a range signal with a modulation frequency and duty cycle. Therefore, claims 4 and 14 are met by the combination of Sano et al and Kim et al, as applied to claims 1 and 11, because figure 6 shows operational parameters.

With regard to claims 5 and 15, collecting the charge (701 and 702) in figure 6 of Sano et al would have required integration, thus rendering these claims obvious as a reasonable expectation of an ordinarily skilled engineer in processing charge for an array of detectors.
The device of Sano et al measures Z axis range across a 2D array, thus meeting claims 7 and 17. Therefore, claims 7 and 17 are met by the combination of Sano et al and Kim et al, as applied to claims 1 and 11.

The abstract of Kim et al teaches using initial values for the plurality of parameters for the machine learning model, thus teaching estimating a set of configuration parameters further based on a set of predetermined configuration parameters of at least one of the illumination unit and the imaging unit, when the teaching of Kim et al is applied to the solid state imaging device of Sano et al. Therefore, claims 8, 9, 18 and 19 are met by the combination of Sano et al and Kim et al, as applied to claims 1 and 11.

A neural network is an inherent part of machine learning. Therefore, claims 10 and 20 are met by the combination of Sano et al and Kim et al, as applied to claims 1 and 11, because the teaching of Kim et al applied to Sano et al would have operated on real time-of-flight data.

Claims 6 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Sano et al (United States Patent Application Publication No. 2018/0054581) in view of Kim et al (United States Patent Application Publication No. 2018/0054581), as applied to claims 1 and 11 above, and further in view of Na et al (United States Patent Application Publication No. 2022/0050206). Paragraph [0031] of Na et al teaches that it was known before the effective filing date of the present application that indirect and direct time of flight detection were interchangeable. Therefore, it would have been obvious for a person of ordinary skill in the art to have had a reasonable expectation of success in modifying the combination of Sano et al and Kim et al to have used direct time of flight detection because Na et al taught that this type of detection produced the same result.
Any inquiry concerning this communication should be directed to MARK HELLNER at telephone number (571) 272-6981. Examiner interviews are available via a variety of formats. See MPEP § 713.01. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

/MARK HELLNER/
Primary Examiner, Art Unit 3645

Prosecution Timeline

Aug 04, 2023
Application Filed
Mar 03, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601925
VIRTUAL IMAGE DISPLAY OPTICAL ARCHITECTURES
2y 5m to grant Granted Apr 14, 2026
Patent 12597754
PREDICTIVE CONTROL OF A PULSED LIGHT BEAM
2y 5m to grant Granted Apr 07, 2026
Patent 12586976
TUNABLE MICROCHIP LASER AND LASER SYSTEM FOR RANGING APPLICATIONS
2y 5m to grant Granted Mar 24, 2026
Patent 12586973
RARE EARTH DOPED FIBER AND FIBER OPTIC AMPLIFIER
2y 5m to grant Granted Mar 24, 2026
Patent 12578467
LIGHT DETECTION AND RANGING (LiDAR)-BASED INSPECTION DEVICE AND METHOD OF MANUFACTURING SEMICONDUCTOR DEVICE
2y 5m to grant Granted Mar 17, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 91% (99% with interview, +8.2%)
Median Time to Grant: 2y 10m
PTA Risk: Low
Based on 1477 resolved cases by this examiner. Grant probability derived from career allow rate.
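The headline projections follow directly from the career counts above: 1339 grants out of 1477 resolved cases gives the base grant probability, and the interview lift raises it to the with-interview figure. A minimal sketch of that derivation; treating the +8.2% lift as an additive percentage-point adjustment, and capping at 100%, are assumptions about how the dashboard computes the number:

```python
granted, resolved = 1339, 1477

# Base grant probability = career allow rate, in percent.
base = granted / resolved * 100

# Apply the interview lift as a +8.2-point additive adjustment (assumed),
# capped so the probability cannot exceed 100%.
with_interview = min(base + 8.2, 100.0)

print(round(base))            # 91
print(round(with_interview))  # 99
```

Rounding both values to whole percentages reproduces the displayed 91% and 99% figures exactly.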
