Prosecution Insights
Last updated: April 19, 2026
Application No. 19/022,057

CONTROL APPARATUS, IMAGE PICKUP APPARATUS, CONTROL SYSTEM, MOVING APPARATUS, CONTROL METHOD, AND STORAGE MEDIUM

Non-Final OA (§102, §103)

Filed: Jan 15, 2025
Examiner: MESA, JOSE M
Art Unit: 2484
Tech Center: 2400 (Computer Networks)
Assignee: Canon Kabushiki Kaisha
OA Round: 1 (Non-Final)

Grant Probability: 70% (Favorable)
OA Rounds: 1-2
To Grant: 2y 5m
With Interview: 86%

Examiner Intelligence

Career Allow Rate: 70%, above average (401 granted / 575 resolved; +11.7% vs TC avg)
Interview Lift: +16.4%, strong (resolved cases with interview)
Avg Prosecution: 2y 5m typical (18 currently pending)
Career History: 593 total applications across all art units
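The card figures above are simple ratios; a quick Python sanity check (the Tech Center baseline is not stated directly and is implied here from the "+11.7% vs TC avg" delta, taken as percentage points):

```python
granted, resolved, total = 401, 575, 593

allow_rate = granted / resolved          # career allow rate
implied_tc_avg = allow_rate - 0.117      # baseline implied by "+11.7% vs TC avg"
pending = total - resolved               # applications still open

print(f"allow rate: {allow_rate:.1%}")   # 69.7%, shown as 70% above
print(f"implied TC average: {implied_tc_avg:.1%}")
print(f"currently pending: {pending}")   # 18, matching the card above
```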

Statute-Specific Performance

§101
5.0%
-35.0% vs TC avg
§103
51.5%
+11.5% vs TC avg
§102
29.3%
-10.7% vs TC avg
§112
5.1%
-34.9% vs TC avg
Black line = Tech Center average estimate • Based on career data from 575 resolved cases
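Treating the per-statute deltas as percentage points, the Tech Center baseline estimate can be recovered the same way; an illustrative sketch using only the rates in the table above:

```python
# Rejection share by statute and delta vs the Tech Center average,
# both in percentage points, copied from the table above.
rates = {"§101": (5.0, -35.0), "§103": (51.5, 11.5), "§102": (29.3, -10.7), "§112": (5.1, -34.9)}

for statute, (rate, delta) in rates.items():
    tc_estimate = rate - delta           # implied TC average estimate
    trend = "above" if delta > 0 else "below"
    print(f"{statute}: {rate:.1f}% ({trend} the TC estimate of {tc_estimate:.1f}%)")
```

§103 is the only statute this examiner uses more often than the Tech Center estimate, which matches the §102/§103 posture of the Office Action below.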

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Allowable Subject Matter

Claims 4-7 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3, 8, 16, 19 and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Onaka (US 2020/0077027 A1) (hereinafter Onaka).

Re claim 1, Onaka discloses a control apparatus for controlling an image pickup apparatus including an image pickup unit configured to capture image data and a lens unit having a plurality of lenses, the control apparatus comprising (i.e. monitoring camera 1 can capture an image as described in fig. 1 paragraph 30; furthermore, the lens barrel 4 has, in order from the front side to the rear side, a first lens unit L1, a second lens unit L2, a third lens unit L3, a fourth lens unit L4, and a fifth lens unit L5 as described in fig. 2 paragraph 32; also, see fig. 6 paragraphs 28-29, 62): a processor (i.e. processor as described in fig. 6 paragraph 163); and a memory storing a program that causes the processor to execute a plurality of tasks including (i.e. the operation unit CPU 135 executes various functional operations by executing a program stored in a storage medium such as a not-shown ROM as described in fig. 6 paragraph 62; also, see paragraphs 163-164): an acquiring task configured to acquire information regarding a temperature along an optical axis direction of the lens unit (see ¶s 122-123: i.e. a holding position of the lens barrel 4 closest to the image-capturing side from the axis 70A of the image sensor unit 5 is L1, a linear expansion coefficient of a member used between the axis 70A and the lens barrel 4 in the optical axis direction is m, a linear expansion coefficient of the image sensor IS in the image-capturing plane direction is n, and a temperature when data for the shading correction is created or measured is T0; the lens barrel 4 has, for example, a thermocouple, and can measure a temperature of the lens barrel 4 or the image sensor unit 5 as described in fig. 14 paragraph 124; furthermore, the control circuit 104 obtains the tilt angle θt of the image sensor IS from the storage unit 105 or a sensor and ambient temperature T around the image sensor unit 5 as described in fig. 18 step S801 paragraph 127; also, see paragraphs 125-126); and a generating task configured to generate distance measurement data from the image data according to the information regarding the temperature (see ¶s 123-124: i.e. a distance between the image sensor IS and the lens barrel 4 changes when a temperature changes as described in fig. 14 paragraph 122; also, see fig. 18 paragraphs 125-127).

Re claim 2, Onaka as discussed in claim 1 discloses all the claim limitations, with the additional claimed feature wherein the processor generates the distance measurement data by correcting first distance measurement data, which is generated from the image data according to the information regarding the temperature (see ¶s 123-124: i.e. a shading correction method of the monitoring camera 1 in which a distance between the image sensor IS and the lens barrel 4 changes when a temperature changes as described in fig. 14 paragraph 122; furthermore, the control circuit 104 calls the storage unit 105 for the color shading correction value based on the ambient temperature T, the tilt angle θt of the image sensor IS, the lens arrangement and the aperture amount; next, the control circuit 104 sets the color shading correction value corresponding to each pixel of the image sensor IS in the color shading correction circuit 131 as described in fig. 18 step S806 paragraph 132; moreover, at this time, using the ambient temperature T, an expanded length ΔL becomes L1×m×(T−T0), and an expanded distance ΔK of a pixel separated at the temperature T0 by a distance K0 on the image sensor IS from the optical axis OA becomes K0×n×(T−T0) as described in fig. 18 paragraph 133; also, see fig. 18 paragraphs 125-131).

Re claim 3, Onaka as discussed in claim 2 discloses all the claim limitations, with the additional claimed feature wherein the processor determines a correction value according to the information regarding the temperature and generates the distance measurement data by correcting the first distance measurement data using the correction value (see ¶s 123-124 and the same shading-correction evidence cited for claim 2: fig. 14 paragraph 122; fig. 18 step S806 paragraph 132; the expansion terms ΔL = L1×m×(T−T0) and ΔK = K0×n×(T−T0) of fig. 18 paragraph 133; also, see fig. 18 paragraphs 125-131).

Re claim 8, Onaka discloses an image pickup apparatus comprising: the control apparatus according to claim 1, a control apparatus for controlling an image pickup apparatus including an image pickup unit configured to capture image data and a lens unit having a plurality of lenses, the control apparatus comprising (i.e.
monitoring camera 1 can capture an image as described in fig. 1 paragraph 30; furthermore, the lens barrel 4 has, in order from the front side to the rear side, a first lens unit L1, a second lens unit L2, a third lens unit L3, a fourth lens unit L4, and a fifth lens unit L5 as described in fig. 2 paragraph 32; also, see fig. 6 paragraphs 28-29, 62): a processor; and a memory storing a program that causes the processor to execute a plurality of tasks including an acquiring task and a generating task, each mapped as set forth for claim 1 above; an image pickup unit configured to capture image data (i.e. monitoring camera 1 can capture an image as described in fig. 1 paragraph 30; also, see fig. 6 paragraphs 28-29, 62); and a lens unit including a plurality of lenses (i.e. the lens barrel 4 has, in order from the front side to the rear side, a first lens unit L1, a second lens unit L2, a third lens unit L3, a fourth lens unit L4, and a fifth lens unit L5 as described in fig. 2 paragraph 32).

Re claim 16, Onaka discloses a moving apparatus comprising the image pickup apparatus according to claim 8 (the control apparatus, image pickup unit, and lens unit mapped as set forth for claims 1 and 8 above), wherein the moving apparatus is configured to hold and be movable with the image pickup apparatus (i.e. moving image capturing apparatus as described in figs. 3-5 paragraphs 45, 46, 49).

Re claim 19, Onaka discloses a control method for controlling an image pickup apparatus including an image pickup unit configured to capture image data and a lens unit having a plurality of lenses (i.e. monitoring camera 1 can capture an image as described in fig. 1 paragraph 30; furthermore, the lens barrel 4 has, in order from the front side to the rear side, a first lens unit L1, a second lens unit L2, a third lens unit L3, a fourth lens unit L4, and a fifth lens unit L5 as described in fig. 2 paragraph 32; also, see fig. 6 paragraphs 28-29, 62), the control method comprising steps of: acquiring information regarding a temperature along an optical axis direction of the lens unit (see ¶s 122-123 for acquiring information regarding a temperature along an optical axis direction of the lens unit (i.e.
a holding position of the lens barrel 4 closest to the image-capturing side from the axis 70A of the image sensor unit 5 is L1, a linear expansion coefficient of a member used between the axis 70A and the lens barrel 4 in the optical axis direction is m, a linear expansion coefficient of the image sensor IS in the image-capturing plane direction is n, and a temperature when data for the shading correction is created or measured is T0; the lens barrel 4 has, for example, a thermocouple, and can measure a temperature of the lens barrel 4 or the image sensor unit 5 as described in fig. 14 paragraph 124; furthermore, the control circuit 104 obtains the tilt angle θt of the image sensor IS from the storage unit 105 or a sensor and ambient temperature T around the image sensor unit 5 as described in fig. 18 step S801 paragraph 127). Also, see paragraphs 125-126); and generating distance measurement data from the image data according to the information regarding the temperature (see ¶s 123-124: i.e. a distance between the image sensor IS and the lens barrel 4 changes when a temperature changes as described in fig. 14 paragraph 122; also, see fig. 18 paragraphs 125-127).

Re claim 20, Onaka discloses a non-transitory computer-readable storage medium storing a program that causes a computer to execute a control method for controlling an image pickup apparatus including an image pickup unit configured to capture image data and a lens unit having a plurality of lenses (i.e. monitoring camera 1 can capture an image as described in fig. 1 paragraph 30; furthermore, the lens barrel 4 has, in order from the front side to the rear side, a first lens unit L1, a second lens unit L2, a third lens unit L3, a fourth lens unit L4, and a fifth lens unit L5 as described in fig. 2 paragraph 32; also, see fig. 6 paragraphs 28-29, 62), the control method comprising the acquiring and generating steps mapped as set forth for claim 19 above.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Onaka (US 2020/0077027 A1) (hereinafter Onaka) as applied to claims 1-3, 8, 16, 19 and 20 above, and further in view of Uekusa (US 2021/0360167 A1) (hereinafter Uekusa).

Re claim 9, Onaka as discussed in claim 8 discloses all the claimed limitations but fails to explicitly teach wherein the image pickup unit is capable of measuring phase differences by dividing a single pixel into a plurality of photodiodes, and wherein the processor generates the distance measurement data using the phase differences.

However, the reference of Uekusa explicitly teaches wherein the image pickup unit is capable of measuring phase differences by dividing a single pixel into a plurality of photodiodes, and wherein the processor generates the distance measurement data using the phase differences (see ¶ 102: i.e. at Step S801, the image processing unit 104 generates an image for phase difference AF (image-plane phase difference image) from the picked-up image (image data) obtained by the image pickup element 102; for example, the image processing unit 104 may extract only the data of dedicated pixels for the phase difference AF so as to generate the image for phase difference AF, or may generate the image for phase difference AF configured only by each of the data of the photodiodes divided in each pixel as described in figs. 1, 8 paragraph 103; furthermore, at Step S802, the image processing unit 104 obtains distance information in the viewpoint region (information on a distance D between the digital camera 100 and the subject) on the basis of the image for phase difference AF as the feature amount; for calculation of the distance information, if there are right and left, that is, two in total of divided pixels in each pixel of the image for phase difference AF, the image processing unit 104 performs a correlation value calculation of a value of the left divided pixel and a value of the right divided pixel included in the same line in the horizontal direction as described in figs. 1, 8 paragraph 104).

Therefore, taking the combined teachings of Onaka and Uekusa as a whole, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature (phase-difference distance measurement) into the system of Onaka as taught by Uekusa.
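The correlation-value calculation Uekusa describes in ¶ 104 (comparing left and right divided-pixel values along the same horizontal line) is, in essence, a disparity search. A minimal sketch, with illustrative signals and a sum-of-absolute-differences score standing in for whatever correlation measure Uekusa actually uses:

```python
def disparity(left, right, max_shift=3):
    """Integer shift of `right` that best matches `left` (lower SAD = better)."""
    best_shift, best_score = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # compare only the samples where the shifted signals overlap
        pairs = [(left[i], right[i + s]) for i in range(len(left)) if 0 <= i + s < len(right)]
        score = sum(abs(a - b) for a, b in pairs) / len(pairs)  # normalized SAD
        if score < best_score:
            best_shift, best_score = s, score
    return best_shift

# One line of left/right photodiode values; the right signal is the same edge
# shifted by one sample, so the search recovers a disparity of -1.
left  = [0, 0, 1, 5, 9, 5, 1, 0, 0]
right = [0, 1, 5, 9, 5, 1, 0, 0, 0]
print(disparity(left, right))  # -1
```

The sign and magnitude of the recovered shift is the phase difference from which defocus, and hence subject distance, is derived.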
One will be motivated to incorporate the above feature into the system of Onaka as taught by Uekusa for the benefit of having an image processing unit 104 that generates an image for phase difference AF (image-plane phase difference image) from the picked-up image (image data) obtained by the image pickup element 102; for example, the image processing unit 104 may extract only the data of dedicated pixels for the phase difference AF so as to generate the image for phase difference AF, or may generate the image for phase difference AF configured only by each of the data of the photodiodes divided in each pixel; wherein the image processing unit 104 obtains distance information in the viewpoint region (information on a distance D between the digital camera 100 and the subject) on the basis of the image for phase difference AF as the feature amount, the distance D between the digital camera 100 and the subject being an optical distance between the image pickup element 102 and the subject; and, for calculation of the distance information, if there are right and left, that is, two in total of divided pixels in each pixel of the image for phase difference AF, the image processing unit 104 performs a correlation value calculation of a value of the left divided pixel and a value of the right divided pixel included in the same line in the horizontal direction, in order to ease the processing time when calculating the distance information in each pixel of the image for phase difference AF (see figs. 1, 8 ¶s 103-104).

Claims 10 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Onaka (US 2020/0077027 A1) (hereinafter Onaka) as applied to claims 1-3, 8, 16, 19 and 20 above, and further in view of Iijima et al. (US 2010/0128140 A1) (hereinafter Iijima).
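The temperature-detector claims addressed next feed the expansion arithmetic quoted from Onaka fig. 18 ¶ 133 in the claim 2-3 mappings above: ΔL = L1×m×(T−T0) along the optical axis and ΔK = K0×n×(T−T0) in the imaging plane. A numeric sketch; Onaka supplies the formulas, but every value below is an illustrative assumption, not from the reference:

```python
# Illustrative inputs for Onaka's expansion terms (fig. 18, paragraph 133).
L1 = 20e-3           # holding position of the barrel from the sensor axis [m]
K0 = 2e-3            # pixel's distance K0 from the optical axis OA [m]
m = 23e-6            # linear expansion coefficient, barrel-side member [1/K]
n = 3e-6             # linear expansion coefficient, image sensor [1/K]
T, T0 = 45.0, 20.0   # ambient temperature vs. calibration temperature [degC]

delta_L = L1 * m * (T - T0)   # axial expansion: L1 x m x (T - T0)
delta_K = K0 * n * (T - T0)   # in-plane expansion: K0 x n x (T - T0)
print(f"dL = {delta_L * 1e6:.2f} um, dK = {delta_K * 1e6:.3f} um")
```

Both terms vanish when T returns to the calibration temperature T0, which is why the correction is keyed to the measured temperature.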
Re claim 10, Onaka as discussed in claim 8 discloses all the claimed limitations but fails to explicitly teach further comprising: a main body substrate that is electrically connected to an image sensor substrate included in the image pickup unit; a protrusion member that extends from the lens unit in a direction perpendicular to the optical axis direction; and a detector that is provided on the main body substrate to come closer to the protrusion member and detects a temperature at a position on an image target side of the lens unit.

However, the reference of Iijima explicitly teaches further comprising: a main body substrate that is electrically connected to an image sensor substrate included in the image pickup unit (see ¶s 90, 93: i.e. the circuit unit 120 includes a substrate 121, an imaging element 122, and a system LSI (hereinafter referred to as SLSI) 123 as described in figs. 1-2 paragraph 89; furthermore, the imaging element 122 is a solid-state imaging element such as a Charge Coupled Device (CCD) image sensor or a Complementary Metal Oxide Semiconductor (CMOS) image sensor, and is provided at a predetermined distance from the lens array 112; further, the imaging element 122 has imaging areas respectively corresponding to the lenses of the lens array 112, the imaging areas being arranged so that each imaging area is approximately perpendicular to the optical axis of the corresponding lens; the imaging element 122 is electrically connected to the SLSI 123 via a gold wire 125 and the substrate 121 as described in fig. 1 paragraph 94); a protrusion member that extends from the lens unit in a direction perpendicular to the optical axis direction (see ¶ 89: i.e. although the light-shielding wall 113 according to the present variation has the protruding portions 113t on both sides of the plates, it may have the protruding portions 113t only on one side as described in figs. 25-26 paragraph 260; also, see paragraphs 259, 261, 262); and a detector that is provided on the main body substrate to come closer to the protrusion member and detects a temperature at a position on an image target side of the lens unit (see ¶ 94: i.e. the SLSI 123 drives the imaging element 122 and obtains an electric signal from the driven imaging element 122; further, the SLSI 123 estimates a temperature based on the electric signal obtained as described in figs. 1-2 paragraph 95; moreover, the light-shielding wall 113 may have the protruding portions 113t on both sides of the plates or only on one side as described in figs. 25-26 paragraph 260; also, see paragraphs 259, 261, 262).

Therefore, taking the combined teachings of Onaka and Iijima as a whole, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature (the electrically connected substrate, protrusion member, and temperature detector) into the system of Onaka as taught by Iijima.

One will be motivated to incorporate the above feature into the system of Onaka as taught by Iijima for the benefit of having an imaging element 122 that is a solid-state imaging element such as a Charge Coupled Device (CCD) image sensor or a Complementary Metal Oxide Semiconductor (CMOS) image sensor and is provided at a predetermined distance from the lens array 112, the imaging element 122 having imaging areas respectively corresponding to the lenses of the lens array 112, the imaging areas being arranged so that each imaging area is approximately perpendicular to the optical axis of the corresponding lens, in order to improve efficiency when electrically connecting the imaging element 122 to the SLSI 123 via a gold wire 125 and the substrate 121 (see fig. 1 ¶ 94).

Re claim 11, Onaka as discussed in claim 8 discloses all the claimed limitations but fails to explicitly teach further comprising: a main body substrate that is electrically connected to an image sensor substrate included in the image pickup unit; a flexible substrate that is connected to the main body substrate; a housing that covers the main body substrate and is provided between the main body substrate and the lens unit; and a detector that is provided on a housing side of the flexible substrate, between the housing and the lens unit, and detects a temperature at a position on an image target side of the lens unit.

However, the reference of Iijima explicitly teaches further comprising: a main body substrate that is electrically connected to an image sensor substrate included in the image pickup unit (see ¶s 90, 93, citing the same circuit unit 120 / imaging element 122 / SLSI 123 passages set forth for claim 10 above: figs. 1-2 paragraphs 89, 94); a flexible substrate that is connected to the main body substrate (see ¶ 89: i.e. the substrate 121 is a platy member made of a resin, has components fixed thereon, such as the imaging element 122 and the SLSI 123, and constitutes an electric circuit by connecting such components by a wire; on the upper surface of the substrate 121, the underside of the lens tube 111 is bonded with an adhesive, for example, as described in fig. 1 paragraph 93); a housing that covers the main body substrate and is provided between the main body substrate and the lens unit (see ¶s 90, 93, again citing the circuit unit 120 passages of figs. 1-2 paragraphs 89, 94); and a detector that is provided on a housing side of the flexible substrate, between the housing and the lens unit, and detects a temperature at a position on an image target side of the lens unit (see ¶ 94, citing the SLSI 123 temperature-estimation and protruding-portion passages set forth for claim 10 above: figs. 1-2 paragraph 95, figs. 25-26 paragraph 260; also, see paragraphs 259, 261, 262).

Therefore, taking the combined teachings of Onaka and Iijima as a whole, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature (the connected substrates, housing, and temperature detector) into the system of Onaka as taught by Iijima. One will be motivated to incorporate the above feature into the system of Onaka as taught by Iijima for the same benefit set forth for claim 10 above (see fig. 1 ¶ 94).

Claims 12-15, 17 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Onaka (US 2020/0077027 A1) (hereinafter Onaka) as applied to claims 1-3, 8, 16, 19 and 20 above, and further in view of Makita (US 2022/0342046 A1) (hereinafter Makita).

Re claim 12, Onaka discloses a control system comprising: the image pickup apparatus according to claim 8, an image pickup apparatus comprising: the control apparatus according to claim 1, a control apparatus for controlling an image pickup apparatus including an image pickup unit configured to capture image data and a lens unit having a plurality of lenses, the control apparatus comprising (i.e. monitoring camera 1 can capture an image as described in fig.
1 paragraph 30, furthermore, the lens barrel 4 has, in order from the front side to the rear side, a first lens unit L1, a second lens unit L2, a third lens unit L3, a fourth lens unit L4, and a fifth lens unit L5 as described in fig. 2 paragraph 32. Also, see fig. 6 paragraphs 28-29, 62): a processor (i.e. processor as described in fig. 6 paragraph 163); and a memory storing a program that causes the processor to execute a plurality of tasks including (i.e. the operation unit CPU 135 executes various functional operations by executing a program stored in a storage medium such as a not-shown ROM as described in fig. 6 paragraph 62. Also, see paragraphs 163-164): an acquiring task configured to acquire information regarding a temperature along an optical axis direction of the lens unit (see ¶s 122-123 for an acquiring task configured to acquire information regarding a temperature along an optical axis direction of the lens unit (i.e. a holding position of the lens barrel 4 closest to the image-capturing side from the axis 70A of the image sensor unit 5 is L1, a linear expansion coefficient of a member used between the axis 70A and the lens barrel 4 in the optical axis direction is m, a linear expansion coefficient of the image sensor IS in the image-capturing plane direction is n, and a temperature when data for the shading correction is created or measured is T0, the lens barrel 4 has, for example, a thermocouple, and can measure a temperature of the lens barrel 4 or the image sensor unit 5 as described in fig. 14 paragraph 124, furthermore, the control circuit 104 obtains the tilt angle θt of the image sensor IS from the storage unit 105 or a sensor and ambient temperature T around the image sensor unit 5 as described in fig. 18 step S801 paragraph 127). 
Also, see paragraphs 125-126); and a generating task configured to generate distance measurement data from the image data according to the information regarding the temperature (see ¶s 123-124 for a generating task configured to generate distance measurement data from the image data according to the information regarding the temperature (i.e. a distance between the image sensor IS and the lens barrel 4 changes when a temperature changes as described in figs as described in fig. 14 paragraph 122). Also, see fig. 18 paragraphs 125-127); an image pickup unit configured to capture image data (i.e. monitoring camera 1 can capture an image as described in fig. 1 paragraph 30, furthermore, the lens barrel 4 has, in order from the front side to the rear side, a first lens unit L1, a second lens unit L2, a third lens unit L3, a fourth lens unit L4, and a fifth lens unit L5 as described in fig. 2 paragraph 32. Also, see fig. 6 paragraphs 28-29, 62); and a lens unit including a plurality of lenses (i.e. the lens barrel 4 has, in order from the front side to the rear side, a first lens unit L1, a second lens unit L2, a third lens unit L3, a fourth lens unit L4, and a fifth lens unit L5 as described in fig. 2 paragraph 32) Onaka fails to explicitly teach and a determining unit configured to determine a possibility of collision with an object based on distance information of the object acquired by the image pickup apparatus. However, the reference of Makita explicitly teaches and a determining unit configured to determine a possibility of collision with an object based on distance information of the object acquired by the image pickup apparatus (see fig. 5 ¶s 41-42 for a determining unit configured to determine a possibility of collision with an object based on distance information of the object acquired by the image pickup apparatus (i.e. 
first, in the step S1, the light source generator (light source unit) 50 of the optical apparatus 300 illuminates an object around the vehicle, and the controller 102 acquires distance information of the object by receiving reflected light from the object based on a signal output from the light receiving element 8 as described in fig. 7 paragraph 44, furthermore, the controller 102 determines that “there is a likelihood of collision” when the object exists within the set distance (step S4) and determines that “there is no likelihood of collision” when the object does not exist within the set distance (step S5) as described in fig. 7 paragraph 45). Also, see paragraph 46) Therefore, taking the combined teachings of Onaka and Makita as a whole, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature (collision) into the system of Onaka as taught by Makita. One will be motivated to incorporate the above feature into the system of Onaka as taught by Makita for the benefit of having an in-vehicle system 1000 that includes an optical apparatus 300, a vehicle information acquiring apparatus 200, a control apparatus (ECU: electronic control unit hereinafter) 350, and a warning apparatus (warning unit) 400, wherein the optical apparatus 300 includes the controller 102 (see FIG. 
2) that serves as a distance acquiring unit (acquiring unit) and a collision determining unit (determination unit), wherein the light source generator (light source unit) 50 of the optical apparatus 300 illuminates an object around the vehicle, and the controller 102 acquires distance information of the object by receiving reflected light from the object based on a signal output from the light receiving element 8, wherein the controller 102 determines that “there is a likelihood of collision” when the object exists within the set distance (step S4) and determines that “there is no likelihood of collision” when the object does not exist within the set distance (step S5) in order to improve efficiency when determining a likelihood of collision between the vehicle and the object (see fig. 7 ¶s 42, 44-45) Re claim 13, the combination of Onaka and Makita as discussed in claim 12 discloses all the claimed limitations but fails to explicitly teach further comprising a control apparatus configured to output a control signal for generating a braking force to a driving unit of a moving apparatus in a case where it is determined that there is a possibility of collision between the moving apparatus and the object. However, the reference of Makita explicitly teaches further comprising a control apparatus configured to output a control signal for generating a braking force to a driving unit of a moving apparatus in a case where it is determined that there is a possibility of collision between the moving apparatus and the object (see fig. 5 ¶s 42, 45 for a control apparatus configured to output a control signal for generating a braking force to a driving unit of a moving apparatus in a case where it is determined that there is a possibility of collision between the moving apparatus and the object (i.e. 
when the controller 102 determines that “there is the likelihood of collision,” the controller 102 notifies (transmits) the determination result to the control apparatus 350 and the warning apparatus 400, at this time, the control apparatus 350 controls the vehicle based on the determination result of the controller 102 (step S6), and the warning apparatus 400 warns a vehicle user (driver, passenger) based on the determination result of the controller 102 (step S7), the determination result may be notified to at least one of the control apparatus 350 and the warning apparatus 400 as described in fig. 7 paragraph 46, furthermore, the control apparatus 350 can control the movement of the vehicle by outputting a control signal to a driving unit (engine, motor, etc.) of the vehicle, for example, the vehicle provides control such as applying a brake, releasing a gas pedal, turning a steering wheel, generating a control signal for generating a braking force on each wheel, and suppressing an output of an engine or a motor as described in fig. 7 paragraph 47)) Therefore, taking the combined teachings of Onaka and Makita as a whole, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature (collision) into the system of Onaka as taught by Makita. Per claim 13, Onaka and Makita are combined for the same motivation as set forth in claim 12 above. Re claim 14, the combination of Onaka and Makita as discussed in claim 12 discloses all the claimed limitations but fails to explicitly teach further comprising a warning apparatus configured to warn a driver of a moving apparatus in a case where it is determined that there is a possibility of collision between the moving apparatus and the object. 
However, the reference of Makita explicitly teaches further comprising a warning apparatus configured to warn a driver of a moving apparatus in a case where it is determined that there is a possibility of collision between the moving apparatus and the object (see fig. 5 ¶s 42, 45 for a warning apparatus configured to warn a driver of a moving apparatus in a case where it is determined that there is a possibility of collision between the moving apparatus and the object (i.e. when the controller 102 determines that “there is the likelihood of collision,” the controller 102 notifies (transmits) the determination result to the control apparatus 350 and the warning apparatus 400, at this time, the control apparatus 350 controls the vehicle based on the determination result of the controller 102 (step S6), and the warning apparatus 400 warns a vehicle user (driver, passenger) based on the determination result of the controller 102 (step S7), the determination result may be notified to at least one of the control apparatus 350 and the warning apparatus 400 as described in fig. 7 paragraph 46, furthermore, the warning apparatus 400 warns the user, for example, by issuing an alert sound, displaying warning information on a screen of a car navigation system, or vibrating a seat belt or steering wheel as described in fig. 7 paragraph 47)) Therefore, taking the combined teachings of Onaka and Makita as a whole, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature (collision) into the system of Onaka as taught by Makita. Per claim 14, Onaka and Makita are combined for the same motivation as set forth in claim 12 above. Re claim 15, the combination of Onaka and Makita as discussed in claim 12 discloses all the claimed limitations but fails to explicitly teach further comprising a notification unit configured to notify information regarding a collision between a moving apparatus and the object to the outside. 
However, the reference of Makita explicitly teaches further comprising a notification unit configured to notify information regarding a collision between a moving apparatus and the object to the outside (see fig. 5 ¶s 42, 45, 46 for a notification unit configured to notify information regarding a collision between a moving apparatus and the object to the outside (i.e. the in-vehicle system 1000 and the vehicle 500 may include a notification apparatus (notification unit) that notifies a manufacturer of the in-vehicle system 1000 and a seller (dealer) of the vehicle 500 of any collisions between the vehicle 500 and an obstacle, for example, the notification apparatus may transmit information (collision information) on a collision between the vehicle 500 and the obstacle to a preset external notification destination by e-mail or the like as described in fig. 7 paragraph 50)) Therefore, taking the combined teachings of Onaka and Makita as a whole, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature (collision) into the system of Onaka as taught by Makita. Per claim 15, Onaka and Makita are combined for the same motivation as set forth in claim 12 above. Re claim 17, Onaka as discussed in claim 16 discloses all the claimed limitations but fails to explicitly teach further comprising a determining unit configured to determine a possibility of collision with an object based on distance information of the object acquired by the image pickup apparatus. However, the reference of Makita explicitly teaches further comprising a determining unit configured to determine a possibility of collision with an object based on distance information of the object acquired by the image pickup apparatus (see fig. 5 ¶s 41-42 for a determining unit configured to determine a possibility of collision with an object based on distance information of the object acquired by the image pickup apparatus (i.e. 
first, in the step S1, the light source generator (light source unit) 50 of the optical apparatus 300 illuminates an object around the vehicle, and the controller 102 acquires distance information of the object by receiving reflected light from the object based on a signal output from the light receiving element 8 as described in fig. 7 paragraph 44, furthermore, the controller 102 determines that “there is a likelihood of collision” when the object exists within the set distance (step S4) and determines that “there is no likelihood of collision” when the object does not exist within the set distance (step S5) as described in fig. 7 paragraph 45). Also, see paragraph 46) Therefore, taking the combined teachings of Onaka and Makita as a whole, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature (collision) into the system of Onaka as taught by Makita. One will be motivated to incorporate the above feature into the system of Onaka as taught by Makita for the benefit of having an in-vehicle system 1000 that includes an optical apparatus 300, a vehicle information acquiring apparatus 200, a control apparatus (ECU: electronic control unit hereinafter) 350, and a warning apparatus (warning unit) 400, wherein the optical apparatus 300 includes the controller 102 (see FIG. 
2) that serves as a distance acquiring unit (acquiring unit) and a collision determining unit (determination unit), wherein the light source generator (light source unit) 50 of the optical apparatus 300 illuminates an object around the vehicle, and the controller 102 acquires distance information of the object by receiving reflected light from the object based on a signal output from the light receiving element 8, wherein the controller 102 determines that “there is a likelihood of collision” when the object exists within the set distance (step S4) and determines that “there is no likelihood of collision” when the object does not exist within the set distance (step S5) in order to improve efficiency when determining a likelihood of collision between the vehicle and the object (see fig. 7 ¶s 42, 44-45) Re claim 18, the combination of Onaka and Makita as discussed in claim 17 discloses all the claimed limitations but fails to explicitly teach further comprising a notification unit configured to notify information regarding a collision with the object to the outside. However, the reference of Makita explicitly teaches further comprising a notification unit configured to notify information regarding a collision with the object to the outside (see fig. 5 ¶s 42, 45, 46 for a notification unit configured to notify information regarding a collision with the object to the outside (i.e. the in-vehicle system 1000 and the vehicle 500 may include a notification apparatus (notification unit) that notifies a manufacturer of the in-vehicle system 1000 and a seller (dealer) of the vehicle 500 of any collisions between the vehicle 500 and an obstacle, for example, the notification apparatus may transmit information (collision information) on a collision between the vehicle 500 and the obstacle to a preset external notification destination by e-mail or the like as described in fig. 
7 paragraph 50)) Therefore, taking the combined teachings of Onaka and Makita as a whole, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature (collision) into the system of Onaka as taught by Makita. Per claim 18, Onaka and Makita are combined for the same motivation as set forth in claim 17 above. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOSE M MESA whose telephone number is (571)270-1706. The examiner can normally be reached Monday-Friday 8:30AM-6:00PM ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thai Tran can be reached at 571-272-7382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. 2/5/2026 /JOSE M. MESA/ Examiner Art Unit 2484 /THAI Q TRAN/Supervisory Patent Examiner, Art Unit 2484
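The acquiring and generating tasks the examiner maps to Onaka (¶s 122-127) rest on a simple idea: the lens-barrel-to-sensor distance drifts with temperature, so distance-measurement data is corrected against a reference temperature T0 using a linear expansion coefficient. The first-order sketch below illustrates only that idea; the function name, the numeric values, and the exact correction formula are assumptions, not taken from Onaka.

```python
def barrel_shift_mm(l1_mm: float, m_per_degc: float,
                    t_degc: float, t0_degc: float) -> float:
    """Illustrative first-order thermal model (not Onaka's actual formula):
    change in the holding distance L1 = L1 * m * (T - T0), where m is the
    linear expansion coefficient and T0 is the temperature at which the
    correction data was created or measured."""
    return l1_mm * m_per_degc * (t_degc - t0_degc)

# Hypothetical numbers: a 10 mm holding distance, m = 1e-5 per degC,
# measured at 45 degC against reference data taken at 25 degC.
shift = barrel_shift_mm(10.0, 1e-5, 45.0, 25.0)  # 0.002 mm
```

A generating task would then apply a shift of this order when converting image data into distance-measurement data, which is why the claims tie the measurement to temperature information along the optical axis.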
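Makita's determination flow, cited against claims 12-15 and 17-18 (fig. 7, steps S4-S7), reduces to a threshold test on the acquired distance followed by fan-out to the control apparatus 350 (braking) and the warning apparatus 400 (driver alert). A minimal sketch of that logic; all identifiers here are illustrative, not Makita's implementation:

```python
def collision_likely(object_distance: float, set_distance: float) -> bool:
    """Steps S4/S5: a likelihood of collision exists iff the object
    lies within the set distance."""
    return object_distance <= set_distance

def on_determination(object_distance: float, set_distance: float) -> list[str]:
    """Steps S6/S7: notify the control and warning apparatuses only when
    a likelihood of collision is determined; otherwise take no action."""
    if not collision_likely(object_distance, set_distance):
        return []
    return [
        "control apparatus 350: output braking-force control signal",
        "warning apparatus 400: alert the driver",
    ]
```

Claim 13 (braking) and claim 14 (warning) each cover one branch of the fan-out, which is why the examiner reuses the same ¶s 46-47 citations for both.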

Prosecution Timeline

Jan 15, 2025
Application Filed
Feb 05, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598333
DATA PROCESSING METHOD AND APPARATUS, AND DEVICE, STORAGE MEDIUM AND PROGRAM PRODUCT
2y 5m to grant; granted Apr 07, 2026
Patent 12598389
IMAGING DEVICE, SENSOR CHIP, AND PROCESSING CIRCUIT
2y 5m to grant; granted Apr 07, 2026
Patent 12597444
SYSTEMS AND METHODS FOR AUTOMATED DIGITAL EDITING
2y 5m to grant; granted Apr 07, 2026
Patent 12580004
VIDEO EDITING SUPPORT DEVICE, VIDEO EDITING SUPPORT METHOD, AND RECORDING MEDIUM
2y 5m to grant; granted Mar 17, 2026
Patent 12581156
DISPLAY APPARATUS AND RECORDING METHOD
2y 5m to grant; granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
70%
Grant Probability
86%
With Interview (+16.4%)
2y 5m
Median Time to Grant
Low
PTA Risk
Based on 575 resolved cases by this examiner. Grant probability derived from career allow rate.
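Assuming the dashboard rounds to the nearest whole percent, the projection figures above reproduce directly from the examiner's career numbers (401 granted of 575 resolved, plus the +16.4-point interview lift):

```python
# Career allow rate from the examiner's resolved cases.
career_allow_rate = 401 / 575                  # ~0.697, displayed as 70%

# Interview lift is stated as +16.4 percentage points.
with_interview = career_allow_rate + 0.164     # ~0.861, displayed as 86%

print(round(career_allow_rate * 100), round(with_interview * 100))
```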
