Prosecution Insights
Last updated: April 19, 2026
Application No. 18/319,859

Motion Sensor Camera Illumination

Status: Non-Final OA (§103, §112)

Filed: May 18, 2023
Examiner: AYNALEM, NATHNAEL B
Art Unit: 2488
Tech Center: 2400 — Computer Networks
Assignee: Comcast Cable Communications LLC
OA Round: 3 (Non-Final)

Outlook: Favorable
Grant probability: 76%
Expected OA rounds: 3-4
Estimated time to grant: 2y 7m
Grant probability with interview: 90%
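The 90% "with interview" figure tracks the base estimate plus the examiner's reported interview lift; a minimal consistency check (assuming a simple additive lift, which the dashboard does not state explicitly):

```python
# Check that the dashboard's "with interview" outlook is consistent with
# the base grant probability plus this examiner's reported interview lift.
base = 76.0            # base grant probability, percent
interview_lift = 13.9  # interview lift reported for this examiner, percent

with_interview = base + interview_lift
print(f"{with_interview:.1f}%")  # 89.9%, in line with the ~90% shown
```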

Examiner Intelligence

Career allowance rate: 76% (505 granted / 662 resolved; +18.3% vs. Tech Center average; above average)
Interview lift: +13.9% among resolved cases with interview (moderate)
Typical timeline: 2y 7m average prosecution; 32 applications currently pending
Career history: 694 total applications across all art units

Statute-Specific Performance

§101: 5.6% (-34.4% vs. TC avg)
§103: 39.5% (-0.5% vs. TC avg)
§102: 22.3% (-17.7% vs. TC avg)
§112: 21.6% (-18.4% vs. TC avg)

Tech Center averages are estimates. Based on career data from 662 resolved cases.
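The headline rate above can be reproduced from the raw counts; a quick sketch (the Tech Center baseline is inferred from the stated +18.3-point delta, not reported directly):

```python
# Recompute the examiner's headline allowance figures from the raw counts.
granted, resolved = 505, 662

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # Career allow rate: 76.3%

# The dashboard reports this as +18.3 points above the Tech Center
# average, which implies a TC 2400 baseline of roughly 58%.
implied_tc_avg = allow_rate * 100 - 18.3
print(f"Implied TC average: {implied_tc_avg:.1f}%")  # Implied TC average: 58.0%
```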

Office Action

Rejections: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 03/05/2026 has been entered.

Response to Amendment and Argument

Applicant's amendment and argument with respect to pending claims 1, 3-8, 10-15 and 18-20 filed on 03/05/2026 have been fully considered, but the argument with respect to independent claims 1 and 10 has been rendered moot in view of a new ground of rejection.

Claim Rejections - 35 USC § 103

Summary of Arguments: Regarding claim 14, applicant argues that Sinitsyn does not teach "controlling, based on identifying that the glaring portion of the object is a region of interest of the object, a second subset of light sources illuminating a second portion of the object to decrease in intensity" because "Sinitsyn merely describes reducing the illumination of a single lighting apparatus that is illuminating an object". Remarks, pp. 10-11.

Examiner's Response: Examiner respectfully disagrees. Sinitsyn at ¶¶0040, 0050, 0099 and 0134, reproduced in part below, discloses the following:

[0040] The outdoor lighting system 100 comprises a plurality of outdoor lighting apparatuses 101. Four outdoor lighting apparatuses 101 are shown in FIG. 1a…

[0050] …If there has been an adverse change, the lighting controller 330 controls a lighting parameter of at least one of the plurality of outdoor lighting apparatuses 101 to change the illumination of the monitored area so as to improve the image quality parameter of the captured image data. Hence, in some embodiments, the lighting controller 330 controls a lighting parameter of at least one of the plurality of lighting units so as to change the illumination of the area so as to improve the image quality parameter of the captured image data back to its previous (i.e. before the change) state.

[0099] …The lighting controller 330 can then control lighting parameters of only the outdoor lighting apparatuses 101 that affect the illumination of the monitored area, rather than controlling all the outdoor lighting apparatuses 101 to have the new illumination conditions…

[0134] The case of reflections from rain will be illustrated with reference to FIG. 8. This shows an example image 700a as may be displayed on the display 310. In this example, it is assumed that lighter region 701 in the image 700a is a bright spot caused by glare from a puddle from a specific outdoor lighting apparatus 101. In this embodiment, in step S3, it may be detected that there is the bright spot 701 in the image 700a that was not previously there… As a result, the system can reduce the illumination of the outdoor lighting apparatus 101 in the monitored area of the image 700a causing the bright spot 701, e.g. by controlling that outdoor lighting apparatus 101 to rotate to the different orientation or by reducing its light output. This would lead to an improved image 700b.

As noted above, Sinitsyn discloses a plurality of outdoor lighting apparatuses 101, and controlling a lighting parameter of at least one of the plurality of outdoor lighting apparatuses 101 to change the illumination of the monitored area so as to improve the image quality parameter of the captured image data. The controlling may include controlling lighting parameters of only the outdoor lighting apparatuses 101 that affect the illumination of the monitored area. Based on detecting that there is the bright spot 701 in the image 700a, as illustrated in FIG. 8, the system can reduce the illumination of the outdoor lighting apparatus 101 in the monitored area of the image 700a causing the bright spot 701, e.g. by controlling that outdoor lighting apparatus 101 to rotate to the different orientation or by reducing its light output. Thus, Sinitsyn teaches "controlling, based on identifying that the glaring portion of the object is a region of interest of the object, a second subset of light sources illuminating a second portion of the object to decrease in intensity".

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1, 3-8, 10-15 and 18-20 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement.
The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claim 1 recites the limitation "causing a second subset of the plurality of light sources to illuminate, at a second intensity that is lower than the first intensity, the second portion of the object," which was not described in the originally filed specification of the current application. Claim 10 recites the limitation "causing a second subset of the one or more light sources to illuminate, at a second intensity that is lower than the first intensity, second portion of the object," which was not described in the originally filed specification of the current application. Claim 14 recites the limitation "a second subset of light sources illuminating a second portion of the object to decrease in intensity," which was not described in the originally filed specification of the current application. Dependent claims 3-8, 11-13, 15 and 18-20 are rejected based on their dependency from rejected claims 1, 10 and 14.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3-7, 21, 22 and 23 are rejected under 35 U.S.C. 103 as being unpatentable over Abalos (US 11102453 B2) in view of Van Der Sijde et al. (US 20200154027 A1).

Regarding claim 1, Abalos teaches a method comprising: receiving, by a computing device, a motion detection signal related to a field of view (FOV) of a camera, wherein the motion detection signal comprises information identifying an object within the FOV (Figs. 1-7; col. 4, lines 57-61: Image and video data captured by lens 202 can be provided as input to other components in camera system 100, such as image processing unit 204 and detection and analytics unit 210; col. 9, lines 34-42: targets can be tracked within camera 120's FOV if there is detected movement; col. 5, lines 35-45: motion detection); identifying a first portion of the object and a second portion of the object (col. 5, lines 35-45: detection operations can include…human detection, object-in-hand detection, sound classification, facial recognition. Note that this interpretation is consistent with the disclosure of the current application; see, for example, current application ¶0048 describing a "face recognition process"); and controlling the plurality of light sources to illuminate the object within the FOV (col. 3, lines 61-65: Light devices 122 can be any light device or array of light devices, that illuminates all or a portion of camera's 120 FOV; col. 6, lines 25-27: Controller 212 can include rules, policies, logic, instructions, etc., to manage lighting operations of light devices 122), wherein the controlling comprises: causing the first subset of the plurality of light sources to illuminate, at a first intensity, the first portion of the object within the FOV; and causing a second subset of the plurality of light sources to illuminate, at a second intensity that is lower than the first intensity (col. 8, lines 47-61: lighting can be optimized such that light devices 122-1, 122-2 light up target 602 at a first intensity, and light devices 122-3, 122-4 light up target 604 at a second intensity (in this case, the first and second intensities can be different or the same depending on the lighting needs to illuminate the target). Accordingly, intelligent activation of lighting can optimize a first subset of light devices to a first intensity, and a second subset of light devices to a second intensity, and so on. Col. 9, lines 9-32: illuminating target 602 with light devices 122-1, 122-2, but turning off or de-optimizing light devices 122-3, 122-4 in accordance with some received energy budget).

Abalos does not explicitly disclose causing a second subset of the plurality of light sources to illuminate, at a second intensity that is lower than the first intensity, the second portion of the object. However, Van Der Sijde discloses causing a second subset of the plurality of light sources to illuminate, at a second intensity that is lower than the first intensity, the second portion of the object (¶0057: while less light is provided in region 40, corresponding to foreground person 30, extra light is provided to the face 52 of the person in the background; ¶0080: FIG. 14A illustrates how the scene is illuminated when six LEDs are supplied with varying levels of current and three LEDs receive no current. The center LED 96 in the left column is supplied with five times more current than the five LEDs 97, 98, 99, 100, and 101 which surround LED 96). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Abalos's illumination system by incorporating the teaching of Van Der Sijde as noted above, in order to lower the power consumption of the system, as suggested by Van Der Sijde (¶0003).

Regarding claim 3, Abalos in view of Van Der Sijde discloses the method of claim 1. Van Der Sijde discloses wherein the second subset surrounds the first subset of the plurality of light sources (see Figs. 11B, 12B, 13B and 14B; e.g., Fig. 14B illustrates center LED 96 surrounded by LEDs 97, 98, 99, 100, and 101, as described in ¶0080). The motivation statement set forth above with respect to claim 1 applies here.

Regarding claim 4, Abalos discloses the method of claim 1, wherein the plurality of light sources (lighting devices 122) are co-located with a motion sensor capturing the motion detection signal (col. 4, lines 50-63, col. 5, lines 35-45: the captured data by the camera 120 is used for motion detection; col. 3, line 65 to col. 4, line 1: Light devices 122 can be built into the camera or can be an external accessory that communicates with the cameras in an area (e.g., an external illuminator such as a flood light)).

Regarding claim 5, Abalos discloses the method of claim 1, wherein the controlling comprises adjusting different light sources of the plurality of light sources to illuminate with different intensities based on a distance to the object within the FOV (col.
6, lines 60-63: Based on this determination, the camera/system can determine that a lighting change is needed, and can determine that the system needs to adjust one or more LEDs within the array of LEDs; col. 8, lines 27-36, col. 9, lines 9-33: when camera 120 has an object in its foreground but the system wants to illuminate targets in the background, with enough LEDs, the power on the LEDs aimed at close targets can be lowered while LEDs illuminating the longer distances and background can be maintained at high power).

Regarding claim 6, Abalos teaches the method of claim 1, wherein the controlling comprises determining intensity of the first subset of the plurality of light sources based on a battery level associated with the computing device (col. 2, lines 49-62, col. 8, lines 1-6: lighting adjustments can take power resources and limits into consideration in some embodiments (step 312). Thus, if there are power limits or constraints, camera system 100 can determine an amount of power needed to turn on a light device within the array of light devices at a particular intensity).

Regarding claim 7, Abalos discloses the method of claim 1, further comprising determining the first subset of the plurality of light sources based on a battery level associated with the computing device (col. 2, lines 49-62, col. 8, lines 1-6: lighting adjustments can take power resources and limits into consideration in some embodiments (step 312). Thus, if there are power limits or constraints, camera system 100 can determine an amount of power needed to turn on a light device within the array of light devices at a particular intensity).

Regarding claim 21, Abalos discloses the method of claim 1, wherein the controlling further comprises adjusting at least one of the first intensity or the second intensity based on sound data received by the computing device (col. 9, line 61 to col. 10, line 6: audio sources can be captured by sensors 214 (or other external sensors)… Based on the audio source detection, however, video analytics models may be applied that determine or infer the location of a target that's about to enter the scene. Lighting can then be adjusted based on the predicted location/trajectory, such as by directing a floodlight to its predicted location or adjusting gain based on its predicted location and an identification of the target).

Regarding claim 22, Abalos discloses the method of claim 1, wherein the controlling further comprises: causing, before causing the second subset of the plurality of light sources to illuminate at the second intensity, the second subset of the plurality of light sources to illuminate at the first intensity (col. 8, lines 47-61, col. 9, lines 22-32: lighting can be optimized such that light devices 122-1, 122-2 light up target 602 at a first intensity, and light devices 122-3, 122-4 light up target 604 at a second intensity (in this case, the first and second intensities can be different or the same depending on the lighting needs to illuminate the target)), wherein the causing the second subset of the plurality of lights to illuminate at the second intensity is based on determining that the first portion of the object is a region of interest of the object (col. 9, lines 9-32: assuming target 602 is considered a threat but target 604 is not, power can also be managed as target 602 approaches by illuminating target 602 with light devices 122-1, 122-2, but turning off or de-optimizing light devices 122-3, 122-4 in accordance with some received energy budget).

Regarding claim 23, Abalos teaches the method of claim 1, wherein the plurality of light sources comprises an array of light emitting diodes oriented at a plurality of different angles (Fig. 5, col. 3, lines 61-65: Light devices 122 can be any light device or array of light devices, that illuminates all or a portion of camera's 120 FOV. For example, light devices 122 can be a static matrix of different FOV LEDs or individually position-able LEDs of various FOVs (fields of view)).

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Abalos (US 11102453 B2) in view of Van Der Sijde et al. (US 20200154027 A1) as applied to claim 1, and further in view of Sinitsyn et al. (US 20190246477 A1).

Regarding claim 8, Abalos in view of Van Der Sijde does not disclose generating a light reflection map based on glare detected in a calibration image of the FOV; and determining one or more light sources of the first subset of the plurality of light sources that should be reduced based on one or more locations, corresponding to the glare, in the light reflection map. However, Sinitsyn discloses generating a light reflection map based on glare detected in a calibration image of the FOV (¶0134: This shows an example image 700a as may be displayed on the display 310. In this example, it is assumed that lighter region 701 in the image 700a is a bright spot caused by glare from a puddle from a specific outdoor lighting apparatus 101); and determining one or more light sources of the first subset of the plurality of light sources that should be reduced based on one or more locations, corresponding to the glare, in the light reflection map (Figs. 1, 3-6, ¶0040: The outdoor lighting system 100 comprises a plurality of outdoor lighting apparatuses 101. Four outdoor lighting apparatuses 101 are shown in FIG. 1a…; ¶0050, 0099: the lighting controller 330 controls a lighting parameter of at least one of the plurality of lighting units so as to change the illumination of the area so as to improve the image quality parameter of the captured image data…; ¶0134-0135: the system can reduce the illumination of the outdoor lighting apparatus 101 in the monitored area of the image 700a causing the bright spot 701, e.g.
by controlling that outdoor lighting apparatus 101 to rotate to the different orientation or by reducing its light output). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Abalos in view of Van Der Sijde by incorporating the teaching of Sinitsyn as noted above, in order to improve the captured image quality (Sinitsyn: ¶0130, 0134).

Claims 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over Abalos (US 11102453 B2) in view of Van Der Sijde et al. (US 20200154027 A1) and Deaton (US 9839088 B1).

Regarding claim 10, Abalos teaches a method comprising: receiving, by a computing device, a motion detection signal related to a field of view (FOV) of a camera, wherein the motion detection signal comprises information identifying an object within the FOV (Figs. 1-7; col. 4, lines 57-61: Image and video data captured by lens 202 can be provided as input to other components in camera system 100, such as image processing unit 204 and detection and analytics unit 210; col. 9, lines 34-42: targets can be tracked within camera 120's FOV if there is detected movement; col. 5, lines 35-45: motion detection); selecting, based on the battery level and on the object within the FOV, one or more light sources, of a plurality of light sources, of the camera (col. 8, lines 1-27: lighting adjustments can take power resources and limits into consideration in some embodiments (step 312). Thus, if there are power limits or constraints, camera system 100 can determine an amount of power needed to turn on a light device within the array of light devices at a particular intensity... for example, camera system 100 can determine whether to turn on one or more light devices based on a set of contextual rules according to the amount of power needed and allowed for the light device… so that the system can devote more lighting resources to the person rather than the cow); and controlling the one or more light sources to illuminate the object within the FOV (col. 9, lines 34-49, col. 10, lines 7-12: Based on the predicted trajectory of target 602, the second camera can be informed that lighting should be adjusted in accordance with an illumination determined by camera 120 or camera system 100), wherein the controlling comprises: causing a first subset of the one or more light sources to illuminate, at a first intensity, a first portion of the object within the FOV; and causing a second subset of the one or more light sources to illuminate, at a second intensity that is lower than the first intensity (col. 8, lines 47-61: lighting can be optimized such that light devices 122-1, 122-2 light up target 602 at a first intensity, and light devices 122-3, 122-4 light up target 604 at a second intensity (in this case, the first and second intensities can be different or the same depending on the lighting needs to illuminate the target). Accordingly, intelligent activation of lighting can optimize a first subset of light devices to a first intensity, and a second subset of light devices to a second intensity, and so on. Col. 9, lines 9-32: illuminating target 602 with light devices 122-1, 122-2, but turning off or de-optimizing light devices 122-3, 122-4 in accordance with some received energy budget).

Abalos does not explicitly disclose receiving information indicating a battery level associated with the computing device, or causing a second subset of the plurality of light sources to illuminate, at a second intensity that is lower than the first intensity, the second portion of the object. However, Van Der Sijde discloses causing a second subset of the plurality of light sources to illuminate, at a second intensity that is lower than the first intensity, the second portion of the object (¶0057: while less light is provided in region 40, corresponding to foreground person 30, extra light is provided to the face 52 of the person in the background; ¶0080: FIG. 14A illustrates how the scene is illuminated when six LEDs are supplied with varying levels of current and three LEDs receive no current. The center LED 96 in the left column is supplied with five times more current than the five LEDs 97, 98, 99, 100, and 101 which surround LED 96). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Abalos's illumination system by incorporating the teaching of Van Der Sijde as noted above, in order to lower the power consumption of the system, as suggested by Van Der Sijde (¶0003). Furthermore, Deaton discloses receiving information indicating a battery level associated with the computing device (col. 4, lines 62-67, col. 20, line 64 to col. 21, line 19: the security light controller may determine a third lighting control output based on the battery voltage or battery status input or alternative sensor input in order to select the appropriate light output level dependent upon such modified or degraded status). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Abalos in view of Van Der Sijde by incorporating the teaching of Deaton as noted above, in order to preserve or lengthen the battery life (Deaton: col. 20, line 64 to col. 21, line 19).

Regarding claim 11, Abalos discloses the method of claim 10, wherein the controlling comprises determining intensity of the one or more light sources based on the battery level (col. 2, lines 49-62, col. 8, lines 1-6: lighting adjustments can take power resources and limits into consideration in some embodiments (step 312). Thus, if there are power limits or constraints, camera system 100 can determine an amount of power needed to turn on a light device within the array of light devices at a particular intensity).

Regarding claim 12, Abalos discloses the method of claim 10, further comprising receiving, by the computing device, one or more camera images of an area limited to a portion of the FOV that is illuminated by the one or more light sources (col. 8, lines 27-67: Frames 610-1, 610-2, 610-3, 610-4, and 610-5 show camera 120's FOV, which are illuminated by light devices 122-1, 122-2, 122-3, and 122-4).

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Abalos (US 11102453 B2) in view of Van Der Sijde et al. (US 20200154027 A1) and Deaton (US 9839088 B1) as applied to claim 10, and further in view of Koizumi (US 20210136301 A1).

Regarding claim 13, Abalos in view of Van Der Sijde and Deaton does not explicitly disclose determining a camera frame rate based on the battery level. However, Koizumi discloses determining a camera frame rate based on the battery level (Fig. 1, ¶0052, 0065, 0119: the imaging apparatus 200 detects a remaining battery level (step S901). Subsequently, the imaging apparatus 200 determines whether or not the detected value is less than a largest threshold (step S902).
In a case where the detected value is less than the threshold (step S902: Yes), the imaging apparatus 200 changes the readout image size or the frame rate in accordance with the detected value so as to reduce the image quality (step S903)). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Abalos in view of Van Der Sijde and Deaton by incorporating the teaching of Koizumi as noted above, in order to extend the drive time in an imaging apparatus driven by a battery (Koizumi: ¶0005).

Claims 14, 15, 18-20 and 25 are rejected under 35 U.S.C. 103 as being unpatentable over Abalos (US 11102453 B2) in view of Sinitsyn et al. (US 20190246477 A1).

Regarding claim 14, Abalos teaches a method comprising: receiving, by the camera device, a plurality of camera images of a field of view (FOV) of the camera device, wherein each camera image of the plurality of camera images is received based on causing a different light source, of a plurality of light sources, to illuminate (Figs. 1, 2 and 4-7; col. 8, lines 27-67: Frames 610-1, 610-2, 610-3, 610-4, and 610-5 show camera 120's FOV, which are illuminated by light devices 122-1, 122-2, 122-3, and 122-4); receiving, by the camera device, a motion detection signal related to the FOV, wherein the motion detection signal comprises information identifying an object within the FOV (col. 4, lines 57-61: Image and video data captured by lens 202 can be provided as input to other components in camera system 100, such as image processing unit 204 and detection and analytics unit 210; col. 9, lines 34-42: targets can be tracked within camera 120's FOV if there is detected movement; col. 5, lines 35-45: motion detection).

Abalos does not explicitly disclose identifying, based on the information identifying the object, a glaring portion of the object; controlling, based on the identifying the glaring portion of the object, a subset of light sources, of the plurality of light sources and illuminating the glaring portion of the object, to decrease in intensity at the glaring portion; and controlling, based on identifying that the glaring portion of the object is a region of interest of the object, a second subset of light sources illuminating a second portion of the object to decrease in intensity. However, Sinitsyn teaches identifying, based on the information identifying the object, a glaring portion of the object (¶0134: In this example, it is assumed that lighter region 701 in the image 700a is a bright spot caused by glare from a puddle from a specific outdoor lighting apparatus 101. In this embodiment, in step S3, it may be detected that there is the bright spot 701 in the image 700a that was not previously there…); controlling, based on the identifying the glaring portion of the object, a subset of light sources, of the plurality of light sources and illuminating the glaring portion of the object, to decrease in intensity at the glaring portion (Figs. 1, 3-6, ¶0040: The outdoor lighting system 100 comprises a plurality of outdoor lighting apparatuses 101. Four outdoor lighting apparatuses 101 are shown in FIG. 1a…; ¶0050, 0099: the lighting controller 330 controls a lighting parameter of at least one of the plurality of lighting units so as to change the illumination of the area so as to improve the image quality parameter of the captured image data…; ¶0134-0135: the system can reduce the illumination of the outdoor lighting apparatus 101 in the monitored area of the image 700a causing the bright spot 701, e.g. by controlling that outdoor lighting apparatus 101 to rotate to the different orientation or by reducing its light output); and controlling, based on identifying that the glaring portion of the object is a region of interest of the object, a second subset of light sources illuminating a second portion of the object to decrease in intensity (¶0050: the lighting controller 330 controls a lighting parameter of at least one of the plurality of lighting units so as to change the illumination of the area so as to improve the image quality parameter of the captured image data…; ¶0134-0136: In this example, it is assumed that lighter region 701 in the image 700a is a bright spot caused by glare from a puddle from a specific outdoor lighting apparatus 101… As a result, the system can reduce the illumination of the outdoor lighting apparatus 101 in the monitored area of the image 700a causing the bright spot 701, e.g. by controlling that outdoor lighting apparatus 101 to rotate to the different orientation or by reducing its light output). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Abalos by incorporating the teaching of Sinitsyn as noted above, in order to improve the captured image quality (Sinitsyn: ¶0130, 0134).

Regarding claim 15, Abalos in view of Sinitsyn discloses the method of claim 14. Sinitsyn further discloses generating, based on the plurality of camera images, a reflection map indicating one or more positions comprising glare (¶0054: The image processor 320 receives image data comprising a plurality of frames from the camera 201, and analyses the frames of the image data to obtain an image quality parameter for each frame; ¶0134: In this example, it is assumed that lighter region 701 in the image 700a is a bright spot caused by glare from a puddle from a specific outdoor lighting apparatus 101. In this embodiment, in step S3, it may be detected that there is the bright spot 701 in the image 700a that was not previously there…), wherein the controlling comprises determining one or more light sources of the subset of the light sources that should be reduced based on the reflection map (Figs. 1, 3-6, ¶0040: The outdoor lighting system 100 comprises a plurality of outdoor lighting apparatuses 101. Four outdoor lighting apparatuses 101 are shown in FIG. 1a…; ¶0050, 0099: the lighting controller 330 controls a lighting parameter of at least one of the plurality of lighting units so as to change the illumination of the area so as to improve the image quality parameter of the captured image data…; ¶0134-0135: the system can reduce the illumination of the outdoor lighting apparatus 101 in the monitored area of the image 700a causing the bright spot 701, e.g. by controlling that outdoor lighting apparatus 101 to rotate to the different orientation or by reducing its light output). The motivation statement set forth above with respect to claim 14 applies here.

Regarding claim 18, Abalos discloses the method of claim 14, wherein the controlling comprises adjusting different light sources of the subset of the light sources to illuminate with different intensities based on a distance to the object (col. 8, lines 27-36, col. 9, lines 4-17: when camera 120 has an object in its foreground but the system wants to illuminate targets in the background, with enough LEDs, the power on the LEDs aimed at close targets can be lowered while LEDs illuminating the longer distances and background can be maintained at high power).

Regarding claim 19, Abalos discloses the method of claim 14, wherein the controlling comprises determining intensity of the subset of the light sources based on a battery level associated with the camera device (col. 2, lines 49-62, col. 8, lines 1-6: lighting adjustments can take power resources and limits into consideration in some embodiments (step 312). Thus, if there are power limits or constraints, camera system 100 can determine an amount of power needed to turn on a light device within the array of light devices at a particular intensity).

Regarding claim 20, Abalos discloses the method of claim 14, further comprising determining the subset of the light sources based on a battery level associated with the camera device (col. 8, lines 1-27: lighting adjustments can take power resources and limits into consideration in some embodiments (step 312). Thus, if there are power limits or constraints, camera system 100 can determine an amount of power needed to turn on a light device within the array of light devices at a particular intensity... for example, camera system 100 can determine whether to turn on one or more light devices based on a set of contextual rules according to the amount of power needed and allowed for the light device… so that the system can devote more lighting resources to the person rather than the cow).

Regarding claim 25, Abalos teaches the method of claim 14, wherein the plurality of light sources comprises an array of light emitting diodes oriented at a plurality of different angles (Fig. 5, col. 3, lines 61-65: Light devices 122 can be any light device or array of light devices, that illuminates all or a portion of camera's 120 FOV. For example, light devices 122 can be a static matrix of different FOV LEDs or individually position-able LEDs of various FOVs (fields of view)).

Claim 24 is rejected under 35 U.S.C. 103 as being unpatentable over Abalos (US 11102453 B2) in view of Van Der Sijde et al. (US 20200154027 A1) as applied to claim 1, and further in view of Koizumi (US 20210136301 A1).

Regarding claim 24, Abalos in view of Van Der Sijde does not explicitly disclose reducing a frame rate of the camera based on a battery level associated with the computing device.
However, Koizumi teaches reducing a frame rate of the camera based on a battery level associated with the computing device (Fig. 1, ¶0052, 0065: the control unit 240 reduces the frame rate together with the decrease in the remaining battery level. ¶0119: the imaging apparatus 200 detects a remaining battery level (step S901). Subsequently, the imaging apparatus 200 determines whether or not the detected value is less than a largest threshold (step S902). In a case where the detected value is less than the threshold (step S902: Yes), the imaging apparatus 200 changes the readout image size or the frame rate in accordance with the detected value o as to reduce the image quality (step S903)). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Abalos in view of Van Der Sijde by incorporating the teaching of Koizumi as noted above, in order to extend the drive time in an imaging apparatus driven by a battery (Koizumi: ¶0005). Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to NATHNAEL AYNALEM whose telephone number is (571)270-1482. The examiner can normally be reached M-F 9AM-5:30 PM ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, SATH PERUNGAVOOR can be reached at 571-272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. 
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /NATHNAEL AYNALEM/Primary Examiner, Art Unit 2488
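For readers tracing the §103 mapping for claims 14-15, the control flow the claim language recites (build a reflection map from camera frames, then dim only the light sources illuminating glaring positions, dimming further when the glare falls on a region of interest) can be paraphrased as a short sketch. This is purely illustrative: the function names, the brightness threshold of 240, and the 0.5/0.8 dimming factors are this sketch's assumptions, not details from the application, Abalos, or Sinitsyn.

```python
# Illustrative sketch of the claimed control flow: build a reflection map
# from camera frames, then reduce intensity only for the light sources
# that illuminate glaring positions, dimming more aggressively when the
# glare overlaps a region of interest (ROI).

def reflection_map(frames):
    """Positions whose brightness exceeds a glare threshold (assumed 240/255)."""
    glare = set()
    for frame in frames:
        for pos, brightness in frame.items():
            if brightness > 240:
                glare.add(pos)
    return glare

def plan_dimming(frames, source_coverage, roi):
    """Map each light source to a new relative intensity (1.0 = unchanged)."""
    glare = reflection_map(frames)
    plan = {}
    for source, covered in source_coverage.items():
        hits = covered & glare
        if not hits:
            plan[source] = 1.0   # no glare from this source: leave it alone
        elif hits & roi:
            plan[source] = 0.5   # glare on a region of interest: dim more
        else:
            plan[source] = 0.8   # glare elsewhere: dim slightly
    return plan

# Two frames as {position: brightness}; each LED covers a set of positions.
frames = [{(0, 0): 250, (0, 1): 120}, {(0, 0): 255, (1, 1): 90}]
coverage = {"led_a": {(0, 0)}, "led_b": {(0, 1), (1, 1)}}
print(plan_dimming(frames, coverage, roi={(0, 0)}))
# → {'led_a': 0.5, 'led_b': 1.0}
```

Note the sketch dims per source rather than per pixel, which is the distinction the applicant argued and the examiner answered with Sinitsyn's plurality of outdoor lighting apparatuses.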

Prosecution Timeline

May 18, 2023: Application Filed
Jun 06, 2025: Non-Final Rejection — §103, §112
Sep 10, 2025: Response Filed
Dec 02, 2025: Final Rejection — §103, §112
Mar 05, 2026: Request for Continued Examination
Mar 09, 2026: Response after Non-Final Action
Mar 17, 2026: Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600319: VEHICLE DOOR INTERFACE SYSTEM (2y 5m to grant; granted Apr 14, 2026)
Patent 12587634: Disallowing Unnecessary Layers in Multi-Layer Video Bitstreams (2y 5m to grant; granted Mar 24, 2026)
Patent 12581103: VIDEO ENCODING/DECODING METHOD AND DEVICE, AND BITSTREAM STORAGE MEDIUM (2y 5m to grant; granted Mar 17, 2026)
Patent 12581126: LOW COMPLEXITY NN-BASED IN LOOP FILTER ARCHITECTURES WITH SEPARABLE CONVOLUTION (2y 5m to grant; granted Mar 17, 2026)
Patent 12572023: OPTICAL NAVIGATION DEVICE WITH INCREASED DEPTH OF FIELD (2y 5m to grant; granted Mar 10, 2026)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 76%
With Interview: 90% (+13.9%)
Median Time to Grant: 2y 7m
PTA Risk: High
Based on 662 resolved cases by this examiner. Grant probability derived from career allow rate.
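The projections above follow directly from the examiner counts reported earlier (505 granted of 662 resolved; a +13.9-point interview lift). A minimal sketch of that arithmetic, assuming the tool applies the lift additively; the function names here are illustrative, not the vendor's:

```python
# Sketch (not the vendor's actual model) of how the headline numbers
# follow from the reported counts: 505 granted / 662 resolved, with a
# +13.9-point lift observed in resolved cases that had an interview.

def career_allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def with_interview(base_rate: float, lift: float) -> float:
    """Additive interview lift, capped at 100%."""
    return min(base_rate + lift, 100.0)

base = career_allow_rate(505, 662)    # ~76.3%, shown as 76%
boosted = with_interview(base, 13.9)  # ~90.2%, shown as 90%
print(round(base), round(boosted))
# → 76 90
```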
