DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Applicant(s) Response to Official Action
The response filed on 12/03/2025 has been entered and made of record.
Response to Arguments/Amendments
Presented arguments have been fully considered, but are rendered moot in view of the new ground(s) of rejection necessitated by amendment(s) initiated by the applicant(s).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-13 are rejected under 35 U.S.C. 103 as being unpatentable over Wong et al., hereinafter referred to as Wong (US 2023/0012744 A1), in view of Sironi et al., hereinafter referred to as Sironi (US 2022/0036110 A1), and further in view of Dahlgren et al., hereinafter referred to as Dahlgren (US 2026/0006304 A1).
As per claim 1, Wong discloses an object tracking apparatus (Wong: Abstract) comprising:
at least one processor or circuit configured to function as (Wong: Paras. [0045], [0048] disclose a processor system (130) comprising a CPU and other components configured to control the system and execute programming.):
(1) an acquisition unit configured to acquire (a) an output of an event detection apparatus (EBS sensor) that detects an event based on a change in luminance of a pixel and (b) an output of an imaging apparatus (image sensor) that images an object at a predetermined frame rate (Wong: Paras. [0002], [0006], [0042], [0045], [0048] disclose the processor system (130) processing data output from the image sensor and EBS sensor (asynchronously detects a change in light intensity for every pixel), which include both event detection signals and imaging sensor data, thereby acquiring their outputs.);
(2) a first control unit configured to control the event detection apparatus and the imaging apparatus (Wong: Paras. [0048], [0059] disclose the processor system (130) controls the components of the imaging device, including the drive circuit (211) which drives the unit pixels.);
(4) a data processing unit configured to generate image data from the output of the imaging apparatus and the output of the event detection apparatus (Wong: Paras. [0010], [0048] disclose the processor system (130) processes both event and image sensor data to perform functions like object recognition, which requires generating usable data from the raw sensor outputs.); and
(5) a third control unit configured to perform detection and tracking control of an object by using the image data generated by the data processing unit (Wong: Paras. [0010], [0114], [0120] disclose performing object recognition and classification by evaluating events and using the image sensor to obtain further information about the object.),
wherein the second control unit performs control (Wong: Paras. [0007]-[0009], [0115]-[0119] disclose the concept of setting operating parameters of the image sensor [imaging apparatus] in response to outputs from the EBS sensor [event detection apparatus].).
However, Wong does not explicitly disclose "… a second control unit configured to control a detection condition for the event detection apparatus to detect an event; … wherein the second control unit performs control for setting a detection condition of the event detection apparatus according to the output of the imaging apparatus that images an object at the predetermined frame rate."
Further, Sironi is in the same field of endeavor and teaches a second control unit configured to control a detection condition for the event detection apparatus to detect an event (Sironi: Para. [0011] discloses for an event-based sensor, the “activation threshold Q can be fixed, or can be adapted”. Adapting this threshold is equivalent to controlling a detection condition.);
wherein the second control unit performs control for setting a detection condition of the event detection apparatus (Sironi: Paras. [0020], [0034], [0038] disclose a two-process method: a first process detects an object/region of interest (ROI) using a frame (image data) and a second process determines the track based on events in that ROI. The ROI determined by the frame-based process sets the detection condition (region) for the event-based apparatus.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, and having the teachings of Wong and Sironi before him or her, to modify the imaging system of Wong to include Sironi's feature of controlling and setting a detection condition of the event detection apparatus. The motivation for doing so would have been to improve overall system efficiency and robustness by providing an optimized configuration that modulates event sensor output data based on the imager's capturing operation.
However, Wong-Sironi do not explicitly disclose "… setting a detection condition of the event detection apparatus according to the output of the imaging apparatus that images an object at the predetermined frame rate …."
Furthermore, Dahlgren is in the same field of endeavor and teaches setting a detection condition (thresholds) of the event detection apparatus according to the output of the imaging apparatus that images an object at the predetermined frame rate (Dahlgren: Paras. [0138], [0145]-[0146], [0216], [0221] disclose that the high-resolution images captured at a synchronous rate, as well as any captured high-resolution video stream, may be analyzed based on thresholds detected in the change detectors 231 to automatically adjust the settings of the camera module 300.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, and having the teachings of Wong-Sironi and Dahlgren before him or her, to modify the imaging system of Wong-Sironi to include Dahlgren's feature of setting a detection condition of the event detection apparatus according to the output of the imaging apparatus that images an object at the predetermined frame rate. The motivation for doing so would have been to improve motion sensitivity in low-light conditions by providing a configuration that enables robust exposure adjustment for motion that would otherwise not be detected.
As per claim 2, Wong-Sironi-Dahlgren disclose the object tracking apparatus according to claim 1, wherein the second control unit performs control for setting a detection condition of the event detection apparatus by determining brightness (Sironi: luminance) or contrast of a captured image acquired by the imaging apparatus or a detection result of an object (Sironi: Paras. [0007]-[0012], [0034]-[0039] disclose updating the detection threshold in the event-based sensor when the luminance change exceeds a threshold of positive or negative polarity.).
As per claim 3, Wong-Sironi-Dahlgren disclose the object tracking apparatus according to claim 1, wherein the second control unit acquires information on a tracking object determined by the third control unit and performs control for setting a detection region of an event in an image generated by the event detection apparatus (Sironi: Paras. [0011], [0034]-[0039], [0102]-[0104] disclose using tracking object information from the fast process to update the ROI for event detection.).
As per claim 4, Wong-Sironi-Dahlgren disclose the object tracking apparatus according to claim 1, wherein the second control unit performs control for changing a detection threshold of an event in the event detection apparatus (Sironi: Paras. [0007]-[0012], [0034]-[0039] disclose updating the detection threshold in the event-based sensor when the luminance change exceeds a threshold of positive or negative polarity.).
As per claim 5, Wong-Sironi-Dahlgren disclose the object tracking apparatus according to claim 1, wherein the third control unit calculates a reliability of an object detection result based on a captured image acquired by the imaging apparatus and a reliability of an object detection result based on an image acquired by the event detection apparatus, and performs tracking control by using an object detection result in which the reliability is higher (Sironi: Paras. [0099]-[0104] disclose selection of detection results based on confidence/reliability from the slow vs. fast process.).
As per claim 6, Wong-Sironi-Dahlgren disclose the object tracking apparatus according to claim 1, wherein the second control unit performs control for changing a detection region of an event in an image acquired by the event detection apparatus according to a moving speed or a moving direction of a tracking object determined by the third control unit (Wong: Paras. [0008]-[0009] disclose the ROI of the image sensor is adjusted based on object speed/direction and Sironi: Paras. [0099]-[0104] disclose selection of detection results based on confidence/reliability from the slow vs. fast process.).
As per claim 7, Wong-Sironi-Dahlgren disclose the object tracking apparatus according to claim 1, wherein the data processing unit changes a framing cycle for generating a framed image from an output of the event detection apparatus according to a moving speed of an object detected by the third control unit (Wong: Paras. [0008]-[0009] disclose the frame rate of the image sensor is adjusted based on object speed and Sironi: Paras. [0020]-[0026], [0034]-[0039], [0093]-[0098] disclose events accumulated over a time interval to form images and the interval is adjusted based on object motion.).
As per claim 8, Wong-Sironi-Dahlgren disclose the object tracking apparatus according to claim 1, wherein the second control unit performs control for changing a size or a shape of a detection region of an event in an image generated by the event detection apparatus, in the control of the detection condition (Wong: Para. [0008] discloses the size and shape of the activated area can also be varied based on characteristics of the object detected by the EBS sensor and Sironi: Paras. [0020]-[0026], [0103] disclose the detection region is continuously re-defined based on object motion and shape.).
As per claim 9, Wong-Sironi-Dahlgren disclose the object tracking apparatus according to claim 1, wherein the event detection apparatus detects, as the event, a case where a signal based on intensity of light incident to an imaging element of the event detection apparatus increases exceeding a threshold or a case where the signal decreases below the threshold (Wong: Paras. [0042], [0057] disclose the event detection sensor responds to a change in intensity asynchronously; an intensity change is correlated with a change in photocurrent, and if this change exceeds a constant threshold value it is detected as an event. Further, Sironi: Paras. [0007]-[0012] disclose updating the detection threshold in the event-based sensor when the luminance change exceeds a threshold of positive or negative polarity.).
As per claim 10, Wong discloses an imaging system (Wong: Abstract) comprising:
an event detection apparatus that has an asynchronous imaging element and that detects an event based on a change in luminance of a pixel (Wong: Paras. [0002], [0006], [0042] disclose a system with an event-based sensor (EBS) that asynchronously detects a change in light intensity for every pixel.);
an imaging apparatus that has a synchronous imaging element and that captures an image of an object at a predetermined frame rate (Wong: Paras. [0002], [0006], [0009] disclose a regular frame-based image sensor that operates at a selected frame rate.); and
at least one processor or circuit configured to function as (Wong: Paras. [0045], [0048] disclose a processor system (130) comprising a CPU and other components configured to control the system and execute programming.):
(1) an acquisition unit configured to acquire an output of the event detection apparatus and an output of the imaging apparatus (Wong: Paras. [0045], [0048] disclose the processor system (130) processing data output from the image sensor and EBS sensor, which includes both event detection signals and imaging sensor data, thereby acquiring their outputs.);
(2) a first control unit configured to control the event detection apparatus and the imaging apparatus (Wong: Paras. [0048], [0059] disclose the processor system (130) controls the components of the imaging device, including the drive circuit (211) which drives the unit pixels.);
(4) a data processing unit configured to generate image data from the output of the imaging apparatus and the output of the event detection apparatus (Wong: Paras. [0010], [0048] disclose the processor system (130) processes both event and image sensor data to perform functions like object recognition, which requires generating usable data from the raw sensor outputs.); and
(5) a third control unit configured to perform object detection and tracking control by using image data generated by the data processing unit (Wong: Paras. [0010], [0114], [0120] disclose performing object recognition and classification by evaluating events and using the image sensor to obtain further information about the object.),
wherein the second control unit performs control (Wong: Paras. [0007]-[0009], [0115]-[0119] disclose the concept of setting operating parameters of the image sensor [imaging apparatus] in response to outputs from the EBS sensor [event detection apparatus].).
However, Wong does not explicitly disclose "… a second control unit configured to control a detection condition for the event detection apparatus to detect an event; … wherein the second control unit performs control for setting a detection condition of the event detection apparatus according to the output of the imaging apparatus that images an object at the predetermined frame rate."
Further, Sironi is in the same field of endeavor and teaches a second control unit configured to control a detection condition for the event detection apparatus to detect an event (Sironi: Para. [0011] discloses for an event-based sensor, the “activation threshold Q can be fixed, or can be adapted”. Adapting this threshold is equivalent to controlling a detection condition.);
wherein the second control unit performs control for setting a detection condition of the event detection apparatus according to an imaging state (Sironi: Paras. [0007], [0011], [0090], [0101]-[0102] disclose for an event-based sensor, the activation threshold Q can be adapted to an imaging state.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, and having the teachings of Wong and Sironi before him or her, to modify the imaging system of Wong to include Sironi's feature of controlling and setting a detection condition of the event detection apparatus. The motivation for doing so would have been to improve overall system efficiency and robustness by providing an optimized configuration that modulates event sensor output data based on the imager's capturing operation.
However, Wong-Sironi do not explicitly disclose "… performs control for setting a detection condition of the event detection apparatus according to the output of the imaging apparatus that images an object at the predetermined frame rate …."
Furthermore, Dahlgren is in the same field of endeavor and teaches performing control for setting a detection condition (thresholds) of the event detection apparatus according to the output of the imaging apparatus that images an object at the predetermined frame rate (Dahlgren: Paras. [0138], [0145]-[0146], [0216], [0221] disclose that the high-resolution images captured at a synchronous rate, as well as any captured high-resolution video stream, may be analyzed based on thresholds detected in the change detectors 231 to automatically adjust the settings of the camera module 300.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, and having the teachings of Wong-Sironi and Dahlgren before him or her, to modify the imaging system of Wong-Sironi to include Dahlgren's feature of setting a detection condition of the event detection apparatus according to the output of the imaging apparatus that images an object at the predetermined frame rate. The motivation for doing so would have been to improve motion sensitivity in low-light conditions by providing a configuration that enables robust exposure adjustment for motion that would otherwise not be detected.
As per claim 11, Wong-Sironi-Dahlgren disclose the imaging system according to claim 10, wherein the second control unit performs control for setting a detection condition of the event detection apparatus according to a moving speed of a tracking object or a relative moving speed between the imaging system and the tracking object (Wong: Paras. [0008]-[0009] disclose the ROI of the image sensor is adjusted based on object speed/direction and Sironi: Paras. [0099]-[0104] disclose selection of detection results based on confidence/reliability from the slow vs. fast process.).
As per claims 12-13, these claims recite limitations analogous to those of claims 1 and 10 above, and are therefore rejected for the same reasons.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure and can be viewed in the list of references.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PEET DHILLON whose telephone number is (571)270-5647. The examiner can normally be reached M-F: 5am-1:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sath V. Perungavoor can be reached at 571-272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PEET DHILLON/Primary Examiner
Art Unit: 2488
Date: 02-23-2026