Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/18/2025 has been entered.
Response to Amendment
Submission dated 11/18/2025 amends claims 1, 10, and 11. Claims 3 and 13 were previously cancelled. Claims 1, 2, 4-12, and 14-19 are pending.
Response to Arguments
Applicant’s arguments with respect to the independent claims have been considered but are moot because the new ground of rejection relies on a reference that was not previously applied and therefore has not been specifically challenged in the arguments.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 4-5, 7-9, 11-12, 14-15, and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Application Publication No. 2024/0200763 to Shih et al. (hereinafter Shih) in view of U.S. Patent Application Publication No. 2019/0059142 to Chen and further in view of U.S. Patent Application Publication No. 2015/0325092 to Zhevelev.
For claim 1, Shih as applied teaches an electronic monitoring system comprising:
an escutcheon (see, e.g., pars. 30-31 and FIG. 1, which teach a floodlight device having a main housing connected to a wallplate);
a camera module attached to the escutcheon (see, e.g., pars. 30-31 and FIG. 1, which teach a camera device attached to the main housing), the camera module including a microcontroller and a camera that is operatively connected to the microcontroller and that is adapted to capture video images within a camera field-of-view when the camera is activated (see, e.g., pars. 11, 30-31, 43-45, 52-53, and 81 and FIGS. 1, 4, 6, and 7, which teach that the camera device is attached to the main housing including a microcontroller unit, and that the camera device captures images or records video within the first PIR FOV of the camera when activated);
a first motion detector and a second motion detector configured to detect motion in respective first and second fields-of-view and being independently attached to the escutcheon at locations spaced from the camera module by articulated joints allowing each of the first motion detector and the second motion detector to swivel independently of each other and the camera module for adjustment of the first and second fields-of-view in elevation and azimuth independent of each other and the camera module (see, e.g., pars. 49-51 and FIGS. 6 and 7, which teach positioning multiple PIR sensors independently of each other and each having a respective FOV), the first motion detector and the second motion detector providing corresponding electronic motion signals to the microcontroller upon detection of motion in the respective first and second fields-of-view (see, e.g., pars. 11, 30-31, 43-45, 49-53, 57, and 81 and FIGS. 1, 4, 6, and 7, which teach that the PIR sensors communicate with the microcontroller upon detection of motion in the second PIR FOV); and
wherein the microcontroller is configured to activate the camera in response to an electronic motion signal of each or any of the first and second motion detectors (see, e.g., pars. 11, 30-31, 43-45, 52-53, and 81 and FIGS. 1, 4, 6, and 7, which teach that, responsive to motion detected by the PIR sensors, the microcontroller unit activates the camera).
While Shih as applied teaches attaching motion detectors independently of each other, it does not explicitly teach that they are attached at locations spaced from a camera module by articulated joints allowing adjustment of the first and second fields-of-view in elevation and azimuth independent of each other and the camera module, or that each motion detector provides a respective motion signal to the microcontroller. Chen in the analogous art teaches a triple head design that has two side motion sensing modules independently attached to the junction box at locations spaced from the center module by articulated joints allowing the side detectors to swivel independently of each other and also of the center module (see, e.g., par. 58 and FIGS. 9A-D of Chen), and that each of the motion sensors outputs a motion sensing signal to a centralized controller (see, e.g., pars. 57 and 61-63 of Chen).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih’s side modules to include and function as the motion detectors of Chen because doing so would yield predictable results of achieving a higher detection angle coverage from the motion detectors (see MPEP 2143(I)(D) and par. 58 of Chen).
While Shih in view of Chen does not explicitly teach, Zhevelev in the analogous art teaches that the main controlling system is configured to wirelessly transmit detection signals corresponding to the electronic motion signals to a mobile device (see, e.g., pars. 56, 59, 62, 66, 70-72, and 75-76 and FIGS. 2A-C of Zhevelev, which teach that the system allows the motion detection sensor to wirelessly communicate with the mobile communicator), the mobile device configured to generate instructions for a user to facilitate the movement of the first motion detector and the second motion detector to adjust the first and second fields-of-view to desired locations (see, e.g., pars. 55-56 and 59-63 and FIG. 2A of Zhevelev, which teach that the dedicated mobile communicator application installed on the mobile communicator is operable for facilitating remotely controlling orientation of the motion detection sensors so that they are aligned to monitor a desired portion of the area).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih in view of Chen to utilize a mobile device as taught by Zhevelev because doing so would provide an accessible visual output to the user, thereby allowing the user to verify the correct orientation of each sensor (see pars. 5 and 62-63 of Zhevelev).
For claim 2, Shih in view of Chen and Zhevelev teaches that the third motion detector is on the camera (see, e.g., pars. 52-53 and FIG. 7 of Shih, which teach that the first PIR FOV is the camera PIR FOV).
For claim 4, while Shih does not explicitly teach this limitation, Chen in the analogous art teaches that the motion detector is positioned with the floodlight (see, e.g., pars. 56-63 and 65 and FIGS. 8A-12D and 14A-D of Chen).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih’s floodlights to include the motion detectors as taught by Chen because doing so would yield predictable results of improving the security of an area by improving the visibility and coverage of the motion detection and lights (see MPEP 2143(I)(D)).
For claim 5, Shih in view of Chen and Zhevelev teaches an interface connecting the first and second motion detectors to the microcontroller, and wherein the first and second motion detectors and the camera are configured such that operation of the at least one floodlight of the first and second motion detectors can be controlled by the microprocessor via the interface (see, e.g., pars. 11, 43-45, 52-53, 57, and 81 and FIGS. 1 and 4, which teach having a PIR holder for mounting the microcontroller unit and that the PIR sensors can be operated by the microcontroller unit).
For claim 7, Shih in view of Chen and Zhevelev teaches that the motion detectors are passive infrared detectors (see, e.g., FIG. 6 of Shih showing PIRs).
For claim 8, Shih in view of Chen and Zhevelev teaches that the articulated joints provide rotation at constant azimuth and elevation (see, e.g., pars. 58-60, 67, and 99-101 and FIGS. 1-4 and 12A-B of Shih, which teach using a pivoting joint).
For claim 9, Shih as applied teaches that the floodlight of a given motion detector provides an area of illumination having a greatest width along a width axis (see, e.g., pars. 12 and 30, which teach that the light subassembly provides a wide illumination range), wherein the field-of-view of the given motion detector has a greatest width of detection along the width axis (see, e.g., pars. 49-53 and FIGS. 6-7 of Shih, which teach that the angular range of the PIR FOV is approximately 180 degrees horizontal while being 30 degrees vertical) and swiveling of the width axis about an axis of propagation of light from the floodlight (see, e.g., pars. 41-42 and FIG. 3, which teach that the floodlight is pivotally mounted and moveable in three different axes).
Shih as applied, however, does not explicitly teach that at least one of the first and second motion detectors further includes a floodlight, that the area of illumination has the greatest width along a width axis and that the articulated joints allow swiveling of the width axis about an axis of propagation of light from the floodlight. Chen in the analogous art teaches that the motion detector is positioned with the floodlight (see, e.g., pars. 56-63 and 65 and FIGS. 8A-12D and 14A-D of Chen), the area of illumination has the greatest width along a width axis (see, e.g., pars. 56-63 and 65 and FIGS. 8A-12D and 14A-D of Chen) and the articulated joints allow swiveling of the width axis about an axis of propagation of light from the floodlight (see, e.g., pars. 56-63 and 65 and FIGS. 8A-12D and 14A-D of Chen).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih’s floodlights to include the motion detectors and swivel as taught by Chen because doing so would yield predictable results of improving the security of an area by improving the visibility and coverage of the motion detection and lights (see MPEP 2143(I)(D)).
For claim 11, Shih as applied teaches a method of area monitoring comprising:
positioning a camera module including a microprocessor and a camera for capturing video images within a camera field-of-view when the camera is activated by the microprocessor (see, e.g., pars. 30-40 and 43-45 and FIGS. 1 and 4, which teach positioning the floodlight device including a microcontroller unit MCU communicatively coupled to a camera device);
positioning first and second motion detectors in spaced relation to the camera module and each other, the first and second motion detectors being allowed to swivel independently of each other and the camera module in elevation and azimuth, the first and second motion detectors having respective fields-of-view (see, e.g., pars. 49-51 and FIGS. 6 and 7, which teach positioning PIR sensors independently of each other in azimuth and each having a respective FOV) and providing electronic motion signals to the microprocessor upon detection of motion in the respective first and second fields-of-view (see, e.g., pars. 52-53 and FIG. 7, which teach activating the LEDs upon detection of motion by the PIR sensors); and
activating the camera with the microprocessor in response to at least one electronic motion signal of the first and second motion detectors (see, e.g., pars. 52-53 and FIG. 7, which teach signaling the camera device to activate its image sensor and begin capturing images or recording videos upon the detection of motion by the PIR sensors).
While Shih as applied teaches attaching motion detectors independently of each other, it does not explicitly teach that they are positioned at locations spaced from a camera module and each other and allowed to swivel independently of each other and the camera module in elevation and azimuth. Chen in the analogous art teaches a triple head design that has two side motion sensing modules independently attached to the junction box at locations spaced from the center module by articulated joints allowing the side detectors to swivel independently of each other and also of the center module in elevation and azimuth (see, e.g., par. 58 and FIGS. 9A-D of Chen).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih’s side modules to include and function as the motion detectors of Chen because doing so would yield predictable results of achieving a higher detection angle coverage from the motion detectors (see MPEP 2143(I)(D) and par. 58 of Chen).
While Shih in view of Chen does not explicitly teach, Zhevelev in the analogous art teaches that the main controlling system is configured to wirelessly transmit detection signals corresponding to the electronic motion signals to a mobile device (see, e.g., pars. 56, 59, 62, 66, 70-72, and 75-76 and FIGS. 2A-C of Zhevelev, which teach that the system allows the motion detection sensor to wirelessly communicate with the mobile communicator), the mobile device configured to generate instructions for a user to facilitate the movement of the first motion detector and the second motion detector to adjust the first and second fields-of-view to desired locations (see, e.g., pars. 55-56 and 59-63 and FIG. 2A of Zhevelev, which teach that the dedicated mobile communicator application installed on the mobile communicator is operable for facilitating remotely controlling orientation of the motion detection sensors so that they are aligned to monitor a desired portion of the area).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih in view of Chen to utilize a mobile device as taught by Zhevelev because doing so would provide an accessible visual output to the user, thereby allowing the user to verify the correct orientation of each sensor (see pars. 5 and 62-63 of Zhevelev).
For claim 12, Shih in view of Chen and Zhevelev teaches a third motion detector on the camera (see, e.g., pars. 52-53 and FIG. 7 of Shih, which teach that the first PIR FOV is the camera PIR FOV).
For claim 14, Shih as applied teaches that at least one of the first and second motion detectors further includes a floodlight and activates the floodlight in response to an electronic motion signal of the motion detector associated with the floodlight (see, e.g., pars. 52-53 and FIG. 7, which teach activating the LEDs in the light subassemblies based on detection of motion by the PIR sensors).
Shih as applied, however, does not explicitly teach that at least one of the first and second motion detectors further includes a floodlight. Chen in the analogous art teaches that the motion detector is positioned with the floodlight (see, e.g., pars. 56-63 and 65 and FIGS. 8A-12D and 14A-D of Chen).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih’s floodlights to include the motion detectors as taught by Chen because doing so would yield predictable results of improving the security of an area by improving the visibility and coverage of the motion detection and lights (see MPEP 2143(I)(D)).
For claim 15, Shih in view of Chen and Zhevelev teaches controlling operation of the floodlight of the at least one of the first and second motion detectors by transmitting signals to the first and second motion detectors from the microprocessor (see, e.g., pars. 52-53 and FIG. 7 of Shih, which teach that the MCU activates the LEDs in the light subassemblies based on detection of motion by the PIR sensors).
For claim 17, Shih in view of Chen and Zhevelev teaches that the motion detectors comprise passive infrared detectors (see, e.g., FIG. 6 of Shih showing PIRs).
For claim 18, Shih in view of Chen and Zhevelev teaches positioning the light subassemblies by rotation at constant azimuth and elevation (see, e.g., pars. 28, 33-39, 41, and 54-56 of Shih, which teach independently mounting each modular subassembly of a system onto a single housing via a pivot point that provides each modular subassembly rotation about three axes at a fixed, constant position).
For claim 19, Shih as applied teaches that the floodlight of a given motion detector provides an area of illumination having a greatest width along a width axis (see, e.g., pars. 12 and 30, which teach that the light subassembly provides a wide illumination range), wherein the field-of-view of the given motion detector has a greatest width of detection along the width axis (see, e.g., pars. 49-53 and FIGS. 6-7 of Shih, which teach that the angular range of the PIR FOV is approximately 180 degrees horizontal while being 30 degrees vertical) and swiveling of the width axis about an axis of propagation of light from the floodlight (see, e.g., pars. 41-42 and FIG. 3, which teach that the floodlight is pivotally mounted and moveable in three different axes).
Shih as applied, however, does not explicitly teach that at least one of the first and second motion detectors further includes a floodlight, that the area of illumination has the greatest width along a width axis, and that the articulated joints allow swiveling of the width axis about an axis of propagation of light from the floodlight. Chen in the analogous art teaches that the motion detector is positioned with the floodlight (see, e.g., pars. 56-63 and 65 and FIGS. 8A-12D and 14A-D of Chen), the area of illumination has the greatest width along a width axis (see, e.g., pars. 56-63 and 65 and FIGS. 8A-12D and 14A-D of Chen), and the articulated joints allow swiveling of the width axis about an axis of propagation of light from the floodlight (see, e.g., pars. 56-63 and 65 and FIGS. 8A-12D and 14A-D of Chen).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih’s floodlights to include the motion detectors and swivel as taught by Chen because doing so would yield predictable results of improving the security of an area by improving the visibility and coverage of the motion detection and lights (see MPEP 2143(I)(D)).
Claims 6 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Shih in view of Chen and Zhevelev and further in view of U.S. Patent No. 10,778,887 to Xu.
For claims 6 and 16, while Shih in view of Chen and Zhevelev does not explicitly teach, Xu in the analogous art teaches the camera field-of-view is larger in area than the field-of-view of the first and second motion detectors (see, e.g., lines 36-43 in col. 3, lines 10-36 in col. 4, lines 31-53 in col. 5, and lines 39-49 in col. 8 of Xu, which teach a camera of a home security system having a wide viewing angle, and the coverage zones of the motion sensors being adjusted to fall within the viewing angle, e.g., to not overlap with one another).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih in view of Chen to adjust the coverage zones to be less than the viewing angle of the corresponding camera as taught by Xu because doing so would allow meeting various design criteria of a particular implementation (see, e.g., lines 31-53 in col. 5 of Xu).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Shih in view of Chen and Zhevelev and further in view of U.S. Patent Application Publication No. 2022/0284783 to Wang.
For claim 10, Shih as applied teaches an electronic monitoring system comprising:
a housing adapted to attach to a line voltage (see, e.g., pars. 2, 7, 12, and 30-31 and FIG. 1, which teach a floodlight device having a main housing connected to a wallplate and including a power supply unit);
a camera module attached to the housing by an articulated joint movable in elevation and azimuth (see, e.g., pars. 30 and 33-40 and FIGS. 1 and 2, which teach a camera device mounted to the main housing with 3 axes of articulation), the camera module housing a microprocessor and a camera that is adapted to capture video images within a camera field-of-view when the camera is activated by the microprocessor (see, e.g., pars. 11, 30-31, 43-45, 52-53, and 81 and FIGS. 1, 4, 6, and 7, which teach that the camera device is attached to the main housing including a microcontroller unit, and that the camera device captures images or records video within the first PIR FOV of the camera when activated), the camera module further including a motion detector having a first motion field-of-view aligned with the camera field-of-view and providing an electronic motion signal to the microprocessor upon detection of motion in the first motion field-of-view (see, e.g., pars. 11-12, 28-29, 52-53, and 81 and FIG. 7, which teach having a PIR within the camera device and the microcontroller activating the camera based on detection of motion);
a second motion detector and third motion detector independent of the camera module, the second and third motion detectors being adapted to detect motion in respective second and third motion fields-of-view and being independently attached to the housing by articulated joints so as to allow the second motion detector and the third motion detector to swivel independently of each other and the camera module such that the second motion detector and the third motion detector are movable in elevation and azimuth (see, e.g., pars. 49-51 and FIGS. 6 and 7, which teach positioning multiple PIR sensors independently of each other and each having a respective FOV), the second and third motion detectors each providing a corresponding electronic motion signal to the microprocessor in the camera module upon detection of motion in the respective second and third fields-of-view (see, e.g., pars. 11, 30-31, 43-45, 49-53, 57, and 81 and FIGS. 1, 4, 6, and 7, which teach that the PIR sensors communicate with the microcontroller upon detection of motion in the second PIR FOV);
an interface connecting the second and third motion detectors to the microprocessor (see, e.g., pars. 11, 43-45, 52-53, 57, and 81 and FIGS. 1 and 4, which teach having a PIR holder for mounting the microcontroller unit);
wherein the second and third motion detectors and the microprocessor are configured such that operation of the floodlights of the second and third motion detectors can be controlled by the microprocessor via the interface (see, e.g., pars. 11, 43-45, 52-53, 57, and 81 and FIGS. 1 and 4, which teach that the PIR sensors can be operated by the microcontroller unit via the PIR holder); and
wherein the microprocessor and each of the motion detectors intercommunicate and are configured such that the microprocessor activates the camera for video recording in response to receipt of electronic motion signals from any of the first, second and third motion detectors (see, e.g., pars. 11, 30-31, 43-45, 52-53, and 81 and FIGS. 1 and 4, which teach that, responsive to motion detected by the PIR sensors, the microcontroller unit activates the camera).
While Shih as applied teaches attaching motion detectors independently of each other, it does not explicitly teach that they swivel independently of each other and the camera module such that they are moveable in elevation and azimuth independent of each other and the camera module, that the motion detectors are positioned with the floodlights, or that each motion detector provides a respective motion signal to the microcontroller. Chen in the analogous art teaches a triple head design that has two side motion sensing modules independently attached to the junction box at locations spaced from the center module by articulated joints allowing the side detectors to swivel independently of each other and also of the center module (see, e.g., par. 58 and FIGS. 9A-D of Chen), that the motion detectors are positioned with the lights (see, e.g., pars. 56-63 and 65 and FIGS. 8A-12D and 14A-D of Chen, which teach integrating a motion detector with a light), and that each of the motion sensors outputs a motion sensing signal to a centralized controller (see, e.g., pars. 57 and 61-63 of Chen).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih’s side modules to include and function as the motion detectors of Chen because doing so would yield predictable results of achieving a higher detection angle coverage from the motion detectors (see MPEP 2143(I)(D) and par. 58 of Chen).
While Shih in view of Chen does not explicitly teach, Zhevelev in the analogous art teaches that the main controlling system is configured to wirelessly transmit detection signals corresponding to the electronic motion signals to a mobile device (see, e.g., pars. 56, 59, 62, 66, 70-72, and 75-76 and FIGS. 2A-C of Zhevelev, which teach that the system allows the motion detection sensor to wirelessly communicate with the mobile communicator), the mobile device configured to generate instructions for a user to facilitate the movement of the second motion detector and the third motion detector to adjust the second and third fields-of-view to desired locations (see, e.g., pars. 55-56 and 59-63 and FIG. 2A of Zhevelev, which teach that the dedicated mobile communicator application installed on the mobile communicator is operable for facilitating remotely controlling orientation of the motion detection sensors so that they are aligned to monitor a desired portion of the area).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih in view of Chen to utilize a mobile device as taught by Zhevelev because doing so would provide an accessible visual output to the user, thereby allowing the user to verify the correct orientation of each sensor (see pars. 5 and 62-63 of Zhevelev).
Shih in view of Chen and Zhevelev does not explicitly teach housing the microcontroller within the camera module. Wang in the analogous art teaches having a controller within each camera module (see, e.g., pars. 58-66 and FIG. 1 of Wang).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shih in view of Chen and Zhevelev to include a microcontroller within the camera device as taught by Wang because doing so would yield predictable results of making the combined system more compact and more robust (see MPEP 2143(I)(D)).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WOO RHIM whose telephone number is (571)272-6560. The examiner can normally be reached Mon - Fri 9:30 am - 6:00 pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Henok Shiferaw can be reached at 571-272-4637. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WOO C RHIM/Examiner, Art Unit 2676