Prosecution Insights
Last updated: April 19, 2026
Application No. 18/555,710

DETECTING AN OBJECT IN AN ENVIRONMENT
Status: Non-Final Office Action (§103)

- Filed: Oct 16, 2023
- Examiner: CHOWDHURY, NIGAR
- Art Unit: 2484
- Tech Center: 2400 — Computer Networks
- Assignee: Essence Security International (E.S.I.) Ltd.
- OA Round: 3 (Non-Final)

Predictions:
- Grant Probability: 69% (Favorable)
- Expected OA Rounds: 3-4
- Expected Time to Grant: 3y 3m
- Grant Probability with Interview: 86%

Examiner Intelligence

- Career Allow Rate: 69%, above average (490 granted / 713 resolved; +10.7% vs Tech Center average)
- Interview Lift: +17.3% (strong); allowance rate among resolved cases with an interview vs. without
- Typical Timeline: 3y 3m average prosecution; 21 applications currently pending
- Career History: 734 total applications across all art units
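The headline figures above are simple ratios over the examiner's resolved cases. A minimal sketch of how they are derived, using the 490/713 counts shown on this page; the with/without-interview split in the example call is hypothetical, since the page reports only the resulting lift:

```python
# Career allowance rate from the counts shown above.
granted, resolved = 490, 713
allow_rate = granted / resolved
print(f"{allow_rate:.1%}")  # 68.7%, shown on the page rounded to 69%

def interview_lift(granted_with, resolved_with, granted_without, resolved_without):
    """Allowance rate among resolved cases with an examiner interview,
    minus the rate among resolved cases without one."""
    return granted_with / resolved_with - granted_without / resolved_without

# Hypothetical split for illustration only: 86% with vs. 68.7% without
# an interview would yield the +17.3% lift reported above.
print(f"{interview_lift(86, 100, 687, 1000):+.1%}")
```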

Statute-Specific Performance

- §101: 7.3% (-32.7% vs TC avg)
- §103: 50.7% (+10.7% vs TC avg)
- §102: 29.4% (-10.6% vs TC avg)
- §112: 3.3% (-36.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 713 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/12/2025 has been entered.

Response to Arguments

Applicant's arguments filed on 12/12/2025 have been fully considered but they are not persuasive. On pages 14-15, the applicant argues that “Although Gordon discusses switching from an inactive state to a higher-power, active state, Gordon does not disclose conditions upon which the camera switches back to the lower-power, inactive mode. Furthermore, Gordon fails to disclose the limitation in amended claim 41 of detecting a predefined condition to cause the camera to switch to a lower powered mode, that predefined condition comprising that the object has left the second monitoring region of the active reflected wave detector. Furthermore, Gordon's auxiliary motion sensor 122a may be unable to detect that an object has left the field of view 222. For example, an object that ceases to move within the field of view 222 of the auxiliary motion sensor 122a may cause a ceasing of motion detection. Thus, it may be indeterminable whether a cessation of motion detection is due to an object having left the field of view 222 or having merely stopped moving within the field of view 222. In any case, Gordon does not disclose an ability to detect an object having left the field of view 222 of the auxiliary motion sensor 122a.
Since Gordon does not teach any second monitoring region of the active reflected wave detector which is used to switch the camera to a lower powered mode in response to the object having left, it follows that Gordon fails to disclose that the claimed second monitoring region of the active reflected wave detector extends beyond the first monitoring region of the active reflected wave detector in at least one direction. Since Knasel also fails to disclose these features, Applicant submits that claim 41 is non-obvious over Knasel in view of Gordon.”

In response, the examiner respectfully disagrees. Knasel et al. discloses in paragraph 0037 that “In certain embodiments, a wide variety of control actions may be implemented or directed by the monitoring application 164 based upon determined movement or motion associated with a monitored subject. For example, a determination may be made that a monitored subject is entering the detectable area of a camera 125 (or other sensor), and the camera 125 (or other sensor) may be initiated (i.e., turned on, taken out of a sleep mode or power conservation mode, etc.).” Paragraph 0048 teaches “In various embodiments, it may be desirable for one or more cameras 125 to operate in a "keep alive" mode or a "sleep mode" until it is triggered by a slave sensor or another camera operating in a peer-to-peer mode on a local network. A camera 125 operating in a keep alive mode may be desirable to allow the camera 125 to operate on a battery and to preserve battery life. As desired in various embodiments, different cameras 125 may be activated and/or woken up as a monitored subject moves through a structure.
For example, data collected by one or more wave sensors 135 may be evaluated in order to track the movement of a subject through a structure, and cameras 125 may be selectively activated based upon the movement.” Paragraph 0061 teaches “In operation, one or more sensors, such as the motion detector 425, may identify the presence of a subject or object to be monitored. Based upon the identification of the subject, any number of wave sensors 430, 435 may be activated and utilized to track the location and/or movement of the subject. Additionally, based at least in part upon the tracked location and/or movement, one or more of the cameras 405, 410, 415 may be activated and/or awakened. For example, as the subject enters a viewable area of a camera, the camera may be activated. As another example, as the subject exits the viewable area of a camera, the camera may be deactivated or placed in a sleep mode.” Knasel et al. thus discloses a camera that is controllable to switch from one mode to another mode to prepare the camera to capture an image, and to a mode that consumes less power. Gordon et al. discloses, in fig. 2, col. 10 lines 7-22: “FIG. 2 depicts example fields-of-view of camera device 120 and auxiliary motion sensor 122a, in accordance with various aspects of the present disclosure. In FIG. 2, camera device 120 has a field-of-view 220. In the example of FIG. 2, the fields-of-view of image sensor 250 and motion sensor 258 of camera device 120 may be substantially aligned such that field-of-view 220 represents the field-of-view of both the image sensor 250 and motion sensor 258. Auxiliary motion sensor 122a may have a field-of-view 222. As depicted in FIG. 2, the field-of-view 220 of camera device 120 and the field-of-view 222 of auxiliary motion sensor 122a may overlap at overlap region 230.
In various examples, camera device 120 and auxiliary motion sensor 122a may be positioned such that overlap region 230 covers an area-of-interest (e.g., an area that a user desires to monitor with camera device 120).” Herein, Gordon et al. teaches the second monitoring region extends beyond the first monitoring region in at least one direction. Therefore, in view of the above, the examiner believes that the features of the claims are taught by the applied arts. See also the Office Action set forth below.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 41-42, 46, 48-50, 53-54, 57, 59-64, 71, and 74-75 are rejected under 35 U.S.C. 103 as being unpatentable over US 2012/0092502 by Knasel et al. in view of US 10,657,784 by Gordon et al.

Regarding claim 41, Knasel et al. discloses an apparatus for detecting an object in an environment (fig.
1), the apparatus comprising: a camera, wherein the camera is controllable to switch from a first mode to a second mode to prepare the camera to capture an image (paragraph 0061 teaches “Additionally, based at least in part upon the tracked location and/or movement, one or more of the cameras 405, 410, 415 may be activated and/or awakened. For example, as the subject enters a viewable area of a camera, the camera may be activated. As another example, as the subject exits the viewable area of a camera, the camera may be deactivated or placed in a sleep mode.”); and a processor configured to: determine whether a predetermined condition is met based on processing first measured wave reflection data obtained by an active reflected wave detector, the first measured wave reflection data obtained by an active reflected wave detector being associated with first wave reflections from a first monitoring region of the active reflected wave detector in said environment (in addition to discussion above, paragraph 0020 teaches “In operation, a wave sensor may emit a wave (e.g., a sound wave, etc.), and the wave sensor may monitor reflections of the wave. For example, the time between the output of a wave and the receipt of a wave reflection may be monitored. Based upon a determined time delay from output until reflection receipt, a distance between the wave sensors and an object that caused the reflection (e.g., a monitored subject, etc.) may be determined. In this regard, a wide variety of enhanced monitoring services may be provided. For example, a monitoring system may determine a location of a monitored subject, and the monitoring system may identify changes in the location of the monitored subject. 
In this regard, the monitoring system may track the movement of a monitored subject.”); and if the predetermined condition is met, perform at least one operation related to camera image data (in addition to discussion above, paragraph 0021 teaches “As desired, a monitoring system may implement or direct the implementation of a wide variety of control actions based upon tracking the movement of a monitored subject. As one example, the monitoring system may determine that a monitored subject is moving into the viewing area of a security camera, and the monitoring system may activate the security camera.”); wherein the processor is further configured to: process second measured wave reflection data obtained by the active reflected wave detector, the second measured wave reflection data obtained by the active reflected wave detector being associated with second wave reflections from a second monitoring region of the active reflected wave detector in said environment to detect a predefined condition comprising that the object has left the second monitoring region of the active reflected wave detector, and in response to detection of the predefined condition cause the camera to switch from the second mode to a mode that consumes less power than the second mode (in addition to discussion above, paragraph 0035-0037 teaches “A wide variety of suitable operations may be performed by the monitoring application 164 as desired in various embodiments of the invention. For example, the monitoring application may identify one or more sensors associated with a monitored area. These sensors may include one or more wave sensors 135. Additionally, the monitoring application 164 may determine a wide variety of profile information associated with the sensors (e.g., a covered area, configuration data, etc.), the monitored area (e.g., positions and/or dimensions of relatively stationary objects, etc.). 
In certain embodiments, at least a portion of the profile information may be collected during a learning mode and/or configuration mode of the monitoring application 164. For example, wave sensors may be utilized to determine dimensions of one or more objects in a monitored area, and at least a portion of the dimension information (as well as location or position information) may be stored. In certain embodiments, the monitoring application 164 may activate one or more wave sensors 135 based upon a detected presence of a subject to be monitored. For example, data collected from a suitable motion detector 120 may be evaluated in order to determine the presence of a subject, and the wave sensors 135 may be activated by the monitoring application 164 based at least in part upon the detected presence. Once activated, the wave sensors 135 may take measurements of the monitored area (e.g., timing measurements for wave reflections, etc.), and measurements data may be received and processed by the monitoring application 164. In this regard, the monitoring application 164 may track a subject located within the monitored area. For example, a location of the monitored subject may be determined, and changes in the location may be identified in order to track movement of the subject.”).

Knasel et al. fails to disclose wherein the second monitoring region of the active reflected wave detector extends beyond the first monitoring region of the active reflected wave detector in at least one direction. Gordon et al. discloses wherein the second monitoring region extends beyond the first monitoring region in at least one direction (fig. 2, col. 10 lines 7-22 teaches “FIG. 2 depicts example fields-of-view of camera device 120 and auxiliary motion sensor 122a, in accordance with various aspects of the present disclosure. In FIG. 2, camera device 120 has a field-of-view 220. In the example of FIG.
2, the fields-of-view of image sensor 250 and motion sensor 258 of camera device 120 may be substantially aligned such that field-of-view 220 represents the field-of-view of both the image sensor 250 and motion sensor 258. Auxiliary motion sensor 122a may have a field-of-view 222. As depicted in FIG. 2, the field-of-view 220 of camera device 120 and the field-of-view 222 of auxiliary motion sensor 122a may overlap at overlap region 230. In various examples, camera device 120 and auxiliary motion sensor 122a may be positioned such that overlap region 230 covers an area-of-interest (e.g., an area that a user desires to monitor with camera device 120).”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate a second monitoring region that extends beyond the first monitoring region in at least one direction, as taught by Gordon et al., into the system of Knasel et al., because such incorporation would provide the benefit of monitoring an extended region, thus increasing user accessibility of the system.

Regarding claim 42, the apparatus wherein the processor is configured to receive the camera image data from the camera, and the at least one operation comprises any one of: processing the camera image data for verification that the predetermined condition is met and/or for checking whether another condition is met (in addition to discussion above, Knasel et al., paragraph 0068 teaches “At block 610, the monitoring system may be activated. Additionally, the presence of a subject to be monitored may be detected. As desired, presence may be detected by any number of suitable detection devices associated with a monitored area of interest. For example, one or more suitable motion detectors may be utilized to detect the presence of an individual entering a monitored area and/or located within the monitored area.
A motion detector may be, for example, a traditional body heat sensor, a door contact, etc. As desired, the monitoring system (e.g., a control unit 115, etc.) may receive measurements data and/or a presence detection indication from the detection devices. In this regard, the monitoring system may determine that a subject to be monitored is located within an area of interest. Based at least in part upon the detected presence of a subject to be monitored, one or more suitable wave sensors associated with the area, such as the wave sensors 135 illustrated in FIG. 1, may be activated at block 615. For example, a control unit 115 may send one or more suitable signals to the wave sensors 135 in order to awaken and/or activate the wave sensors 135.”); transmitting the camera image data to a remote device for verification that the predetermined condition is met and/or for checking whether another condition is met; and transmitting a message to a remote device informing the remote device of the capture of said camera image data (in addition to discussion above, Knasel et al., paragraph 0069 teaches “At block 620, the wave sensors 135 may output sound waves and receive reflection data associated with the output waves. The wave sensors 135 may then communicate the reflection distance data and/or timing data associated with the detected reflections to the control unit 115. The control unit 115, either alone or in conjunction with a central server 110, may receive the measurements data. The received measurements data may be processed and/or evaluated at block 625 utilizing a wide variety of suitable evaluation techniques. For example, the measurements data may be compared to baseline data and/or expected data, such as data stored in a suitable profile. In this regard, deviations from the baseline data may be determined. As one example, the measurements data may be compared to baseline data associated with stationary objects in the monitored area.
Differences between the monitored data and the baseline data may then be identified.”, paragraph 0037 teaches “Based upon the alarm condition, a wide variety of additional actions may be taken, including, but are not limited to, the activation of an alarm (e.g., an audible alarm, etc.), the activation of one or more additional sensors and/or monitoring devices (e.g., cameras 125, audio detectors, etc.) that facilitate additional monitoring (e.g., monitoring by security personnel, etc.), the communication of an alert message (e.g., communicating a message to emergency personnel, communicating a message to an individual, communicating a message to monitoring system personnel, etc.), the communication of a message to a user device 150, and/or the escalation of an alert that has not been closed.”, paragraph 0042, 0062).

Regarding claim 46, the apparatus wherein the at least one operation comprises controlling the camera to capture an image of said environment, and in response receive said camera image data that is associated with said image from the camera (in addition to discussion above, Knasel et al., paragraph 0042 teaches “The data files 180 may include any suitable data that facilitates the general operation of a central server 110, and/or a determination of the response of the monitoring system to various sensor states (e.g. determining and/or processing various locations and/or movements, processing received alert and/or event data, etc.). For example, the data files 180 may include various settings information associated with any number of household monitoring systems. As another example, the data files 180 may include contact information and/or network data associated with the household monitoring systems and/or individual sensors.
As other examples, the data files 180 may include received measurements data (e.g., data collected by the sensors 120, 125, 130, 135) and/or received data associated with determined locations and/or tracked movements (i.e., location changes). A customer profile database 182 may include, for example, various application rules, preferences, and/or user profiles associated with one or more customers and/or profile information associated with desired control actions to take based upon identified alerts. The event data database 184 may include, for example, data associated with identified events (e.g., identified alert events, change in area events, etc.) and/or information associated with received alert events.”).

Regarding claim 48, the apparatus wherein in response to the processor detecting an activity in said environment, the processor is further configured to control the camera to capture an image of said environment, and in response receive, from the camera, said camera image data that is associated with said image (in addition to discussion above, Knasel et al., paragraph 0024 teaches “For example, an algorithm may be established that evaluates measurements data to determine a location or likely location of a monitored subject. Additionally, the algorithm may evaluate changes in measurements data to track movement of the monitored subject and, as desired, identify events (e.g., a subject entering a monitored area, a subject leaving a room, etc.) based upon the tracked movement. As desired in certain embodiments, stored profile information associated with relatively stationary objects within a monitored area, such as furniture, may be accessed. In this regard, received measurements data indicating these objects may be filtered out from an orientation analysis.”).

Regarding claim 49, the apparatus wherein in response to the processor detecting an activity in said environment, the processor is further configured to activate the active reflected wave detector to measure wave reflections from the environment to accrue said first measured wave reflection data (in addition to discussion above, Knasel et al. paragraph 0030 teaches “Additionally, in certain embodiments, the control unit 115 may be a suitable processor-driven device that facilitates the evaluation of parameters and/or monitoring data in order to determine presence, position, and/or movement associated with one or more monitored subjects.”, paragraph 0034 teaches “As desired, a monitoring application 164 associated with the control unit 115 and/or a central server 110 in communication with the control unit 115 may facilitate the collection of monitoring data (e.g., measurements data, motion detection data, etc.), the evaluation of monitoring data to detect the presence of a subject to be monitored, and evaluation of monitoring data to detect a location and/or to track movement or motion of the subject, the identification of alarm events and/or other events (e.g., room entering events, room exit events, entering the viewing area of a camera, etc.), and/or the execution of one or more control actions. The monitoring application 164 may be a suitable software module that receives the various inputs from sensors 120, 125, 130, 135 and executes one or more action(s) based at least in part upon the evaluation of the received inputs and/or instructions received from the central server 110.”). 
Regarding claim 50, the apparatus wherein in response to the processor detecting an activity in said environment, the processor is further configured to control the camera to switch from the first mode to the second mode (in addition to discussion above, Knasel et al., paragraph 0037 teaches “In certain embodiments, a wide variety of control actions may be implemented or directed by the monitoring application 164 based upon determined movement or motion associated with a monitored subject. For example, a determination may be made that a monitored subject is entering the detectable area of a camera 125 (or other sensor), and the camera 125 (or other sensor) may be initiated (i.e., turned on, taken out of a sleep mode or power conservation mode, etc.).”, paragraph 0048).

Regarding claim 53, the apparatus wherein said activity is motion detected in a motion detection monitoring region in said environment, and the processor is arranged to receive motion detection data associated with the motion detection monitoring region in said environment (in addition to discussion above, Knasel et al., paragraph 0034 teaches “As desired, a monitoring application 164 associated with the control unit 115 and/or a central server 110 in communication with the control unit 115 may facilitate the collection of monitoring data (e.g., measurements data, motion detection data, etc.), the evaluation of monitoring data to detect the presence of a subject to be monitored, and evaluation of monitoring data to detect a location and/or to track movement or motion of the subject, the identification of alarm events and/or other events (e.g., room entering events, room exit events, entering the viewing area of a camera, etc.), and/or the execution of one or more control actions.”, paragraph 0037 teaches “In certain embodiments, a wide variety of control actions may be implemented or directed by the monitoring application 164 based upon determined movement or motion associated with a monitored
subject. For example, a determination may be made that a monitored subject is entering the detectable area of a camera 125 (or other sensor), and the camera 125 (or other sensor) may be initiated (i.e., turned on, taken out of a sleep mode or power conservation mode, etc.). As another example, the pan, tilt, and/or other motion of a camera 125 may be controlled based upon the monitored movement of a subject.”, paragraph 0047).

Regarding claim 54, the apparatus further comprising a motion sensor configured to output said motion detection data (in addition to discussion above, Knasel et al., paragraph 0047 teaches “As desired in certain embodiments of the invention, a monitoring system 100 may include any number of sensors and/or cameras 125 that may function in a peer-to-peer mode on a local network or as a combination of peer-to-peer devices. Any number of the peer devices may have slave devices. For example, if a presence sensor (e.g., a motion detector, etc.) is triggered that would indicate the presence of subject, then one or more wave sensors 135 may be activated in order to determine a location and/or to track movement associated with the subject.”).

Regarding claim 57, the apparatus wherein the camera consumes more power when in the second mode than in the first mode (in addition to discussion above, Knasel et al., paragraph 0037 teaches “For example, a determination may be made that a monitored subject is entering the detectable area of a camera 125 (or other sensor), and the camera 125 (or other sensor) may be initiated (i.e., turned on, taken out of a sleep mode or power conservation mode, etc.).”).
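Taken together, the limitations of claims 41, 50, and 57 recite hysteresis: the camera wakes to a higher-power mode when the wave detector finds an object in the (smaller) first monitoring region, and drops back to a lower-power mode only once the object has left the (larger) second monitoring region. A minimal sketch of that state machine, with all names and threshold values hypothetical rather than drawn from the claims or either reference:

```python
# Hysteresis sketch of the claimed mode switching. Because the
# release region (second monitoring region) extends beyond the wake
# region (first monitoring region), a subject lingering near the edge
# does not make the camera oscillate between power modes.
LOW_POWER, ARMED = "low_power", "armed"  # hypothetical mode names

class CameraController:
    def __init__(self, wake_range_m=3.0, release_range_m=5.0):
        # The second region extends beyond the first in at least range.
        assert release_range_m > wake_range_m
        self.wake_range_m = wake_range_m
        self.release_range_m = release_range_m
        self.mode = LOW_POWER

    def on_reflection(self, object_range_m):
        """Update the camera mode from one ranging measurement.

        object_range_m: distance to the detected object in metres, or
        None if no object is detected anywhere in the second region.
        Returns the resulting mode.
        """
        if self.mode == LOW_POWER:
            # Predetermined condition: object inside the first region.
            if object_range_m is not None and object_range_m <= self.wake_range_m:
                self.mode = ARMED
        else:
            # Predefined condition: object has left the second region.
            if object_range_m is None or object_range_m > self.release_range_m:
                self.mode = LOW_POWER
        return self.mode
```

With these example thresholds, a subject at 4 m does not wake a low-power camera, but an armed camera stays armed at 4 m and releases only once the subject passes 5 m or disappears, which is the asymmetry the amended claim 41 language turns on.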
Regarding claim 59, the apparatus wherein the second monitoring region of the active reflected wave detector encompasses the first monitoring region of the active reflected wave detector (in addition to discussion above, Knasel et al., paragraph 0019 teaches “As another example, a plurality of wave sensors may be arranged in a grid or other configuration about a monitored area (e.g., a room, etc.). For example, each sensor in a grid may be positioned at a different angle in order to provide coverage for a desired area. As yet another example, a plurality of wave sensors may be utilized to form a phased array that is positioned within a monitored area (e.g., positioned on a wall, on a ceiling, in a corner, etc.). The wave sensors may be used to determine presence or may be used in conjunction with another sensor or motion detector to determine presence. Once presence is determined, the wave sensors may be monitored to detect location and movement (e.g., changes in position) of an object of interest.”, paragraph 0024 teaches “For example, an algorithm may be established that evaluates measurements data to determine a location or likely location of a monitored subject. Additionally, the algorithm may evaluate changes in measurements data to track movement of the monitored subject and, as desired, identify events (e.g., a subject entering a monitored area, a subject leaving a room, etc.) based upon the tracked movement. As desired in certain embodiments, stored profile information associated with relatively stationary objects within a monitored area, such as furniture, may be accessed. In this regard, received measurements data indicating these objects may be filtered out from an orientation analysis”, paragraph 0033 teaches “For example, a camera 125 may be activated based upon a determination that a monitored subject has entered a coverage or monitoring area associated with the camera 125.
As another example, the pan and/or tilt of a camera may be controlled based upon the determined movement of the monitored subject. As yet another example, an alarm event may be triggered by a security monitoring system based upon the identification of a break-in or unauthorized entry to a monitored area (e.g., detected presence combined with a determination that a monitored subject is entering a monitored area, etc.).”, paragraph 0068 teaches “As desired, presence may be detected by any number of suitable detection devices associated with a monitored area of interest. For example, one or more suitable motion detectors may be utilized to detect the presence of an individual entering a monitored area and/or located within the monitored area. A motion detector may be, for example, a traditional body heat sensor, a door contact, etc. As desired, the monitoring system (e.g., a control unit 115, etc.) may receive measurements data and/or a presence detection indication from the detection devices. In this regard, the monitoring system may determine that a subject to be monitored is located within an area of interest. Based at least in part upon the detected presence of a subject to be monitored, one or more suitable wave sensors associated with the area, such as the wave sensors 135 illustrated in FIG. 1, may be activated at block 615.”).

Regarding claim 60, the apparatus wherein the predetermined condition comprises that the object is detected in the first monitoring region of the active reflected wave detector (in addition to discussion above, Knasel et al., paragraph 0024 teaches “For example, an algorithm may be established that evaluates measurements data to determine a location or likely location of a monitored subject. Additionally, the algorithm may evaluate changes in measurements data to track movement of the monitored subject and, as desired, identify events (e.g., a subject entering a monitored area, a subject leaving a room, etc.)
based upon the tracked movement. As desired in certain embodiments, stored profile information associated with relatively stationary objects within a monitored area, such as furniture, may be accessed. In this regard, received measurements data indicating these objects may be filtered out from an orientation analysis”).

Regarding claim 61, the apparatus wherein the predetermined condition further comprises that the object is determined to be human (in addition to discussion above, Knasel et al., paragraph 0003, 0021, 0023 teaches “In certain embodiments, the wave sensors may be activated by sensing gross activity or presence of a subject to be monitored. For example, by using a device such as a body heat motion detector or another convenient device/method, the wave sensors may be activated once a subject enters a monitoring area.”).

Regarding claim 62, the apparatus wherein the predetermined condition comprises that the object is located in a predetermined area within the first monitoring region of the active reflected wave detector (in addition to discussion above, Knasel et al., paragraph 0024 teaches “For example, an algorithm may be established that evaluates measurements data to determine a location or likely location of a monitored subject. Additionally, the algorithm may evaluate changes in measurements data to track movement of the monitored subject and, as desired, identify events (e.g., a subject entering a monitored area, a subject leaving a room, etc.) based upon the tracked movement. As desired in certain embodiments, stored profile information associated with relatively stationary objects within a monitored area, such as furniture, may be accessed. In this regard, received measurements data indicating these objects may be filtered out from an orientation analysis.”).
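Claims 60-62 layer additional predicates onto the predetermined condition: detection in the first monitoring region, a human classification, and location inside a predetermined sub-area of that region. In effect the condition is a conjunction of independent checks on one detection report, which can be sketched as follows (the dictionary keys and region format are hypothetical, chosen only for illustration):

```python
# Hypothetical sketch: the predetermined condition of claims 60-62 as a
# conjunction of checks on a single detection report.
def predetermined_condition_met(detection, predetermined_area):
    """detection: dict with 'in_first_region' (bool), 'is_human' (bool),
    and 'position' as (x, y) coordinates in metres.
    predetermined_area: ((xmin, ymin), (xmax, ymax)) bounding box."""
    (xmin, ymin), (xmax, ymax) = predetermined_area
    x, y = detection["position"]
    return (detection["in_first_region"]                 # claim 60
            and detection["is_human"]                    # claim 61
            and xmin <= x <= xmax and ymin <= y <= ymax) # claim 62
```

Any single failed check (for example, a non-human object squarely inside the area) leaves the condition unmet, which is why each dependent claim narrows the independent claim rather than restating it.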
Regarding claim 63, the apparatus wherein the predefined condition comprises that the object has stayed outside of the second monitoring region of the active reflected wave detector for a predetermined amount of time (in addition to discussion above, Knasel et al., paragraph 0071-0073 teaches “If it is determined at block 635 that a location has not changed, then operations may continue at block 620, and monitoring may continue. If, however, it is determined at block 635 that a location has changed (i.e., movement is detected), then operations may continue at block 640. Although block 635 described the detection of movement or changes in location, in certain embodiments, measurements data may be evaluated in order to determine when a moving subject has stopped. As desired, any number of suitable control actions may be taken based upon the determination that the subject has stopped”).

Regarding claim 64, the apparatus wherein the predefined condition comprises that the object has left a third monitoring region of the active reflected wave detector that extends beyond the second monitoring region of the active reflected wave detector in at least one direction (in addition to discussion above, Knasel et al., paragraph 0024 teaches “Additionally, the algorithm may evaluate changes in measurements data to track movement of the monitored subject and, as desired, identify events (e.g., a subject entering a monitored area, a subject leaving a room, etc.) based upon the tracked movement. As desired in certain embodiments, stored profile information associated with relatively stationary objects within a monitored area, such as furniture, may be accessed.
In this regard, received measurements data indicating these objects may be filtered out from an orientation analysis.”, paragraph 0034 teaches “As desired, a monitoring application 164 associated with the control unit 115 and/or a central server 110 in communication with the control unit 115 may facilitate the collection of monitoring data (e.g., measurements data, motion detection data, etc.), the evaluation of monitoring data to detect the presence of a subject to be monitored, and evaluation of monitoring data to detect a location and/or to track movement or motion of the subject, the identification of alarm events and/or other events (e.g., room entering events, room exit events, entering the viewing area of a camera, etc.), and/or the execution of one or more control actions.”).

Regarding claim 71, the apparatus wherein the active reflected wave detector is a ranging sensor (in addition to discussion above, Knasel et al., paragraph 0020, paragraph 0052 teaches “In certain embodiments, a wave sensor 205 may measure the time it takes for a wave reflection to be detected after a wave is emitted. In this regard, the wave sensor 205 may determine a distance to an object. Additionally, in certain embodiments, a distance or time to a known object, such as a door or wall, may be utilized in order to identify an object between the wave sensor and the known object.”).

Regarding claim 73, the apparatus comprising a housing holding the processor, the housing additionally holding one or any combination of: the active reflected wave detector, and the camera (in addition to discussion above, Knasel et al., paragraph 0019 teaches “For example, one or more wave sensors may be integrated into or otherwise associated with a security camera (e.g., incorporated into a camera housing, placed in proximity to a camera, etc.). As another example, a plurality of wave sensors may be arranged in a grid or other configuration about a monitored area (e.g., a room, etc.). For example, each sensor in a grid may be positioned at a different angle in order to provide coverage for a desired area.”, paragraph 0026-0027 teaches “For example, various system components may be situated within a household 105. Additionally, the system 100 may include a central server 110 configured to receive data, such as sensor data, monitoring data, and/or generated alerts or other communications from devices associated with the household 105”).

Claim 74 is rejected for the same reason as discussed in the corresponding claim 41 above.

Claim 75 is rejected for the same reason as discussed in the corresponding claim 41 above.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NIGAR CHOWDHURY whose telephone number is (571)272-8890. The examiner can normally be reached Monday-Friday 9AM-5PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thai Tran can be reached on 571-272-7382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/NIGAR CHOWDHURY/
Primary Examiner, Art Unit 2484
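The ranging principle the examiner cites for claim 71 (Knasel, paragraph 0052: measuring the time between wave emission and reflection detection, and comparing against a known distance to a door or wall) reduces to simple round-trip arithmetic. A minimal sketch of that reasoning, assuming an ultrasonic sensor and a propagation speed of about 343 m/s in air (the wave type, the speed, and all function names here are illustrative assumptions, not details from the cited reference):

```python
def echo_distance_m(round_trip_s: float, wave_speed_m_s: float = 343.0) -> float:
    """Distance to a reflecting object from a round-trip echo time.

    The emitted wave travels to the object and back, so the one-way
    distance is half the total path length.
    """
    if round_trip_s < 0:
        raise ValueError("round-trip time cannot be negative")
    return wave_speed_m_s * round_trip_s / 2.0


def object_between(baseline_m: float, measured_m: float,
                   tolerance_m: float = 0.05) -> bool:
    """Knasel-style check: given a known baseline distance (e.g. to a
    wall), an echo that returns meaningfully sooner than the baseline
    suggests an intervening object between the sensor and the wall."""
    return measured_m < baseline_m - tolerance_m


# A 10 ms round trip at ~343 m/s corresponds to ~1.715 m one-way.
distance = echo_distance_m(0.010)
intruder = object_between(baseline_m=3.0, measured_m=distance)
```

The tolerance guard illustrates why such a sensor needs a margin: echo timing jitter near the baseline distance would otherwise flicker between "object" and "no object".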

Prosecution Timeline

Oct 16, 2023
Application Filed
Feb 14, 2025
Non-Final Rejection — §103
Jun 11, 2025
Response Filed
Oct 09, 2025
Final Rejection — §103
Dec 12, 2025
Response after Non-Final Action
Jan 22, 2026
Request for Continued Examination
Jan 29, 2026
Response after Non-Final Action
Mar 06, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601856
DEVICE AND METHOD FOR MULTI-ANGLE STEREOSCOPIC IMAGING MEASUREMENT OF PRECIPITATION PARTICLES
2y 5m to grant Granted Apr 14, 2026
Patent 12600311
VIDEO RECORD SYSTEM FOR VEHICLE, METHOD OF CONTROLLING THE VIDEO RECORD SYSTEM, AND USER TERMINAL
2y 5m to grant Granted Apr 14, 2026
Patent 12604071
SYSTEM AND METHOD FOR GENERATING A CUSTOM SUMMARY OF UNCONSUMED PORTIONS OF A SERIES OF MEDIA ASSETS
2y 5m to grant Granted Apr 14, 2026
Patent 12592260
VIDEO GENERATION METHOD AND APPARATUS, ELECTRONIC DEVICE, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 31, 2026
Patent 12591167
ENCLOSURES FOR ACCOMMODATING BOARD STACKS
2y 5m to grant Granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
69%
Grant Probability
86%
With Interview (+17.3%)
3y 3m
Median Time to Grant
High
PTA Risk
Based on 713 resolved cases by this examiner. Grant probability derived from career allow rate.
