DETAILED ACTION
This action is in response to the communication filed on 22 October 2025. Claim 1 is amended, claims 5-12 are added, and no claims have been canceled. Claims 1-12 are pending in the application and have been considered below.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The amendment to the specification filed on 22 October 2025 has been entered.
Response to Arguments
Applicant argues: "For example, claim 1 recites in part: determining a duration of activity that occurred at each of the individual locations during the time period. The cited references have not been shown to recite these features. The rejection of claim 1 relies on Yan as allegedly disclosing determining a duration of activity that occurred at each of multiple individual locations during a time period. But the cited disclosure of Yan is deficient because Yan does not measure a duration of activity and is limited to activity at a single location" (Remarks, pages 7-8). Examiner respectfully disagrees.
Applicant’s own specification discloses “The motion data includes motion indicator values and motion localization values for the plurality of locations. The motion indicator values may be indicative of a degree of motion that occurred in the space for each time point in a series of time points within the time period. The motion localization value for each individual location may represent a relative degree of motion detected at the individual location for each time point in the series of time points within the time period” (par. 0137).
Examiner relies on a combination of KRAVETS and YAN for teaching the limitations of independent claim 1. In particular, KRAVETS teaches determining an aggregate degree of motion that occurred at each of the individual locations during the time period (see cited par. 0070, wherein if motion is detected in the space by a device, e.g., at a neighbor device, then a motion indicator value (MIV) is computed by the device. The MIV represents a degree of motion detected by the device based on the wireless signals transmitted or received by the device. For instance, higher MIVs can indicate a high level of channel perturbation (due to the motion detected), while lower MIVs can indicate lower levels of channel perturbation. Higher levels of channel perturbation may indicate motion in close proximity to the device. The MIVs may include aggregate MIVs, representing a degree of motion detected in the aggregate by the respective device 402; emphasis added). Furthermore, par. 0057 of KRAVETS discloses "FIG. 7 is a diagram showing the example system 400 of FIGS. 4A-4B with a person 702 in the space. Also shown in FIG. 7 is a spatial map 710 for the mode represented by the object 410. When the person 702 moves within the space as shown, the incident angle 720 of the wireless signals to the antennas of the receiver 404 will change, causing a corresponding change in the maximum 730 of the spatial map 710. Thus, by analyzing or tracking changes in the maxima of spatial maps, motion of an object (e.g., the person 702 or another type of object) may be detected in a space accessed by wireless signals. In addition, a relative location of the detected motion may be determined using the spatial maps, since the maxima may indicate a direction of the object scattering the signals in the space" (par. 0057; emphasis added). Moreover, par. 0045 of KRAVETS further discloses "Motion may also be localized by the WAP 302 based on changes in the respective beamforming matrices for each connection with a client device 304 … The motion information can then be sent to a hub device (e.g., one of the WAPs 302) or another device (e.g., the server 308) to analyze the motion information and make an overall determination of whether motion has occurred in the space, detect a location of detected motion …" (emphasis added; see also pars. 0017-0019 and 0036-0039).
Additionally, Examiner relies on YAN for teaching determining a duration of activity that occurred at each of the individual locations during the time period (see par. 0033, wherein motion sensor device 104 can include a housing and various electronic components encased within the housing including at least a sensor module configured to capture motion data in response to motion of the motion sensor device 104 over a period of time (e.g., sampling period). When worn by user 102, the captured motion data directly corresponds to motion of the user 102. The sensor module of motion sensor device 104 can include various motion sensors including one or more of an accelerometer, a gyroscope, a magnetometer, and/or an IMU. Thus captured motion data can include information identifying acceleration, rotation/orientation, and/or velocity of the motion sensor device 104. Motion sensor device 104 can also include a timer to relate captured motion data as a function of time; emphasis added). Furthermore, par. 0049 further discloses “sampling periods can be programmed/controlled at a remote device” (emphasis added). Par. 0047 further discloses “sensor module 204 can also facilitate determining a location of the motion sensing device 104. For example, sensor module 204 can include a global positioning system (GPS) client to facilitate determining its GPS location at a time when motion data is captured” (emphasis added) (see also pars. 0041-0043). Therefore, the combination of KRAVETS and YAN teaches determining a duration of activity that occurred at each of the individual locations during the time period.
Thus, the combination of KRAVETS and YAN adequately discloses Applicant's claimed limitation. Examiner respectfully reminds Applicant that, during examination, the claims must be interpreted as broadly as their terms reasonably allow. In re American Academy of Science Tech Center, 367 F.3d 1359, 1369, 70 U.S.P.Q.2d 1827, 1834 (Fed. Cir. 2004).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-12 are rejected under 35 U.S.C. 103 as being unpatentable over KRAVETS et al. (US20190146075A1) in view of YAN et al. (US20160210838A1).
As to claim 1, KRAVETS teaches a method of operating a motion detection system (See figs. 5A-5B, par. 0008, wherein FIGS. 5A-5B are diagrams showing an example spatial map generation process for a first mode of a motion detection system; as taught by KRAVETS),
the method comprising: by operation of one or more communication interfaces (See fig. 1, par. 0022, wherein FIG. 1 illustrates an example wireless communication system 100. The example wireless communication system 100 includes three wireless communication devices—a first wireless communication device 102A, a second wireless communication device 102B, and a third wireless communication device 102C; as taught by KRAVETS),
receiving radio frequency wireless signals (See fig. 1, par. 0035, wherein the wireless communication devices 102A, 102B transmit wireless signals (e.g., according to a wireless network standard, a motion detection protocol, or otherwise). For instance, wireless communication devices 102A, 102B may broadcast wireless signals (e.g., reference signals, beacon signals, status signals, etc.), or they may send wireless signals addressed to other devices (e.g., a user equipment, a client device, a server, etc.), and the other devices (not shown) as well as the wireless communication device 102C may receive the wireless signals transmitted by the wireless communication devices 102A, 102B; see also par. 0038, wherein each wireless communication device 102 detects motion in the motion detection fields 110 accessed by that device by processing received signals that are based on wireless signals transmitted by the wireless communication devices 102 through the motion detection fields 110. For example, when the person 106 shown in FIG. 1 moves in the first motion detection field 110A and the third motion detection field 110C, the wireless communication devices 102 may detect the motion based on signals they received that are based on wireless signals transmitted through the respective motion detection fields 110; as taught by KRAVETS)
communicated through a space (See fig. 1, par. 0036, wherein the wireless communication device 102C processes the wireless signals from the wireless communication devices 102A, 102B to detect motion of an object in a space accessed by the wireless signals, to determine a location of the detected motion, or both; as taught by KRAVETS)
over a time period by a plurality of wireless communication devices (See fig. 9, par. 0064, wherein motion may be detected based on a change over time seen in a feedback matrix obtained at 902. As another example, motion may be detected based on a change over time seen in a spatial map generated at 904; as taught by KRAVETS),
the space comprising a plurality of locations (See par. 0036, wherein the space can be or can include an interior of a room, multiple rooms, a building, or the like; as taught by KRAVETS);
processing the radio frequency wireless signals to generate channel information (See par. 0027, wherein the modem 112 may be configured to communicate radio frequency (RF) signals formatted according to a wireless communication standard (e.g., Wi-Fi or Bluetooth). The modem 112 may be implemented as the example transmitter 212 or receiver 222 shown in FIG. 2B; see also fig. 2, par. 0041, wherein the beamformee 220 determines channel state information (CSI) 224 based on the signal(s) received at the receiver 222. The beamformee 220 then computes, using the feedback matrix calculator 226, a feedback matrix 204 based on the CSI 224; as taught by KRAVETS);
by operation of a motion detection engine of a motion detection system, generating motion data based on the channel information (See par. 0038, wherein each wireless communication device 102 detects motion in the motion detection fields 110 accessed by that device by processing received signals that are based on wireless signals transmitted by the wireless communication devices 102 through the motion detection fields 110. For example, when the person 106 shown in FIG. 1 moves in the first motion detection field 110A and the third motion detection field 110C, the wireless communication devices 102 may detect the motion based on signals they received that are based on wireless signals transmitted through the respective motion detection fields 110; as taught by KRAVETS),
the motion data comprising a series of vectors in a series of time points within the time period (See par. 0042, wherein the system 200 can be modeled by the equation Yk = Hk Xk + n, where Xk represents a vector [x1, x2, . . . , xn] transmitted in subcarrier frequency k by the transmitter 212, Yk represents a vector [y1, y2, . . . , yn] received by the receiver 222, and Hk represents a channel response matrix of dimensions NRX×NTX … The system 200 can thus be modeled by the equation Yk = Hk Qk Xk + n, where Qk is a matrix of dimension NTX×NSTS (where NSTS is the number of elements in Xk); as taught by KRAVETS),
the motion data comprising: motion indicator values indicative of a degree of motion that occurred in the space for each time point in the series of time points within the time period (See par. 0070, wherein if motion is detected in the space by a device, e.g. at a neighbor device, then a motion indicator value (MIV) is computed by the device. The MIV represents a degree of motion detected by the device based on the wireless signals transmitted or received by the device; as taught by KRAVETS);
and motion localization values for the plurality of locations (See par. 0045, wherein Motion may also be localized by the WAP 302 based on changes in the respective beamforming matrices for each connection with a client device 304. In mesh examples (e.g., the wireless communication system 320), sounding and beamforming is performed between WAPs 302 and their respective client devices 304 and motion information is determined at each of the WAPs 302. The motion information can then be sent to a hub device (e.g., one of the WAPs 302) or another device (e.g., the server 308) to analyze the motion information and make an overall determination of whether motion has occurred in the space, detect a location of detected motion, or both; see also par. 0075, wherein a motion detection system with the configuration of FIG. 12C may collect enough data to be able to provide unsupervised multi-target localization and target count (e.g. track multiple movements and objects), provide statistical target identification (e.g. based on labeling a movement that is statistically the same overtime, such as, a slow moving person compared to a fast moving person, or an electric fan), and reliably detect respiratory activity (e.g. when a person or animal is stationary; as taught by KRAVETS),
the motion localization value for each individual location representing a relative degree of motion detected at the individual location for each time point in the series of time points within the time period (See figs. 12A-12C, par. 0074, wherein improved motion detection devices 1240 may be configured with motion detection hardware, e.g. dedicated hardware, but may additionally support connectivity to APs 1210. This type of configuration utilizing additional nodes, and therefore, additional data for detecting motion, in the motion detection system as illustrated in FIG. 12B provides improved data collection and thus, improved features, such as, occupancy detection (e.g. whether a person or object is present), localization of motion based on unsupervised machine learning (determining where motion is occurring without human input/assistance), and enhanced motion type detection (e.g. is a human/dog/cat/etc. moving); as taught by KRAVETS);
wherein processing the series of vectors comprises: determining an aggregate degree of motion that occurred at each of the individual locations during the time period (See par. 0070, wherein if motion is detected in the space by a device, e.g. at a neighbor device, then a motion indicator value (MIV) is computed by the device. The MIV represents a degree of motion detected by the device based on the wireless signals transmitted or received by the device. For instance, higher MIVs can indicate a high level of channel perturbation (due to the motion detected), while lower MIVs can indicate lower levels of channel perturbation. Higher levels of channel perturbation may indicate motion in close proximity to the device. The MIVs may include aggregate MIVs (representing a degree of motion detected in the aggregate by the respective device 402), link MIVs (representing a degree of motion detected on particular communication links between respective devices 402), path MIVs (representing a degree of motion detected on particular communication paths between hardware signal paths of respective devices 402), or a combination thereof; see also par. 0060; as taught by KRAVETS).
KRAVETS does not expressly teach by operation of a pattern extraction engine of the motion detection system, processing the series of vectors to generate output data for the time period, determining a duration of activity that occurred at each of the individual locations during the time period; and based on the output data generated by the pattern extraction engine, sending a notification from the motion detection system for processing by an external system.
In similar field of endeavor, YAN teaches by operation of a pattern extraction engine of the motion detection system (See Figs. 1 and 7-9, par. 0018, wherein captured motion data is analyzed using pattern recognition analysis to identify various patterns in the motion data that correspond to known movements or motions, such as walking, running, jumping, rolling, walking stairs, falling, standing up, laying or sitting down, etc. Machine learning techniques can be employed in association with pattern analysis to enhance accuracy of pattern based motion determinations; as taught by YAN),
processing the series of vectors to generate output data for the time period (See Figs. 5-7, par. 0033, wherein motion sensor device 104 can include a housing and various electronic components encased within the housing including at least a sensor module configured to capture motion data in response to motion of the motion sensor device 104 over a period of time (e.g., sampling period). When worn by user 102, the captured motion data directly corresponds to motion of the user 102. The sensor module of motion sensor device 104 can include various motion sensors including one or more of an accelerometer, a gyroscope, a magnetometer, and/or an IMU. Thus captured motion data can include information identifying acceleration, rotation/orientation, and/or velocity of the motion sensor device 104. Motion sensor device 104 can also include a timer to relate captured motion data as a function of time; see also par. 0041, wherein motion sensor device 104 can transmit raw motion data to remote device 106 over a PAN using short range radio waves and remote device 106 can relay the raw motion data to motion analysis service provider 108 via a WAN. Upon receipt of the raw motion data, motion analysis service provider 108 can process the raw motion data to determine whether a falling motion has occurred or is occurring (e.g., using pattern recognition) for user 102; as taught by YAN),
determining a duration of activity that occurred at each of the individual locations during the time period (See Figs. 3-7, par. 0033 wherein motion sensor device 104 can include a housing and various electronic components encased within the housing including at least a sensor module configured to capture motion data in response to motion of the motion sensor device 104 over a period of time (e.g., sampling period). When worn by user 102, the captured motion data directly corresponds to motion of the user 102. The sensor module of motion sensor device 104 can include various motion sensors including one or more of an accelerometer, a gyroscope, a magnetometer, and/or an IMU. Thus captured motion data can include information identifying acceleration, rotation/orientation, and/or velocity of the motion sensor device 104. Motion sensor device 104 can also include a timer to relate captured motion data as a function of time; see also par. 0049 wherein sampling periods can be programmed/controlled at a remote device; see also par. 0047 wherein sensor module 204 can also facilitate determining a location of the motion sensing device 104. For example, sensor module 204 can include a global positioning system (GPS) client to facilitate determining its GPS location at a time when motion data is captured; as taught by YAN);
and based on the output data generated by the pattern extraction engine, sending a notification from the motion detection system for processing by an external system (See Figs. 7-9, par. 0093, wherein at 904, the system analyzes the motion data to determine a type of the motion (e.g., via analysis component 504). At 906, the system activates a notification mechanism to notify another entity that the human has fallen in response to a determination that the motion is a falling motion (e.g., using notification 404 or notification component 604); as taught by YAN).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of KRAVETS to include the teachings of YAN, wherein, by operation of a pattern extraction engine of the motion detection system, the series of vectors is processed to generate output data for the time period, a duration of activity that occurred at each of the individual locations during the time period is determined, and, based on the output data generated by the pattern extraction engine, a notification is sent from the motion detection system for processing by an external system. Such a person would have been motivated to make this combination because there is a strong need for mechanisms to reduce the amount of time associated with finding and attending to victims of accidental falling after the occurrence of an accident (see YAN, par. 0002).
As to claim 2, KRAVETS and YAN teach the limitations of claim 1. KRAVETS further teaches wherein generating the channel information comprises generating a set of channel responses that characterize communication paths between wireless transmitters and wireless receivers in a wireless communication network (see fig. 4, par. 0070, wherein higher MIVs can indicate a high level of channel perturbation (due to the motion detected), while lower MIVs can indicate lower levels of channel perturbation. Higher levels of channel perturbation may indicate motion in close proximity to the device. The MIVs may include … path MIVs (representing a degree of motion detected on particular communication paths between hardware signal paths of respective devices 402); as taught by KRAVETS).
As to claim 3, KRAVETS and YAN teach the limitations of claim 1. KRAVETS further teaches wherein generating the channel information comprises generating beamforming state information based on a wireless beamforming system (See fig. 1, par. 0022, wherein in some instances, the wireless communication devices 102 perform beamforming operations to increase network efficiency (e.g., through higher SNR) or for other purposes; as taught by KRAVETS).
As to claim 4, KRAVETS and YAN teach the limitations of claim 1. KRAVETS further teaches wherein receiving radio frequency wireless signals comprises receiving radio frequency wireless signals that are formatted according to a wireless communication standard for communication in a wireless communication network (See fig. 1, par. 0016, wherein in some aspects of what is described, motion in a space is detected based on beamforming dynamic information. Beamforming dynamic information may refer to the behavior of, or information generated or used by, wireless communication devices in performing beamforming operations over time. For example, beamforming dynamic information may include feedback or steering matrices generated by wireless communication devices communicating according to an IEEE 802.11 standard (e.g., the IEEE 802.11-2012 standard or the IEEE 802.11ac-2013 standard, which are both hereby incorporated by reference); see also pars. 0018, 0020-0025 and 0042-0044, as taught by KRAVETS).
Claims 5-8 and 9-12 recite a non-transitory computer-readable storage medium storing one or more programs, and a computer system, respectively, for performing the method of claims 1-4. Accordingly, claims 5-8 and 9-12 are rejected for substantially the same reasons as presented above for claims 1-4, and based on the references' disclosure of the necessary supporting hardware and software.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Publication Number    Filing Date    Title
US10438468B2          2018-09-05     Motion localization in a wireless mesh network based on motion indicator values
US10064014B2          2017-09-22     Detecting location within a network
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KOOROSH NEHCHIRI whose telephone number is (408)918-7643. The examiner can normally be reached M-F, 11-7 PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William L. Bashore can be reached at 571-272-4088. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KOOROSH NEHCHIRI/Examiner, Art Unit 2174
/WILLIAM L BASHORE/ Supervisory Patent Examiner, Art Unit 2174