DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This action is responsive to the Application filed on 09/08/2022.
Claims 1-17 are pending in the case. Claims 1, 4, 8 and 9 are independent claims.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 4, 9 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Chari et al. (US Pub No.: 20220204015 A1), hereinafter referred to as Chari, in view of Mezaael et al. (US Pub No.: 20160350650 A1), hereinafter referred to as Mezaael, and further in view of Doh et al. (US Pub No.: 20120166610 A1), hereinafter referred to as Doh.
With respect to claim 1, Chari discloses:
A sensor platform comprising: a memory, the memory storing instructions for generating event detection models used to detect events in captured sensor data (In paragraph [0026], Chari discloses a remote server whose memory stores executable instructions; the remote server receives sensor data and trains a machine learning program using the sensor data to determine whether the data is important or interesting (e.g., relevant to identifying objects, conditions around a vehicle, etc.).)
a sensor interface communicatively coupled to the memory, the sensor interface configured to capture data received from sensors connected to the sensor interface and to store the captured sensor data in the memory (Based on the examiner's broadest reasonable interpretation (BRI) and the lack of details in the specification, the sensor interface is interpreted as hardware, software, or a combination thereof that collects sensor data. In paragraph [0036], Chari discloses a vehicle gateway module programmed to receive/collect sensor data. The vehicle gateway module stores the sensor data in the memory.)
and one or more processors communicatively coupled to the memory, the processors configured to execute instructions stored in the memory, the instructions when executed causing the processors to: generate and train an event detection model from the instructions (In paragraph [0028], Chari discloses training the machine-learning program so that the machine-learning program determines whether sensor data satisfies at least one criterion. In paragraph [0047], Chari discloses that a processor receives instructions from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein.)
retrieve the captured sensor data from memory (In paragraph [0036], Chari discloses collecting sensor data in memory and adding extra information to it.)
apply the trained event detection model to the captured sensor data, the trained event detection model configured to detect an event from within the captured sensor data (In paragraph [0042], Chari discloses that the vehicle control module receives sensor data from the sensor or sensors to which the vehicle control module is directly connected; the module runs a machine-learning program that determines whether the sensor data satisfies at least one criterion of the program, such as recognizing a certain object or sound.)
transmit notice of the (In paragraph [0038], Chari discloses transmitting the sensor data, along with the metadata, to the remote server.)
With respect to claim 1, Chari does not explicitly disclose:
detected event
transmit captured sensor data associated with the detected event in response to a request from the remote observer for sensor data corresponding to the detected event
However, Doh discloses:
Detected events (In paragraph [0027], Doh discloses transmitting the information of a detected event.)
Chari and Doh are analogous art because both references concern detecting prespecified events. Accordingly, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Chari, which executes a machine-learning program trained to determine whether the sensor data satisfies at least one criterion, to incorporate receiving information of detected events from sensors as taught by Doh. The motivation for doing so would have been to improve the machine-learning program by training it with the collected sensor data (See [0040] of Chari).
With respect to claim 1, Chari in view of Doh do not explicitly disclose:
transmit captured sensor data associated with the detected event in response to a request from the remote observer for sensor data corresponding to the detected event
However, Mezaael discloses:
Transmit captured sensor data associated with the detected event in response to a request from the remote observer for sensor data corresponding to the detected event (In paragraph [0039], Mezaael discloses that the query can specify for the vehicle to obtain sensor data from at least one of a second vehicle and an infrastructure element. The query can request vehicle sensor data.)
Chari, Doh and Mezaael are analogous art because all three references concern detecting prespecified events. Accordingly, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Chari, which executes a machine-learning program trained to determine whether the sensor data satisfies at least one criterion, to incorporate receiving information of detected events from sensors as taught by Doh, and receiving the vehicle data in response to the query and confirming or disproving the event in the vehicle based on the vehicle data as taught by Mezaael. The motivation for doing so would have been to transmit information of a detected event by detecting information of a recognized object or of the recognized surrounding environment (See [0027] of Doh).
With respect to claim 4, Chari discloses:
A method, comprising: receiving captured sensor data at a remote location (In paragraph [0026], Chari discloses a remote server whose memory stores executable instructions; the remote server receives sensor data and trains a machine learning program using the sensor data to determine whether the data is important or interesting (e.g., relevant to identifying objects, conditions around a vehicle, etc.).)
generating and training, at the remote location, an event detection model, the trained event detection model configured to detect an event from within the captured sensor data (In paragraph [0028], Chari discloses training the machine-learning program so that the machine-learning program determines whether sensor data satisfies at least one criterion. In paragraph [0047], Chari discloses that a processor receives instructions from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein.)
applying the trained event detection model at the remote location to the captured sensor data to detect an event from within the captured sensor data (In paragraph [0042], Chari discloses that the vehicle control module receives sensor data from the sensor or sensors to which the vehicle control module is directly connected; the module runs a machine-learning program that determines whether the sensor data satisfies at least one criterion of the program, such as recognizing a certain object or sound.)
transmitting notice of the (In paragraph [0038], Chari discloses transmitting the sensor data, along with the metadata, to the remote server.)
With respect to claim 4, Chari does not explicitly disclose:
detected event
transmitting captured sensor data associated with the detected event to the remote observer in response to a request from the remote observer for some or all of the sensor data associated with to the detected event
However, Doh discloses:
Detected event (In paragraph [0027], Doh discloses transmitting the information of a detected event.)
Chari and Doh are analogous art because both references concern detecting prespecified events. Accordingly, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Chari, which executes a machine-learning program trained to determine whether the sensor data satisfies at least one criterion, to incorporate receiving information of detected events from sensors as taught by Doh. The motivation for doing so would have been to improve the machine-learning program by training it with the collected sensor data (See [0040] of Chari).
With respect to claim 4, Chari in view of Doh do not explicitly disclose:
transmitting captured sensor data associated with the detected event to the remote observer in response to a request from the remote observer for some or all of the sensor
However, Mezaael discloses:
Transmitting captured sensor data associated with the detected event to the remote observer in response to a request from the remote observer for some or all of the sensor (In paragraph [0039], Mezaael discloses that the query can specify for the vehicle to obtain sensor data from at least one of a second vehicle and an infrastructure element. The query can request vehicle sensor data.)
Chari, Doh and Mezaael are analogous art because all three references concern detecting prespecified events. Accordingly, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Chari, which executes a machine-learning program trained to determine whether the sensor data satisfies at least one criterion, to incorporate receiving information of detected events from sensors as taught by Doh, and receiving the vehicle data in response to the query and confirming or disproving the event in the vehicle based on the vehicle data as taught by Mezaael. The motivation for doing so would have been to transmit information of a detected event by detecting information of a recognized object or of the recognized surrounding environment (See [0027] of Doh).
With respect to claim 9, Chari discloses:
A sensor platform (In paragraph [0026], Chari discloses the memory of the remote server storing executable instructions, the remote server receives sensor data, trains a machine learning program using sensor data.)
An observer station remote from the sensor platform (In paragraph [0006], Chari discloses a remote server.)
A communications channel connected to the sensor platform and the observer station (In paragraph [0016], Chari discloses establishing a connection with a remote server.)
Wherein the sensor platform includes: a memory, the memory storing instructions for generating event detection models used to detect events in the captured sensor data (In paragraph [0026], Chari discloses a remote server whose memory stores executable instructions; the remote server receives sensor data and trains a machine learning program using the sensor data to determine whether the data is important or interesting (e.g., relevant to identifying objects, conditions around a vehicle, etc.).)
An interface, the interface configured to receive captured sensor data and store the captured sensor data to memory (Based on the examiner's broadest reasonable interpretation (BRI) and the lack of details in the specification, the sensor interface is interpreted as hardware, software, or a combination thereof that collects sensor data. In paragraph [0036], Chari discloses a vehicle gateway module programmed to receive/collect sensor data. The vehicle gateway module stores the sensor data in the memory.)
One or more processors communicatively coupled to the memory, the processors configured to execute instructions stored in the memory, the instructions when executed causing the one or more processors to: generate and train an event detection model from the instructions; retrieve the captured sensor data from memory (In paragraph [0028], Chari discloses training the machine-learning program so that the machine-learning program determines whether sensor data satisfies at least one criterion. In paragraph [0047], Chari discloses that a processor receives instructions from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein.)
Apply the trained event detection model to the captured sensor data, the trained event detection model configured to detect an event from within the captured sensor data (In paragraph [0042], Chari discloses that the vehicle control module receives sensor data from the sensor or sensors to which the vehicle control module is directly connected; the module runs a machine-learning program that determines whether the sensor data satisfies at least one criterion of the program, such as recognizing a certain object or sound.)
Transmit notice of the (In paragraph [0038], Chari discloses transmitting the sensor data, along with the metadata, to the remote server.)
With respect to claim 9, Chari does not explicitly disclose:
detected event
transmit captured sensor data associated with the detected event to the remote observer in response to a request from the remote observer for some or all of the sensor data associated with to the detected event
However, Doh discloses:
Detected event (In paragraph [0027], Doh discloses transmitting the information of a detected event.)
Chari and Doh are analogous art because both references concern detecting prespecified events. Accordingly, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Chari, which executes a machine-learning program trained to determine whether the sensor data satisfies at least one criterion, to incorporate receiving information of detected events from sensors as taught by Doh. The motivation for doing so would have been to improve the machine-learning program by training it with the collected sensor data (See [0040] of Chari).
With respect to claim 9, Chari in view of Doh do not explicitly disclose:
Transmit captured sensor data associated with the detected event to the remote observer in response to a request from the remote observer for some or all of the sensor data associated with to the detected event
However, Mezaael discloses:
Transmit captured sensor data associated with the detected event to the remote observer in response to a request from the remote observer for some or all of the sensor data associated with to the detected event (In paragraph [0039], Mezaael discloses that the query can specify for the vehicle to obtain sensor data from at least one of a second vehicle and an infrastructure element. The query can request vehicle sensor data.)
Chari, Doh and Mezaael are analogous art because all three references concern detecting prespecified events. Accordingly, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Chari, which executes a machine-learning program trained to determine whether the sensor data satisfies at least one criterion, to incorporate receiving information of detected events from sensors as taught by Doh, and receiving the vehicle data in response to the query and confirming or disproving the event in the vehicle based on the vehicle data as taught by Mezaael. The motivation for doing so would have been to transmit information of a detected event by detecting information of a recognized object or of the recognized surrounding environment (See [0027] of Doh).
Regarding claim 12, Chari in view of Doh and Mezaael disclose the elements of claim 9. In addition, Mezaael discloses:
The system of claim 9, wherein the observer station comprises: a memory, the memory storing instructions for generating event detection models used to detect events in the captured sensor data (In paragraph [0006], Mezaael discloses a processor and a memory, the memory storing instructions executable by the processor to receive a first message from a vehicle specifying an event in the vehicle.)
one or more processors communicatively coupled to the memory, the processors configured to execute instructions stored in the memory, the instructions when executed causing the one or more processors to: receive the notices of detected events from the sensor platform (In paragraph [0041], Mezaael discloses receiving vehicle data in response to the query and confirm or disprove the event in the vehicle based on the vehicle data.)
Request sensor data corresponding to one or more of the detected events (In paragraph [0051], Mezaael discloses data specifying an event in a vehicle and/or other data, such as sensor data from the vehicle, and outputting one or more parameters for a query to the vehicle.)
Claims 2, 5 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Chari, in view of Mezaael, Doh and further in view of Johnson et al. (US Patent No.: 9,467,663 B2), hereinafter referred to as Johnson.
Regarding claim 2, Chari in view of Doh and Mezaael disclose elements of claim 1. Chari in view of Doh and Mezaael do not explicitly disclose:
The sensor platform of claim 1, wherein the instructions that when executed cause the processors to transmit notice of the detected event to a remote observer further include instructions that when executed cause the processors to associate portions of the captured sensor data with the detected event and to transmit a lower resolution version of the portions of captured sensor data associated the detected event to the remote observer
However, Johnson discloses the limitation (In Fig. 2 and Col. 5, lines 11-15, Johnson discloses that the video is first converted to a low-quality/low-resolution version; the low quality level allows the video data to be sent using minimal bandwidth and is sufficient when nothing significant is happening.)
Accordingly, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, having the teaching of Chari in view of Doh and Mezaael, to include Johnson's transmission of low-quality/low-resolution video. The motivation for doing so would have been to conserve bandwidth by changing the video into a lower quality feed to send to the remote site (See Col. 1, lines 36-41 of Johnson).
Regarding claim 5, Chari in view of Doh and Mezaael disclose elements of claim 4. Chari in view of Doh and Mezaael do not explicitly disclose:
The method of claim 4, wherein transmitting notice of the detected event to a remote observer includes associating portions of the captured sensor data with the detected event and transmitting a lower resolution version of the portions of captured sensor data associated the detected event to the remote observer
However, Johnson discloses the limitation (In Fig. 2 and Col. 5, lines 11-15, Johnson discloses that the video is first converted to a low-quality/low-resolution version; the low quality level allows the video data to be sent using minimal bandwidth and is sufficient when nothing significant is happening.)
Accordingly, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, having the teaching of Chari in view of Doh and Mezaael, to include Johnson's transmission of low-quality/low-resolution video. The motivation for doing so would have been to conserve bandwidth by changing the video into a lower quality feed to send to the remote site (See Col. 1, lines 36-41 of Johnson).
Regarding claim 10, Chari in view of Doh and Mezaael disclose elements of claim 9. Chari in view of Doh and Mezaael do not explicitly disclose:
The system of claim 9, wherein the instructions that when executed cause the one or more processors to transmit notice of the detected event to a remote observer further include instructions that when executed cause the processors to associate portions of the captured sensor data with the detected event and to transmit a lower resolution version of the portions of captured sensor data associated the detected event to the remote observer with the notice
However, Johnson discloses the limitation (In Fig. 2 and Col. 5, lines 11-15, Johnson discloses that the video is first converted to a low-quality/low-resolution version; the low quality level allows the video data to be sent using minimal bandwidth and is sufficient when nothing significant is happening.)
Accordingly, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, having the teaching of Chari in view of Doh and Mezaael, to include Johnson's transmission of low-quality/low-resolution video. The motivation for doing so would have been to conserve bandwidth by changing the video into a lower quality feed to send to the remote site (See Col. 1, lines 36-41 of Johnson).
Claims 3, 6 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Chari, in view of Mezaael, Doh and further in view of NGUYEN et al. (WO 2020236131 A1), hereinafter referred to as NGUYEN.
Regarding claim 3, Chari in view of Doh and Mezaael disclose elements of claim 1. Chari in view of Doh and Mezaael do not explicitly disclose:
The sensor platform of claim 1, wherein the processor is further configured to execute instructions stored in the memory that, when executed, cause the processors to: determine that one of the event detection models needs retraining
retrain the event detection model
However, NGUYEN discloses the limitations:
The sensor platform of claim 1, wherein the processor is further configured to execute instructions stored in the memory that, when executed, cause the processors to: determine that one of the event detection models needs retraining (In paragraph [0046], NGUYEN discloses that the event detection system also provides the sensor measurements and the events to the monitoring system. The monitoring system 504 determines whether to retrain the machine learning model of the event detection system.)
Retrain the event detection model (In paragraph [0046], NGUYEN discloses that event detection performance may be improved by retraining the machine learning model based on sensor measurements acquired in the environment in which the sensors operate.)
Accordingly, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, having the teaching of Chari in view of Doh and Mezaael, to include NGUYEN's determining whether to retrain the machine learning model of the event detection system. The motivation for doing so would have been to improve event detection performance by retraining the machine learning model based on sensor measurements acquired in the environment in which the sensors operate (See [0046] of NGUYEN).
Regarding claim 6, Chari in view of Doh and Mezaael disclose elements of claim 4. Chari in view of Doh and Mezaael do not explicitly disclose:
The method of claim 4, wherein the method further comprises: determine that one of the event detection models needs retraining
retrain the event detection model
However, NGUYEN discloses the limitations:
The method of claim 4, wherein the method further comprises: determine that one of the event detection models needs retraining (In paragraph [0046], NGUYEN discloses that the event detection system also provides the sensor measurements and the events to the monitoring system. The monitoring system 504 determines whether to retrain the machine learning model of the event detection system.)
retrain the event detection model (In paragraph [0046], NGUYEN discloses that event detection performance may be improved by retraining the machine learning model based on sensor measurements acquired in the environment in which the sensors operate.)
Accordingly, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, having the teaching of Chari in view of Doh and Mezaael, to include NGUYEN's determining whether to retrain the machine learning model of the event detection system. The motivation for doing so would have been to improve event detection performance by retraining the machine learning model based on sensor measurements acquired in the environment in which the sensors operate (See [0046] of NGUYEN).
Regarding claim 11, Chari in view of Doh and Mezaael disclose elements of claim 9. Chari in view of Doh and Mezaael do not explicitly disclose:
The system of claim 9, wherein the one or more of the processors are further configured to execute instructions stored in the memory that, when executed, cause the processors to: determine that one of the event detection models needs retraining
retrain the event detection model
However, NGUYEN discloses the limitations:
The system of claim 9, wherein the one or more of the processors are further configured to execute instructions stored in the memory that, when executed, cause the processors to: determine that one of the event detection models needs retraining (In paragraph [0046], NGUYEN discloses that the event detection system also provides the sensor measurements and the events to the monitoring system. The monitoring system 504 determines whether to retrain the machine learning model of the event detection system.)
retrain the event detection model (In paragraph [0046], NGUYEN discloses that event detection performance may be improved by retraining the machine learning model based on sensor measurements acquired in the environment in which the sensors operate.)
Accordingly, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, having the teaching of Chari in view of Doh and Mezaael, to include NGUYEN's determining whether to retrain the machine learning model of the event detection system. The motivation for doing so would have been to improve event detection performance by retraining the machine learning model based on sensor measurements acquired in the environment in which the sensors operate (See [0046] of NGUYEN).
Claims 13-17 are rejected under 35 U.S.C. 103 as being unpatentable over Chari, in view of Mezaael, Doh and further in view of ElHattab et al. (US Patent No.: 11,494,921 B2), hereinafter referred to as ElHattab.
Regarding claim 13, Chari in view of Doh and Mezaael disclose elements of claim 12. Chari in view of Doh and Mezaael do not explicitly disclose:
The system of claim 12, wherein the observer station further comprises a user interface, the user interface configured to receive the notices of detected events and to select one or more of the detected events for review of the sensor data corresponding to the event
However, ElHattab discloses the limitation (In Col. 10, lines 13-19, ElHattab discloses that, upon receiving the multiplexed sensor data, the server system 108 performs further analysis of the sensor data and, in some embodiments, may perform operations that include presenting a notification at a client device and causing display of a graphical user interface that comprises a presentation of visualizations based on the sensor data generated by the sensor devices.)
Accordingly, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, having the teaching of Chari in view of Doh and Mezaael, to include ElHattab's detection of events based on sensor data collected at one or more sensor devices. The motivation for doing so would have been to increase the data rate and resolution at which sensor data is generated and accessed by the event detection system (See Col. 8, lines 31-35 of ElHattab).
Regarding claim 14, Chari in view of Doh and Mezaael disclose elements of claim 13. Chari in view of Doh and Mezaael do not explicitly disclose:
The system of claim 13, wherein the user interface is further configured to notify the sensor platform of the selected events
However, ElHattab discloses the limitation (In Col. 10, lines 13-19, ElHattab discloses performing operations that include presenting a notification to a client device and causing the display of a graphical user interface that comprises a presentation of visualizations based on the sensor data generated by the sensor devices.)
Accordingly, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, having the teaching of Chari in view of Doh and Mezaael, to include ElHattab's detection of events based on sensor data collected at one or more sensor devices. The motivation for doing so would have been to increase the data rate and resolution at which sensor data is generated and accessed by the event detection system (See Col. 8, lines 31-35 of ElHattab).
Regarding claim 15, Chari in view of Doh, Mezaael and ElHattab disclose elements of claim 13. In addition, Chari discloses:
The system of claim 13, wherein the observer station further comprises a model tracker, the model tracker configured to enable a user to detect false positives in detected events and to notify the sensor platform of the false positives (In paragraph [0037], Chari discloses that if a connection is not established with the remote server 112, the process 300 returns to decision block 315 to continue monitoring for prespecified events and additional sensor data while waiting for the connection to be established.)
Regarding claim 16, Chari in view of Doh, Mezaael and ElHattab disclose elements of claim 13. In addition, Chari discloses:
The system of claim 13, wherein the observer station further comprises an event prototype, wherein the event prototype is configured to enable a user to prototype new event detection models (In paragraph [0032], Chari discloses executing a second machine-learning program trained to determine whether the second sensor data satisfies at least one second criterion, and transmitting the second sensor data satisfying the at least one second criterion to the vehicle gateway module.)
Regarding claim 17, Chari in view of Doh and Mezaael disclose elements of claim 13. Chari in view of Doh and Mezaael do not explicitly disclose:
The system of claim 13, wherein the observer station further comprises a model tracker, the model tracker configured to: identify new types of interesting events in captured sensor data
and label relevant time intervals as an example of the event
wherein the sensor platform further comprises an event modeling application, the event modeling application configured to receive the labeled time intervals from the observer station and to train a new detection model based on the captured sensor data from the labeled time intervals
However, ElHattab discloses the limitations:
The system of claim 13, wherein the observer station further comprises a model tracker, the model tracker configured to: identify new types of interesting events in captured sensor data (In Col. 6, line 59 through Col. 7, line 3, ElHattab discloses that when the sensor data module 202 finds an event or signs of an event in the sensor data, the object model module 204 looks at an object model that matches the detected event or its signs.)
Label relevant time intervals as an example of the event (In Col. 8, lines 51-57, ElHattab discloses that the event detection system may be configured to cause the gateway to stream snapshots of sensor data from the sensor devices at a predefined interval or data rate.)
wherein the sensor platform further comprises an event modeling application, the event modeling application configured to receive the labeled time intervals from the observer station and to train a new detection model based on the captured sensor data from the labeled time intervals (In Col. 8, lines 41-49, ElHattab discloses detecting a feature within a frame of a video stream from one or more of the sensor devices 102. In some embodiments, the sensor data module 202 may include a neural network or machine-learned model trained to recognize certain features that correspond to events and precursors to events.)
Accordingly, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, having the teaching of Chari in view of Doh and Mezaael, to include ElHattab's detection of events based on sensor data collected at one or more sensor devices. The motivation for doing so would have been to increase the data rate and resolution at which sensor data is generated and accessed by the event detection system (See Col. 8, lines 31-35 of ElHattab).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to EVEL HONORE whose telephone number is (703)756-1179. The examiner can normally be reached Monday-Friday, 8 a.m. to 5:30 p.m.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Mariela D Reyes can be reached at (571) 270-1006. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
EVEL HONORE
Examiner
Art Unit 2142
/HAIMEI JIANG/Primary Examiner, Art Unit 2142