DETAILED ACTION
A. This action is in response to the following communications: Transmittal of New Application filed 12/01/2023.
B. Claims 1-15 remain pending.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-12 and 14-15 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
The claim recites generating a command depending on an acquired audio signal, the generating of the command being triggered depending on a type of audio signal acquired.
The limitation of generating a command, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components. That is, other than reciting "by a processor," nothing in the claim element precludes the step from practically being performed in the mind. For example, but for the "by a processor" language, "generating a command" in the context of this claim encompasses the user mentally forming a command based on the type of audio heard.
If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.
This judicial exception is not integrated into a practical application. In particular, the claim recites only one additional element – using a processor to perform the generating and transmitting steps. The processor in both steps is recited at a high level of generality (i.e., as a generic processor performing the generic computer functions of generating and transmitting a command) such that it amounts to no more than mere instructions to apply the exception using a generic computer component.
Accordingly, this additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using a processor to perform both the generating and transmitting steps amounts to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept.
The claim is not patent eligible.
Claims 2-12 and 14-15 do not include elements that amount to significantly more than the abstract idea and are also rejected under the same rationale.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Marten, Neil (US Pub. 2016/0334811 A1), herein referred to as "Marten".
As for claims 1, 11 and 14, Marten teaches a control method (and the corresponding device of claim 11 and system of claim 14) for automatically controlling a household apparatus position, the control method being implemented by a control device and comprising (par. 3: a home automation rule based on the identified outdoor weather event, where the home automation rule includes an operational setting of a home automation device):
Specifically for claim 11: at least one processor; and
at least one non-transitory computer readable medium comprising instructions stored thereon which when executed by the at least one processor configure the automatic controller device to (par. 7 The instructions may be executable by one or more processors for determining a home automation rule based on the identified outdoor weather event, where the home automation rule includes an operational setting of a home automation device):
Specifically for claim 14: an audio sensor co-located with a household apparatus;
an actuator of the household apparatus connected to modify a position of the household apparatus; and an automatic controller device comprising at least one processor configured to (par. 8 the instructions of the computer-readable medium may be executable by one or more processors for receiving audio data from a microphone in operative communication with the host system, where the audio data corresponds to the received image data, and analyzing the image data with the audio data to identify the outdoor event):
generating a command depending on an acquired audio signal, the generating of the command being triggered depending on a type of audio signal acquired (par. 101 receiving audio from microphone, the method 400 may include receiving audio data at the host system from a microphone in operative communication with the host system, where the audio data corresponds to the received image data); and
transmitting the generated command to an actuator of the household apparatus, the generated command being able to trigger control by the actuator of the household apparatus, the actuator moving the household apparatus into a position that is dependent on the command (par. 101 sending the audio to the host system: the audio data and the image data may be taken at approximately a same time, and/or during the same predetermined time intervals. In another aspect, the microphone may be provided by the camera, such that the image data and the audio data are sent to the host system together as a video stream. Other examples are possible).
Examiner notes that paragraph 112 describes utilizing only audio to make a determination of a weather event in order to activate a home automation. Paragraph 117 notes that home automations include opening and/or closing windows, among other home automations.
[0112]
As further described herein, the method 400 may provide for weather detection via an analysis of sound, which may identify and/or confirm a particular weather event. For instance, such sound data may be detected by a microphone from a security camera, cell phone camera, and/or any other camera in operative communication with the host system. The microphone may capture sounds of rain, hail, high wind, and so on, which may be analyzed by the host system to confirm other possible triggers and/or weather events. In another aspect, the host system may provide an audio decibel comparison to confirm or determine various characteristics of weather events. For example, the host system may determine a general direction and/or position of a weather event, such as whether a storm is approaching the home and/or moving further away from the home. In another aspect, the host system may determine, based on the sound data, whether the storm is increasing in intensity, decreasing in intensity, and/or has already passed.
[0117]
In another example, the home automation rules may include rules that trigger various devices to operate and/or respond to various weather events. Such rules may be user-defined and/or modified further via the mobile application. Merely by way of example, the home automation rules may include: closing a garage door upon detection of a weather event indicating rain; opening and/or closing automated doors, windows, and/or skylights based on detection of rain, sun, and/or temperature; closing automated shades on a west side of the house if the detected weather event indicates a hot and sunny afternoon; delaying an automated watering system if the weather event indicates rain that lasts for longer than a predetermined period of time, such as two minutes; turning on outdoor and/or indoor lights based on a detected level of natural light; adjusting a thermostat for heating and/or air conditioning systems, for instance, to prevent pipes from freezing when the detected weather event indicates extreme temperatures and weather; and so on. Further, the host system may provide weather triggers that are activated when certain descriptive tags and/or weather events are detected, such as rain, snow, hail, high wind, and/or any other user-defined situation. Such weather triggers may include or call out specific home automation rules associated with a start of a weather event, and/or an end of the weather event. Such weather triggers may further include a minimum amount of time or duration for the event, and/or a minimum threshold amount of precipitation. Other examples are possible.
As for claim 2, Marten teaches the control method as claimed in claim 1, the control method comprising audio-signal analysis of the acquired audio signal, the generated command being dependent on a result of the analysis of the acquired audio signal (par. 102: the method 400 may include analyzing the received data for one or more weather events (step 404). For example, the method may include analyzing the image data and/or audio data to identify an outdoor weather event, such as various forms of precipitation, cloud coverage, wind, level of sunshine, and/or temperature data).
As for claim 3, Marten teaches the control method as claimed in claim 1, the control method comprising audio-signal recognition of the acquired audio signal, the generated command being dependent on the recognized audio signal (par. 102: identify an outdoor weather event, such as various forms of precipitation, cloud coverage, wind, level of sunshine, and/or temperature data).
As for claim 4, Marten teaches the control method as claimed in claim 1, the control method comprising a location of an audio source of the acquired audio signal, the generated command being dependent on the location of the audio source of the acquired audio signal with respect to the position of the household apparatus (par. 103: the method 400 may include receiving a third-party weather forecast that includes current conditions for a geographic location of the host system, such as a zip-code of a home having the host system, and/or determining whether the identified outdoor weather event is consistent with the third-party weather forecast; par. 101: includes sensors at various locations (inside/outside)).
As for claim 5, Marten teaches the control method as claimed in claim 1, the control method comprising prediction of how the acquired audio signal will vary after a time at which the audio signal is acquired (par. 101: the image data includes a moving image, such as a video file, that is captured over a predefined length of time by the camera at predetermined time intervals. In still other examples, the method 400 may include receiving audio data at the host system from a microphone in operative communication with the host system, where the audio data corresponds to the received image data. For instance, the audio data and the image data may be taken at approximately a same time, and/or during the same predetermined time intervals. In another aspect, the microphone may be provided by the camera, such that the image data and the audio data are sent to the host system together as a video stream).
As for claim 6, Marten teaches the control method as claimed in claim 5, wherein the generated command is a command set depending on at least one predicted parameter of the acquired audio signal (par. 107: It is noted that the comparison of baseline images with incoming image data may contribute toward the determination of weather events and/or trigger additional levels of scrutiny and comparison. The method 400 may be utilized in conjunction with other weather-prediction and/or confirmation features, as described further within the description).
As for claim 7, Marten teaches the control method as claimed in claim 1, the control method comprising detecting a context of a modification of the position of the household apparatus depending on the acquired audio signal, wherein the detecting of the context triggers the command generation (par. 117: automatic opening/closing of windows among other household apparatuses).
As for claim 8, Marten teaches the control method as claimed in claim 7, wherein detection of the context is dependent on a criterion relating to the acquired audio signal from among the following criteria: audio signal level of the acquired audio signal, the type of audio signal acquired, predicted duration of the acquired audio signal, predicted frequency of appearance of the acquired audio signal (par. 112: detection of audio and type of audio, including frequencies compared against a database, to make a prediction of weather events upon which home automation can be activated).
As for claim 9, Marten teaches the control method as claimed in claim 7, wherein detection of the context is dependent on at least one preconfigured control parameter (par. 116: based upon the detected context, the user on a mobile device can change a parameter of a household apparatus to open/close, as further mentioned in par. 117).
As for claim 10, Marten teaches a non-transitory computer readable medium comprising a program stored thereon comprising program code instructions for executing the control method as claimed in claim 1 when said program is executed by a processor of the control device (par. 127: non-transitory storage devices of the computer system used to carry out the invention).
As for claim 12, Marten teaches the automatic controller device as claimed in claim 11, wherein the instructions further configure the device to generate a plurality of commands depending on the at least one acquired audio signal, each generated command being able to trigger control by a separate actuator of a separate household apparatus (par. 117: the home automation rules may include rules that trigger various devices to operate and/or respond to various weather events. Such rules may be user-defined and/or modified further via the mobile application).
As for claim 13, Marten teaches the automatic controller as claimed in claim 12, wherein a household apparatus of the at least one household apparatus is an apparatus from among the following: an openable device; a noise reducer; an echo-cancelling device (par. 117: opening/closing door/window/etc.; par. 120: The method 500 may compare the visual data with the baseline image(s) to determine one or more weather events (step 508). In another aspect, the method may analyze the audial data, for instance, by determining a sound profile and matching the sound profile to known sound profiles for various weather events (step 510). Such sound profiles may include rain, sleet, hail, wind, crickets, and/or other sounds that may indicate a weather condition. The method 500 may include determining one or more weather events based on the analyzed audial data).
As for claim 15, Marten teaches the control system as claimed in claim 1, the control system comprising a plurality of audio sensors co-located with separate household apparatuses, the automatic controller being configured to generate a command for controlling the household apparatus co-located with the audio sensor that acquired the acquired audio signal (par. 117: automatic opening/closing of windows among other household apparatuses).
Note: It is noted that any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Asset System Control Arrangement And Method (US 20050046584 A1, published 2005-03-03)
Abstract: System and method for wirelessly controlling systems in an asset, such as a house or trailer, in which a movable device, such as a PDA, cellular telephone or vehicle, includes a transmitter arranged to transmit signals, and a control unit is arranged on or in connection with the asset and includes a receiver which communicates with the transmitter and a processor coupled to the receiver and which generates different command signals based on signals generated by the transmitter and received by the receiver. Each system is arranged on or in connection with the asset and coupled to the control unit and is responsive to command signals from the processor to perform a function relating to or affecting the asset.
HOME AUTOMATION WEATHER DETECTION (US 20160334811 A1, published 2016-11-17)
Abstract: Systems and methods for controlling a device in a home automation network based on detection of a weather event include receiving image data from a camera in operative communication with a host system, where the image data is representative of an outdoor weather event that is captured by the camera, and analyzing the image data to identify the outdoor weather event. Systems and methods may include determining a home automation rule based on the identified outdoor weather event, where the home automation rule includes an operational setting of a home automation device, and instructing the home automation device based on the determined home automation rule via a home automation network.
Inquiries
Any inquiry concerning this communication should be directed to NICHOLAS AUGUSTINE at telephone number (571)270-1056.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
/NICHOLAS AUGUSTINE/Primary Examiner, Art Unit 2178 January 27, 2026