DETAILED ACTION
The following is a Final Office Action in response to the Amendment/Remarks received on 23 October 2025. Claims 1, 15, 18, and 21 have been amended. Claim 12 was previously cancelled. Claims 1-11 and 13-21 remain pending in this application.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments, see Remarks, pgs. 9-12, filed 23 October 2025 with respect to rejected claims 1-11 and 13-21 under 35 U.S.C. 103 have been fully considered but they are not persuasive.
With respect to the Applicant’s arguments,
Obviousness requires that all claim features are taught or suggested by the combination of cited references. Applicant respectfully submits that the combination of cited references fails to teach or suggest at least the bolded claim features. (see Remarks, pg. 10, paragraph 2)
Saxena does not remedy the shortcomings of Cardinal, Bangalore, and Kohler with respect to amended claim 1. (see Remarks, pg. 10, paragraph 4)
Applicant further submits that the feature of “receiving, by the processing device from at least one imaging sensor, image data associated with one or more of the plurality of irrigation zones”, as recited in amended claim 1, is not taught or suggested by any of the cited references. Furthermore, the cited references are silent regarding the features of “user-related event data is associated with a user activity in at least one of the plurality of irrigation zones, and wherein the user-related event data comprises image data captured by the at least one imaging sensor”, as recited in amended claim 1. (see Remarks, pg. 11, paragraph 2)
The Examiner respectfully disagrees.
The Examiner emphasizes that all claimed components and limitations of the pending claims are present in the prior art, as supported below. In addition, the Examiner notes that the limitations of “the one or more sensors comprising at least one of a rainfall sensor, a humidity sensor, or a soil moisture sensor”; “receiving, by the processing device from at least one imaging sensor, image data associated with one or more of the plurality of irrigation zones”; “… wherein the user-related event data is associated with a user activity in at least one of the plurality of irrigation zones, and wherein the user-related event data comprises at least a portion of the image data captured by the at least one imaging sensor”; and “… generate an updated irrigation schedule based on weighing, using one or more conflict rules, the sensor data, the user-related event data, and the environmental data to determine a hierarchy of the sensor data, the user-related event data, and the environmental data” were newly presented in the Amendment After Non-Final received by the Office on 23 October 2025, and have been addressed as set forth in the Office Action below.
With respect to the Applicant’s argument,
However, Saxena’s conflict resolution relates to conflicting automation rules for controllable devices, not to “weighing, using one or more conflict rules, the sensor data, the user-related event data, and the environmental data to determine a hierarchy of the sensor data, the user-related event data, and the environmental data” for irrigation scheduling, as taught in amended claim 1. Saxena’s system “correlates identified states with corresponding controllable device states” but does not teach applying conflict resolution rules to establish a hierarchy among sensor data, user-related event data, and environmental data. (Saxena, [0048]) (see Remarks, pg. 11, paragraph 1)
The Examiner respectfully disagrees.
U.S. Patent Publication No. 2025/0081906 A1 (instant application) discloses:
The server 140 may include conflict resolution rules that are applied and that weigh different events and sensor data 162 to accommodate different preferences. For instance, a detected distress of grass or other plants may outweigh a detected user event that is occurring in the area (e.g., an unscheduled gathering) and still cause the irrigation control system 100 to trigger or allow irrigation on the related area (e.g., irrigation zone). Likewise, an anticipated rainstorm that is being forecast can cause a continued pause in irrigation, despite a user-directed input that scheduled the sprinklers to run on a certain day (e.g., the day before the forecast rainstorm). (pg. 4, par. [0048])
U.S. Patent Publication No. 2016/0248847 A1 (Saxena) teaches:
At 402, identified states are correlated with corresponding controllable device states. For example, a state that has been identified as being the actual state of a subject in 310 of FIG. 3 is correlated with a corresponding status, a configuration, a functional state, a parameter, and/or any other data of a controllable device of devices 102 of FIG. 1A. In some embodiments, the identified states are correlated with corresponding controllable device states by analyzing a stored history of identified states and controllable device states to identify controllable device states that correspond with identified states. For example, corresponding pairings between an identified state (e.g., state vector) and a corresponding controllable state (e.g., status, configurations, functional states, parameters, and/or any other data) of a controllable device are determined. In some embodiments, correlating the identified states with corresponding controllable device states includes performing machine learning to discover correlations. For example, statistical and/or deep learning techniques are utilized to discover temporal correlations between identified states and controllable device states. In some embodiments, the identified states include state vectors that may include one or more of the following: a time value, a weather forecast, a date value, and other data associated with time and/or environment conditions. In some embodiments, a historical probability that an identified state corresponds to a specific controllable device state is determined. (pg. 6, par. [0048])
At 504, it is determined whether the automation rule conflicts with another automation rule to be active at the same time. For example, two or more automation rules are be activated because the identified state triggers these automation rules, but the rules specify conflicting controllable device states that cannot be all implemented at the same time (e.g., one rule specifies an “on” state while another rule specifies an “off” state). In some embodiments, the conflict is resolved. In some embodiments, the each automation rule is associated with a priority value that specifies a priority ordering in the event of a conflict. In some embodiments, the priority value of an automation rule is increased as the automation rule exists in a rule database for a longer period of time. For example, automation rules are dynamically updated and a rule that is renewed in the database is given priority over new rules because older rules have been validated over time. This may be implemented by increasing the priority of a rule every time the rule is revalidated and/or after a period of time passes since being included in a rule database. In some embodiments, a feedback received from a user is utilized to update a priority value of a rule. For example, the priority value is decreased in the event a user modifies a controllable device state to undo a modification to the controllable state caused by the rule being activated. In another example, the priority value is increased in the event a controllable device state is not modified after a modification to the controllable state caused by the rule being activated. In another example, a user may confirm via a user indication whether an activated automation rule is correct. 
For example, using a user device, a user may specify whether a recently activated automation rule has resulted in a correct automation behavior desired by the user (e.g., a positive indication increases priority value of the rule while a negative indication decreases priority value of the rule). In some embodiments, if the priority value of a rule is below a threshold, the rule is not eligible to be activated. (pgs. 5-6, par. [0053])
At 506, a conflict resolved triggered automation rule, if any, is activated. In some embodiments, for a group of conflicting automation rules, the automation rule with the highest priority value is activated. In some embodiments, activating an automation rule includes modifying state(s) of one or more controllable devices to be the state(s) specified by the activated automation rule. A feedback received from a user regarding the activation may be utilized to modify future activation of the rule. (pg. 6, par. [0054])
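Saxena’s mechanism as quoted above, in which each automation rule carries a priority value, conflicting triggered rules are resolved in favor of the highest-priority eligible rule, and user feedback raises or lowers a rule’s priority, can be summarized in the following sketch. This is an illustrative sketch only; the names, values, threshold, and feedback step sizes are hypothetical and are not drawn from Saxena or from the record.

```python
# Illustrative sketch only (hypothetical names and values, not code from
# Saxena): each automation rule carries a priority value; among conflicting
# triggered rules, the highest-priority eligible rule is activated
# (cf. Saxena [0053]-[0054]); user feedback adjusts a rule's priority.
from dataclasses import dataclass

@dataclass
class AutomationRule:
    name: str
    device_state: str   # e.g. "on" vs. "off" (the conflicting states)
    priority: float     # priority ordering value per Saxena [0053]

ACTIVATION_THRESHOLD = 0.0  # rules below this priority are not eligible

def resolve_conflict(triggered_rules):
    """Return the eligible triggered rule with the highest priority, if any."""
    eligible = [r for r in triggered_rules if r.priority >= ACTIVATION_THRESHOLD]
    return max(eligible, key=lambda r: r.priority, default=None)

def apply_feedback(rule, user_undid_change):
    """Lower priority if the user undid the rule's effect; otherwise raise it."""
    rule.priority += -1.0 if user_undid_change else 1.0

# Two rules triggered at once specify conflicting device states:
rule_on = AutomationRule("sprinkler_on", "on", priority=2.0)
rule_off = AutomationRule("sprinkler_off", "off", priority=5.0)
print(resolve_conflict([rule_on, rule_off]).name)  # sprinkler_off
```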
The claimed limitation of “… weighing, using one or more conflict rules, the sensor data, the user-related event data, and the environmental data to determine a hierarchy of the sensor data, the user-related event data, and the environmental data” has been interpreted, under the broadest reasonable interpretation in light of the specification, as determining a priority (i.e., weighing), using one or more conflict resolution rules, to determine an order (i.e., a hierarchy) of the data (i.e., the sensor data, the user-related event data, and the environmental data). Hence, this newly presented limitation is taught by Saxena’s teaching of prioritizing data to specify an order using conflict resolution rules, in combination with U.S. Patent Publication No. 2004/0181315 A1 (Cardinal), which teaches environmental data and one or more sensors (i.e., rain and soil moisture sensors) used to update an irrigation schedule; U.S. Patent Publication No. 2018/0039243 A1 (Bangalore), which teaches an irrigation schedule comprising a corresponding time duration for irrigating each irrigation zone of a plurality of irrigation zones and updating the irrigation schedule; and U.S. Patent Publication No. 2017/0112079 A1 (Eyring), which teaches user-related event data associated with a user activity and comprising image data captured by an imaging sensor, as set forth below. Hence, the Applicant’s argument is found unpersuasive.
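The interpretation above, in which conflict rules weigh the sensor data, the user-related event data, and the environmental data, and the resulting ordering serves as the hierarchy, can be illustrated with the following sketch. All names and weight values are hypothetical and not from the record; the weights merely mirror the instant application’s [0048] scenario, in which detected plant distress outweighs a detected user event.

```python
# Illustrative sketch only (hypothetical names and weights, not from the
# record): each conflict rule assigns a weight to one category of data;
# sorting the categories by weight yields the "hierarchy" that would
# drive the irrigation schedule update (cf. instant application [0048]).

def determine_hierarchy(conflict_rules, data_sources):
    """Weigh each data source with its conflict rule and order the result."""
    weighted = {src: conflict_rules[src](val) for src, val in data_sources.items()}
    # Highest weight first: this ordering is the hierarchy of the data.
    return sorted(weighted, key=weighted.get, reverse=True)

conflict_rules = {
    "sensor_data": lambda d: 3.0 if d.get("plant_distress") else 1.0,
    "environmental_data": lambda d: 2.5 if d.get("rain_forecast") else 0.5,
    "user_event_data": lambda d: 2.0 if d.get("gathering") else 0.0,
}

data_sources = {
    "sensor_data": {"plant_distress": True},   # distressed grass detected
    "environmental_data": {"rain_forecast": False},
    "user_event_data": {"gathering": True},    # unscheduled gathering seen
}

hierarchy = determine_hierarchy(conflict_rules, data_sources)
print(hierarchy)  # ['sensor_data', 'user_event_data', 'environmental_data']
```

In this sketch, detected plant distress (sensor data) outranks the detected gathering (user-related event data), so irrigation of the affected zone would still be triggered, consistent with the [0048] example quoted above.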
With respect to the Applicant’s argument,
Furthermore, neither Saxena nor the other cited references teach “wherein at least one of the sensor data … comprises image data associated with at least one of vegetation in a respective irrigation zone …”, as recited in amended claim 1. (see Remarks, pg. 11, paragraph 1)
The Examiner respectfully disagrees.
In response to the Applicant’s argument that the prior art references fail to show certain features of the invention, it is noted that the features upon which Applicant relies (i.e., “wherein at least one of the sensor data … comprises image data associated with at least one of vegetation in a respective irrigation zone …”) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).
With respect to the Applicant’s argument,
Gungl’s image data is used to analyze the growth conditions of a given parcel of land, which is different from utilizing an image sensor to identify “an unscheduled activity, wherein the user-related event data is associated with a user activity in at least one of the plurality of irrigation zones, and wherein the user-related event data comprises at least a portion of the image data captured by the at least one imaging sensor”, as recited in amended claim 1. (see Remarks, pg. 11, paragraph 3)
The Examiner respectfully disagrees.
The Examiner respectfully notes that the Applicant’s argument is directed to prior art (U.S. 11,844,315 B2 – Gungl) that was cited by the Examiner to teach a proposed limitation directed to image data associated with vegetation in a zone, as presented in the agenda for the Applicant-Initiated Interview on 18 September 2025. The prior art of Gungl was not cited for teaching “… utilizing an image sensor to identify ‘an unscheduled activity, wherein the user-related event data is associated with a user activity in at least one of the plurality of irrigation zones, and wherein the user-related event data comprises at least a portion of the image data captured by the at least one imaging sensor’”. Further, the Examiner notes that the newly cited prior art of U.S. Patent Publication No. 2017/0112079 A1 (Eyring) has been cited for teaching the newly presented limitations of “… the one or more sensors comprising at least one of a rainfall sensor, a humidity sensor, or a soil moisture sensor”; “receiving, by the processing device from at least one imaging sensor, image data associated with one or more of the plurality of irrigation zones”; and “identifying, by the processing device, user-related event data comprising an unscheduled activity, wherein the user-related event data is associated with a user activity in at least one of the plurality of irrigation zones, and wherein the user-related event data comprises at least a portion of the image data captured by the at least one imaging sensor”, as set forth below in the rejection of claim 1 under 35 U.S.C. 103.
With respect to the Applicant’s arguments,
Similar language is also included in independent claims 15 and 18. Thus, the combination of Cardinal, Bangalore, Kohler, Saxena, and Gungl does not teach or suggest all the features of the independent claims 1, 15, and 18, and corresponding dependent claims. Applicant respectfully requests the rejection of claims 1-9 and 15-20 under 35 U.S.C. §103 be withdrawn. Claims 10-11 stand rejected under 35 U.S.C. § 103(a) as allegedly being unpatentable over Cardinal in view of Bangalore, Kohler, Saxena and further in view of Bangerter.
Claims 10-11 depend on and include the features of claim 1. As discussed above, the combination of Cardinal, Bangalore, Kohler, and Saxena fails to teach or suggest all of the features of the claims. Bangerter is not relied upon for this purpose in the Office Action. Therefore, Applicant respectfully requests that the rejection of claims 10-11 under 35 U.S.C. § 103 be withdrawn.
Claim 13 stands rejected under 35 U.S.C. § 103(a) as allegedly being unpatentable over Cardinal in view of Bangalore, Kohler, Saxena, Redmond, and further in view of Fleming. Claim 13 depends on and includes the features of claim 1. As discussed above, the combination of Cardinal, Bangalore, Kohler, Saxena, and Redmond fails to teach or suggest all of the features of the claims. Fleming fails to cure these deficiencies. Therefore, Applicant respectfully submits that claim 13 is patentable over the cited references. Accordingly, Applicant requests that the rejection of claim 13 under 35 U.S.C. § 103 be withdrawn.
Claims 14 and 21 stand rejected under 35 U.S.C. § 103(a) as allegedly being unpatentable over Cardinal in view of Bangalore, Kohler, Saxena and further in view of Redmond. Claims 14 and 21 depend on and include the features of claim 1. As discussed above, the combination of Cardinal, Bangalore, Kohler, and Saxena fails to teach or suggest all of the features of the claims. Redmond fails to cure these deficiencies. Therefore, Applicant respectfully submits that claims 14 and 21 are patentable over the cited references.
The Examiner respectfully disagrees.
The Examiner refers to the responses above, pg. 2, paragraph 3 through pg. 8, paragraph 6 of this Office Action, wherein the arguments have been addressed.
Claim 5 is objected to and claims 1-11 and 13-21 stand rejected under 35 U.S.C. 103 as set forth below.
Claim Objections
Claim 5 is objected to because of the following informalities:
Claim 5 includes antecedent and redundancy issues in the limitation of “… the one or more sensors comprise at least one of a rainfall sensor, a humidity sensor, a soil moisture sensor, an imaging sensor, …” in lines 1-3. Claim 1 provides antecedent support for “a rainfall sensor”, “a humidity sensor”, “a soil moisture sensor”, and “an imaging sensor” in the limitations of “… the one or more sensors comprising at least one of a rainfall sensor, a humidity sensor, or a soil moisture sensor; …” in lines 11-12 and “… receiving, by the processing device from at least one imaging sensor …” in line 13.
Appropriate correction is required.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 5-9 and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication No. 2004/0181315 A1 (hereinafter Cardinal) in view of U.S. Patent Publication No. 2018/0039243 A1 (hereinafter Bangalore), and further in view of U.S. Patent Publication No. 2017/0112079 A1 (hereinafter Eyring) and U.S. Patent Publication No. 2016/0248847 A1 (hereinafter Saxena).
As per claim 1, Cardinal substantially teaches the applicant’s claimed invention. Cardinal teaches the limitations of a method comprising:
identifying, by a processing device (pgs. 1-2, par. [0018], [0019], and [0021], pg. 4, par. [0038], and Fig. 1, element 102 of Fig. 1, element 100; i.e. a central control system (CCS) of an automated landscape irrigation control system (ALICS) and [0019]: “ALICS 100 includes a central control system (CCS) 102 that receives, analyzes and stores data from a variety of sources, including landscape information 106 and environmental information 108.”), user input received via a user interface (pg. 2, par. [0021]; i.e. “ALICS 100 also includes a server 132 that provides management applications and user interfaces. In this example, server 132 is a web server that provides a web interface to the users. There are several ways to provide the web interface, including via a web browser such as Internet Explorer, via a client-side networking application or any other appropriate programs. … The user may access the websites provided by the web server from a computer, a wireless device such as a wireless personal digital assistant (PDA), or any other appropriate device with networking capabilities.”), the user input being associated with a plurality of sets of environmental data, wherein each set of environmental data of the plurality of sets of environmental data is associated with a corresponding irrigation zone of a plurality of irrigation zones (pgs. 1-2, par. [0018], [0019], [0021] and [0022] and pg. 3, par. [0032]; i.e. [0019]: “As used herein, landscape information refers to information about the landscape to be irrigated, including irrigation method, soil type, landscape slope, plant type, root depth, sprinkler precipitation rate, distribution uniformity, dripline diameter, emitter flow rate, number of emitters per plant, etc.” and [0032]: “… a user may configure the CCS with landscape information (300). The configuration is performed via a web interface.”);
generating, by the processing device based on the user input, irrigation schedules associated with the plurality of irrigation zones (pgs. 1-2, par. [0018], [0021], and [0024]; i.e. [0018]: “ The area controlled by the ALICS may be divided into zones based on microclimate, landscape type, or other factors affecting water requirement.” and [0024]: “Based on ET data as well as landscape configuration information 106, user input 104 and water agency input 110, CCS 102 calculates an irrigation schedule.”), the irrigation schedules comprising a corresponding time duration for irrigating each irrigation zone of the plurality of irrigation zones (pg. 2, par. [0024]; i.e. “The irrigation schedule is a schedule that controls the operations of the ICU and determines the amount of water used. In some embodiments, the irrigation schedule includes a set of valve commands that control stations 114-120 by turning valves at the stations on or off at predetermined times. Watering days, non-watering days, and/or hourly restrictions may also be included in the irrigation schedule. Other appropriate instructions for controlling the amount of water used for irrigation may also be included in the irrigation schedule.”);
receiving, by the processing device from one or more sensors (i.e. rain sensor and soil moisture sensors), sensor data associated with one or more of the plurality of irrigation zones (pg. 3, par. [0028] and [0032]; i.e. [0028]: “In some embodiments, one or more soil moisture sensors are provided with each ICU and are used to estimate ET and effective rainfall (i.e., the amount of rainfall absorbed into the plant root zone) by measuring changes in soil moisture at the service site and/or weather station location. The soil moisture readings from the sensors can be used to determine the appropriate length of time to suspend irrigation for local ICUs in the event of a rainfall. In some embodiments, the data is sent over a network to the ICU which, in turn, sends the data to the CCS periodically for use in calculating the ICU watering schedule.” and [0032]: “CCS is updated with environmental information such as weather station measurements and soil moisture sensor (302).”), the one or more sensors comprising a rainfall sensor or a soil moisture sensor (pg. 3, par. [0028]; i.e. “In some embodiments, a rain sensor is provided with each ICU and is used to suspend irrigation by measuring the amount of rainfall at a site and sending instructions to the CCS to suspend irrigation for the appropriate length of time in response to a rainfall event. In some embodiments, one or more soil moisture sensors are provided with each ICU and are used to estimate ET and effective rainfall (i.e., the amount of rainfall absorbed into the plant root zone) by measuring changes in soil moisture at the service site and/or weather station location.”);
updating, by the processing device, the irrigation schedules to generate updated irrigation schedules based on the sensor data and the environmental data, the updated irrigation schedules having one or more different time durations than the irrigation schedules (pg. 2, par. [0023], [0028] and [0032], pg. 4, par. [0038], pg. 5, par. [0056], and pg. 7, par. [0091] and [0092]; i.e. [0023]: “The inputs from various sources are transferred to CCS 102, which uses the inputs to perform functions such as adjusting an irrigation schedule …”; [0032]: “… a user may configure the CCS with landscape information (300). … This information is stored by the CCS and is used to determine the irrigation schedule for each station managed by the ICU. CCS is updated with environmental information such as weather station measurements and soil moisture sensor (302). Optionally, adjustments can be made by the user, ALICS, and/or water agency.”; [0056]: “After any adjustments are computed (406), the watering interval for each station is then derived (408).”; and [0092]: “If, however, there are changes in the parameters, a new irrigation schedule is then sent to the ICU (510). In some embodiments, additional information such as the date and time for the ICU to make another call to the CCS is also sent.”); and
causing, by the processing device based on the updated irrigation schedules, irrigation of one or more irrigation zones of the plurality of irrigation zones (pg. 7, par. [0092] and [0093]; i.e. [0092]: “If, however, there are changes in the parameters, a new irrigation schedule is then sent to the ICU (510). In some embodiments, additional information such as the date and time for the ICU to make another call to the CCS is also sent.” and [0093]: “When it is time to irrigate, the ICU executes the new sequence (512).”).
Not explicitly taught are an irrigation schedule associated with the plurality of irrigation zones, the irrigation schedule comprising a corresponding time duration for irrigating each irrigation zone of the plurality of irrigation zones;
receiving, by the processing device from at least one imaging sensor, image data associated with one or more of the plurality of irrigation zones;
identifying, by the processing device, user-related event data comprising an unscheduled activity, wherein the user-related event data is associated with a user activity in at least one of the plurality of irrigation zones, and wherein the user-related event data comprises at least a portion of the image data captured by the at least one imaging sensor;
updating the irrigation schedule to generate an updated irrigation schedule based on weighing, using one or more conflict resolution rules, the sensor data, the user-related event data, and the environmental data to determine a hierarchy among the sensor data, the user-related event data, and the environmental data, the updated irrigation schedule having one or more different time durations than the irrigation schedule.
However, Bangalore, in an analogous art of an irrigation control system (pg. 1, par. [0004]), teaches the missing limitations of an irrigation schedule associated with a plurality of irrigation zones, the irrigation schedule comprising a corresponding time duration for irrigating each irrigation zone of the plurality of irrigation zones (pgs. 4-5, par. [0047] and [0048]; i.e. [0047]: “… a generated irrigation schedule, which specifies a watering time and watering duration for each of the three zones of site 250. For purposes of this illustration, assume that valve 220A corresponds to Zone 1, irrigation valves 220B corresponds to Zone 2, and irrigation valve 220C corresponds to Zone 3. Host device 100 sends remote control signals to irrigation controller 213, thereby causing irrigation controller 213 to actuate valves 220 zone-by-zone according to the irrigation schedule.”); and
updating the irrigation schedule, the updated irrigation schedule having one or more different durations than the irrigation schedule (pg. 4, par. [0046] and [0047] and pg. 10, par. [0095]; i.e. [0046]: “… a generated irrigation schedule, which specifies a watering time and watering duration for each of the three zones of site 250. For purposes of this illustration, assume that valve 220A corresponds to Zone 1, irrigation valves 220B corresponds to Zone 2, and irrigation valve 220C corresponds to Zone 3. Host device 100 sends remote control signals to irrigation controller 213, thereby causing irrigation controller 213 to actuate valves 220 zone-by-zone according to the irrigation schedule.” and [0095]: “This may also allow for the device to continue updating irrigation schedules based on measured weather conditions, even without an Internet connection to a central server.”) for the purpose of regulating water to individual zones (pg. 4, par. [0046]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Cardinal to include the limitations of an irrigation schedule associated with a plurality of irrigation zones, the irrigation schedule comprising a corresponding time duration for irrigating each irrigation zone of the plurality of irrigation zones; and updating the irrigation schedule, the updated irrigation schedule having one or more different durations than the irrigation schedule, to advantageously prevent water waste through use of an efficient irrigation schedule appropriate for a particular site (Bangalore: pg. 10, par. [0095]).
Cardinal in view of Bangalore does not expressly teach receiving, by the processing device from at least one imaging sensor, image data associated with one or more of the plurality of irrigation zones;
identifying, by the processing device, user-related event data comprising an unscheduled activity, wherein the user-related event data is associated with a user activity in at least one of the plurality of irrigation zones, and wherein the user-related event data comprises at least a portion of the image data captured by the at least one imaging sensor; and
updating the irrigation schedule to generate an updated irrigation schedule based on weighing, using one or more conflict resolution rules, the sensor data, the user-related event data, and the environmental data to determine a hierarchy among the sensor data, the user-related event data, and the environmental data
However, Eyring, in an analogous art of irrigation systems (pg. 5, par. [0049]), teaches the missing limitations of receiving, by a processing device (Fig. 1, elements 135 or 145; i.e. a remote computing device or a local computing device) from at least one imaging sensor (pg. 2, par. [0027], pg. 3, par. [0030], pg. 4, par. [0037] and [0038], pg. 5, par. [0047] and [0054], and Fig. 1, element 110; i.e. [0030]: “… local computing device 145 and remote computing device 135 may be general purpose computing entities such as a personal computing device, for example, a desktop computer, a laptop computer, a netbook, a tablet personal computer (PC), a control panel, an indicator panel, a multi-site dashboard, an iPod®, an iPad®, a smartphone, a mobile phone, a wearable electronic device (e.g., a smartwatch), a personal digital assistant (PDA), a cordless phone, a wireless local loop (WLL) station, a display device (e.g., TVs, computer monitors, etc.), a BLUETOOTH® device, a Wi-Fi device, a mobile station, a subscriber station, a mobile unit, a subscriber unit, a wireless unit, a remote unit, a mobile device, a wireless device, a wireless communications device, a remote device, an access terminal, a mobile terminal, a wireless terminal, a remote terminal, a handset, a user agent, a mobile client, a printer, a camera, and/or any other suitable device operable to send and receive signals, store and retrieve data, and/or execute modules.”; [0037]: “Data gathered by the one or more sensors 110 may be communicated to local computing device 145, which may be a stationary or mobile input/output smart home display.
… The local computing device 145 may process the data received from the one or more sensors 110 to obtain data related to weather such as humidity, temperature, wind speed; parameters associated with soil conditions such as the type of soil, the current moisture level, and/or the density of the soil; and may process data related to the presence of people, pets, or livestock at the irrigation location such as by obtaining data related to motion, sound, pressure, vibration, temperature changes, change in light, image-capturing, and/or video-capturing.”; [0038]: “In alternate embodiments, remote computing device 135 may process the data received from the one or more sensors 110, via network 120 and server 115, to obtain data related to weather such as humidity, temperature, wind speed; parameters associated with soil conditions such as the type of soil, the current moisture level, and/or the density of the soil; and may process data related to the presence of people, pets, or livestock at the irrigation location such as by obtaining data related to motion, sound, pressure, vibration, temperature changes, change in light, image-capturing, and/or video-capturing.”; [0047]: “… location data may be related to information about the soil and land at the location, weather information, or information related to activity at the location. Based on receiving location data, the irrigation module may analyze the data and make a determination as to when irrigation will commence and for how long. In some embodiments, the irrigation module may determine someone or something is located at and/or moving at the location, and may suspend commencing a pre-scheduled irrigation as a result.”; and [0054]: “… creating or adjusting an irrigation schedule based on weather and soil data, the irrigation schedule may be altered based on activity module 315 determining the presence of people and/or animals. 
In one embodiment, the activity module 315 may obtain information from sensors 110, where sensors 110 capture information related to the presence of people and/or animals. For example, sensors 110 may be one of a digital camera, a video camera, a motion sensor, a vibration sensor, and/or a pressure sensor. Activity module 315 may receive information from sensors 110 to determine activity in the yard”), image data associated with one or more of a plurality of irrigation zones (pg. 2, par. [0027]; i.e. “The sensors 110 may be stationary and/or mobile, and may be located in a plurality of locations both inside and outside of the home. A sensor 110 may include and/or be one or more sensors that sense: proximity, motion, temperatures, humidity, moisture, weather, slope/incline, soil type, water type, soil density, sound level, time, amount of light, pressure, geo-location data of a user and/or a device, distance, biometrics, weight, speed, height, size, preferences, system performance, and/or other inputs that relate to a security and/or an automation system.”);
identifying, by the processing device (Fig. 1, elements 135 or 145; i.e. the remote computing device or the local computing device), user-related event data comprising an unscheduled activity (pgs. 5-6, par. [0045], [0054], and [0055]; i.e. [0045]: “The receiver module 210 may be configured to receive current weather data, weather forecast data, soil-related data as described previously, and data related to the current and future location of people and/or pets in an irrigated area. Information may be passed on to the irrigation module 215, as well as to other components of the control panel 205.”; [0054]: “In one embodiment, the activity module 315 may obtain information from sensors 110, where sensors 110 capture information related to the presence of people and/or animals. For example, sensors 110 may be one of a digital camera, a video camera, a motion sensor, a vibration sensor, and/or a pressure sensor. Activity module 315 may receive information from sensors 110 to determine activity in the yard.”; and [0055]: “… a user may program the irrigation system to water the lawn every Friday at 8:00 p.m., however one Friday the user is throwing a party in the yard. Before the watering is to being, the activity module 315 may determine (by way of receiving sensor information) that people are present in the yard, and thus should put a hold on the watering. 
Determining the presence of people may be by way of detecting movement by camera and/or motion sensor, or by determining pressure placed on the soil, or by other means.”; Examiner’s Note: Eyring’s identification of a person(s)’s presence at a location using a camera constitutes an unscheduled event, since the system is unaware of the person(s)’s presence until detection by the camera at the time of a scheduled irrigation event, wherein irrigation is rescheduled based on the person(s)’s presence.), wherein the user-related event data is associated with a user activity in at least one of the plurality of irrigation zones (pgs. 5-6, par. [0054] and [0055]; i.e. [0054]: “In one embodiment, the activity module 315 may obtain information from sensors 110, where sensors 110 capture information related to the presence of people and/or animals. For example, sensors 110 may be one of a digital camera, a video camera, a motion sensor, a vibration sensor, and/or a pressure sensor. Activity module 315 may receive information from sensors 110 to determine activity in the yard.”; and [0055]: “… a user may program the irrigation system to water the lawn every Friday at 8:00 p.m., however one Friday the user is throwing a party in the yard. Before the watering is to being, the activity module 315 may determine (by way of receiving sensor information) that people are present in the yard, and thus should put a hold on the watering. Determining the presence of people may be by way of detecting movement by camera and/or motion sensor, or by determining pressure placed on the soil, or by other means.”), and wherein the user-related event data comprises at least a portion of the image data captured by the at least one imaging sensor (pgs. 5-6, par. [0054] and [0055]; i.e. [0054]: “In one embodiment, the activity module 315 may obtain information from sensors 110, where sensors 110 capture information related to the presence of people and/or animals. 
For example, sensors 110 may be one of a digital camera, a video camera, a motion sensor, a vibration sensor, and/or a pressure sensor. Activity module 315 may receive information from sensors 110 to determine activity in the yard.” and [0055]: “… a user may program the irrigation system to water the lawn every Friday at 8:00 p.m., however one Friday the user is throwing a party in the yard. Before the watering is to being, the activity module 315 may determine (by way of receiving sensor information) that people are present in the yard, and thus should put a hold on the watering. Determining the presence of people may be by way of detecting movement by camera and/or motion sensor, or by determining pressure placed on the soil, or by other means.”); and
updating an irrigation schedule to generate an updated irrigation schedule based on the user-related event data (pgs. 5-6, par. [0054] and [0055]; i.e. [0054]: “… creating or adjusting an irrigation schedule based on weather and soil data, the irrigation schedule may be altered based on activity module 315 determining the presence of people and/or animals. In one embodiment, the activity module 315 may obtain information from sensors 110, where sensors 110 capture information related to the presence of people and/or animals. For example, sensors 110 may be one of a digital camera, a video camera, a motion sensor, a vibration sensor, and/or a pressure sensor. Activity module 315 may receive information from sensors 110 to determine activity in the yard” and [0055]: “… a user may program the irrigation system to water the lawn every Friday at 8:00 p.m., however one Friday the user is throwing a party in the yard. Before the watering is to being, the activity module 315 may determine (by way of receiving sensor information) that people are present in the yard, and thus should put a hold on the watering. Determining the presence of people may be by way of detecting movement by camera and/or motion sensor, or by determining pressure placed on the soil, or by other means.”) for the purpose of altering an irrigation schedule based on activity at a location (pgs. 5-6, par. [0054] and [0055]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Cardinal in view of Bangalore to include the addition of the limitations of receiving, by a processing device from at least one imaging sensor, image data associated with one or more of a plurality of irrigation zones; identifying, by the processing device, user-related event data comprising an unscheduled activity, wherein the user-related event data is associated with a user activity in at least one of the plurality of irrigation zones, and wherein the user-related event data comprises at least a portion of the image data captured by the at least one imaging sensor; and updating an irrigation schedule to generate an updated irrigation schedule based on the user-related event data to advantageously account for a set of location-specific data to change an irrigation schedule to more efficiently operate an irrigation system and provide water and energy saving irrigation (Eyring: pg. 1, par. [0003]).
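The schedule-suspension behavior attributed to Eyring above (par. [0047] and [0055]) can be summarized in a brief sketch. The sketch is purely illustrative and is not drawn from any cited reference; the function name and parameters are hypothetical.

```python
# Illustrative sketch only; not from Eyring or any cited reference.
# Models the hold described in Eyring [0047] and [0055]: a pre-scheduled
# irrigation event is suspended when sensors (camera, motion, vibration,
# or pressure) detect people or animals in the irrigation zone.

def should_run_irrigation(scheduled_now: bool, activity_detected: bool) -> bool:
    """Return True only when irrigation is scheduled and no activity
    is detected in the zone (hypothetical helper)."""
    if not scheduled_now:
        return False
    if activity_detected:
        # Put a hold on watering while the yard is occupied.
        return False
    return True
```

Under this sketch, the Friday 8:00 p.m. watering in Eyring's party example corresponds to `should_run_irrigation(True, True)`, which suspends the event.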
Cardinal in view of Bangalore in further view of Eyring does not expressly teach updating the irrigation schedule to generate an updated irrigation schedule based on weighing, using one or more conflict resolution rules, the sensor data, the user-related event data, and the environmental data to determine a hierarchy among the sensor data, the user-related event data, and the environmental data.
However Saxena, in an analogous art of controlling networked devices (pg. 2, par. [0014]; i.e. “… a system for automatically controlling network devices. Devices 102 includes one or more network connected devices including sensor devices and controllable devices (e.g., IoT devices). Examples of one or more controllable devices included in devices 102 include … an irrigation system, and any other device able to be connected to a computer network.”), teaches the missing limitation of control, based on weighing, using one or more conflict resolution rules, data to determine a hierarchy among the data (pgs. 6-7, par. [0048], [0053], and [0054]; i.e. [0048]: “… the identified states are correlated with corresponding controllable device states by analyzing a stored history of identified states and controllable device states to identify controllable device states that correspond with identified states. For example, corresponding pairings between an identified state (e.g., state vector) and a corresponding controllable state (e.g., status, configurations, functional states, parameters, and/or any other data) of a controllable device are determined. In some embodiments, correlating the identified states with corresponding controllable device states includes performing machine learning to discover correlations. For example, statistical and/or deep learning techniques are utilized to discover temporal correlations between identified states and controllable device states. In some embodiments, the identified states include state vectors that may include one or more of the following: a time value, a weather forecast, a date value, and other data associated with time and/or environment conditions.”; [0053]: “At 504, it is determined whether the automation rule conflicts with another automation rule to be active at the same time. 
For example, two or more automation rules are be activated because the identified state triggers these automation rules, but the rules specify conflicting controllable device states that cannot be all implemented at the same time (e.g., one rule specifies an “on” state while another rule specifies an “off” state). In some embodiments, the conflict is resolved. In some embodiments, the each automation rule is associated with a priority value that specifies a priority ordering in the event of a conflict.”; and [0054]: “In some embodiments, for a group of conflicting automation rules, the automation rule with the highest priority value is activated.”) for the purpose of modifying a state(s) of one or more controllable devices to the state(s) specified by an activated automation rule (pg. 7, par. [0054]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Cardinal in view of Bangalore in further view of Eyring to include the addition of the limitation of control, based on weighing, using one or more conflict resolution rules, data to determine a hierarchy among the data to advantageously improve device efficiency and functionality via autonomous and automatic control (Saxena: pg. 2, par. [0002]).
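Saxena's priority-based conflict resolution among automation rules (par. [0053] and [0054]) can be sketched as follows. The sketch is illustrative only; the class, field, and function names are hypothetical and do not appear in Saxena.

```python
# Illustrative sketch only; names are hypothetical, not from Saxena.
# Per Saxena [0053]-[0054]: when triggered automation rules specify
# conflicting controllable device states, the rule with the highest
# priority value is activated.

from dataclasses import dataclass

@dataclass
class AutomationRule:
    name: str
    priority: int       # higher value wins in the event of a conflict
    device_state: str   # e.g. "on" or "off"

def resolve_conflict(triggered_rules):
    """Among rules triggered by the same identified state but specifying
    conflicting device states, activate the highest-priority rule."""
    return max(triggered_rules, key=lambda r: r.priority)

# Example: a rain-hold rule ("off") conflicts with the weekly schedule ("on");
# the higher-priority rain-hold rule determines the device state.
rules = [
    AutomationRule("weekly-schedule", priority=1, device_state="on"),
    AutomationRule("rain-hold", priority=2, device_state="off"),
]
winner = resolve_conflict(rules)
```

This mirrors the hierarchy determination recited in the claim: the priority values impose an ordering on otherwise conflicting data sources, and the highest-ranked source controls the resulting device state.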
As per claim 2, Cardinal teaches a receiver of an irrigation control system comprises the processing device (pg. 2, par. [0023], pg. 3, par. [0032], pg. 7, par. [0091] and Fig. 1, element 102; i.e. the central control system (CCS) and [0023]: “The inputs from various sources are transferred to CCS 102, which uses the inputs to perform functions such as adjusting an irrigation schedule, changing the configuration of irrigation control unit (ICU) 112, providing user feedback and alerts, etc.” and [0032]: “… a user may configure the CCS with landscape information (300). The configuration is performed via a web interface. This information is stored by the CCS and is used to determine the irrigation schedule for each station managed by the ICU.”).
As per claim 3, Cardinal teaches the identifying of the user input comprises receiving, by the processing device, the user input from a user device via a wireless network (pg. 2, par. [0019] and [0021]; i.e. [0019]: “In this example, the information is sent from its source to CCS 102 via a network 130. Throughout this specification, a network is referred to as a medium over which information is sent, including telephone networks, wireless networks, proprietary networks, the Internet, intranets, local area networks, wide area networks, or combinations thereof.” and [0021]: “Web server 132 communicates with users via network 130, and exchanges data with CCS 102 via the same network or a separate network. The user may access the websites provided by the web server from a computer, a wireless device such as a wireless personal digital assistant (PDA), or any other appropriate device with networking capabilities.”).
As per claim 5, Cardinal teaches the one or more sensors comprise at least one of a rainfall sensor or a soil moisture sensor (pg. 3, par. [0028] and [0032] and pg. 5, par. [0069]; i.e. [0028]: “In some embodiments, a rain sensor is provided with each ICU and is used to suspend irrigation by measuring the amount of rainfall at a site and sending instructions to the CCS to suspend irrigation for the appropriate length of time in response to a rainfall event. In some embodiments, one or more soil moisture sensors are provided with each ICU and are used to estimate ET and effective rainfall (i.e., the amount of rainfall absorbed into the plant root zone) by measuring changes in soil moisture at the service site and/or weather station location.” and [0032]: “CCS is updated with environmental information such as weather station measurements and soil moisture sensor (302). Optionally, adjustments can be made by the user, ALICS, and/or water agency.”).
Cardinal does not expressly teach the sensor data received from the one or more sensors further comprises at least one of existing sensor data or anticipated sensor data.
Cardinal in view of Bangalore does not expressly teach the sensor data received from the one or more sensors further comprises at least one of existing sensor data or anticipated sensor data.
However Eyring, in an analogous art of irrigation systems (pg. 5, par. [0049]), teaches the missing limitations of sensor data received from one or more sensors (pg. 5, par. [0054] and Fig. 1, element 110; i.e. the digital camera or the video camera and [0054]: “In one embodiment, the activity module 315 may obtain information from sensors 110, where sensors 110 capture information related to the presence of people and/or animals. For example, sensors 110 may be one of a digital camera, a video camera, a motion sensor, a vibration sensor, and/or a pressure sensor. Activity module 315 may receive information from sensors 110 to determine activity in the yard.”) further comprises anticipated sensor data (pg. 1, par. [0027] and pg. 5, par. [0045] and [0054]; i.e. [0027]: “A sensor 110 may include and/or be one or more sensors that sense: proximity, motion, temperatures, humidity, moisture, weather, slope/incline, soil type, water type, soil density, sound level, time, amount of light, pressure, geo-location data of a user and/or a device, distance, biometrics, weight, speed, height, size, preferences, system performance, and/or other inputs that relate to a security and/or an automation system.”; [0045]: “The receiver module 210 may be configured to receive current weather data, weather forecast data, soil-related data as described previously, and data related to the current and future location of people and/or pets in an irrigated area.”; and [0054]: “In one embodiment, the activity module 315 may obtain information from sensors 110, where sensors 110 capture information related to the presence of people and/or animals. For example, sensors 110 may be one of a digital camera, a video camera, a motion sensor, a vibration sensor, and/or a pressure sensor. Activity module 315 may receive information from sensors 110 to determine activity in the yard.”) for the purpose of altering an irrigation schedule based on activity at a location (pgs. 5-6, par. [0054] and [0055]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Cardinal in view of Bangalore to include the addition of the limitation of sensor data received from one or more sensors further comprises anticipated sensor data to advantageously account for a set of location-specific data to change an irrigation schedule to more efficiently operate an irrigation system and provide water and energy saving irrigation (Eyring: pg. 1, par. [0003]).
As per claim 6, Cardinal teaches wherein the causing of the irrigation of the one or more irrigation zones compris