Prosecution Insights
Last updated: April 19, 2026
Application No. 18/021,241

REAL-TIME CONTROL OF MECHANICAL VALVES USING COMPUTER VISION AND MACHINE LEARNING

Status: Final Rejection (§103)
Filed: Feb 14, 2023
Examiner: TRAN, VI N
Art Unit: 2117
Tech Center: 2100 — Computer Architecture & Software
Assignee: King Fahd University of Petroleum and Minerals
OA Round: 2 (Final)

Grant Probability: 46% (Moderate)
OA Rounds: 3-4
To Grant: 4y 1m
With Interview: 83%

Examiner Intelligence

Career Allow Rate: 46% (46 granted / 99 resolved; -8.5% vs TC avg)
Interview Lift: +36.3% (allowance rate among resolved cases with an interview vs. without; a strong lift)
Avg Prosecution: 4y 1m (typical timeline)
Currently Pending: 39
Total Applications: 138 (career history, across all art units)
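The headline figures above are simple ratios; as a minimal sketch of how they relate (function names are ours, and the with/without-interview cohort counts are not given above, so the lift is shown only as a definition):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate: granted applications / resolved applications."""
    return granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Interview lift in percentage points: allowance rate among resolved
    cases with an examiner interview minus the rate among those without."""
    return (rate_with - rate_without) * 100

# Figures shown above: 46 granted out of 99 resolved cases.
print(f"career allow rate: {allow_rate(46, 99):.1%}")  # -> career allow rate: 46.5%
```

Note that the lift compares the interviewed cohort against the non-interviewed cohort, not against the overall 46% rate, which is why the "with interview" figure of 83% is not simply 46% plus 36.3 points.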

Statute-Specific Performance

§101: 15.5% (-24.5% vs TC avg)
§103: 53.8% (+13.8% vs TC avg)
§102: 13.3% (-26.7% vs TC avg)
§112: 11.2% (-28.8% vs TC avg)
Tech Center averages are estimates. Based on career data from 99 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This Office Action is issued in response to the amendment filed 09/17/2025. Applicant's arguments have been carefully and fully considered, but they are not persuasive. Therefore, this action has been made FINAL.

Claim Status

Claims 1-4 and 15 have been amended. Claims 1-20 remain pending and are ready for examination.

Response to Arguments

Applicant's arguments have been fully considered but are not persuasive. With respect to applicant's argument on page 9 of the remarks:

“Xia discloses a system that (1) uses a camera to identify a user and retrieve its preferences, and (2) controls a fluid dispenser based on gesture motions collected by a sensor system. The system in Xia does not use an AI algorithm for either (1) or (2). In other words, the system in Xia relies on the sensor system 25 for identifying a gesture motion of the user, not on the camera 17. This is different from the independent Claim 1, which recites that the AI algorithm is configured to "extract a user action or gesture from the visual data," and to "generate a command based on the extracted user action or gesture." Note that no sensor is recited by the claims for identifying a gesture motion of a user.”

The Examiner respectfully disagrees. The Examiner respectfully reminds applicant that the rejections are based on the broadest reasonable interpretation of the claim limitations. Specification [0031] discloses “In this way, the user initially makes a first gesture or action to the camera to login into his/her profile... For example, the first gesture may be the user's face”.
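The dispute above turns on a pipeline in which the AI algorithm alone extracts a gesture from camera data and turns it into a valve command, with no separate gesture sensor. As a way to visualize that claim language, a minimal sketch; all names are hypothetical and the model is a stand-in for any trained gesture classifier:

```python
from dataclasses import dataclass
from typing import Callable, Optional, Sequence

@dataclass
class ValveCommand:
    action: str            # e.g. "open" or "close"
    temperature_c: float   # setpoint carried along with the command

# Stand-in for a trained model (e.g. a gesture CNN): any callable that maps
# a sequence of camera frames to a gesture label fits this slot.
GestureModel = Callable[[Sequence[bytes]], str]

# Hypothetical mapping from a recognized gesture to a command, per the
# claim's "generate a command based on the extracted user action or gesture".
GESTURE_TO_COMMAND = {
    "palm_open": ValveCommand("open", 38.0),
    "fist": ValveCommand("close", 0.0),
}

def control_valve(frames: Sequence[bytes], model: GestureModel) -> Optional[ValveCommand]:
    """Extract a gesture from the visual data alone, then generate the command."""
    gesture = model(frames)                 # extract a user action or gesture
    return GESTURE_TO_COMMAND.get(gesture)  # command based on the extracted gesture
```

The point of contrast with Xia is that `control_valve` consumes only camera frames; Xia's sensor system 25 would appear here as a second, non-camera input.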
Based on applicant's description, the Examiner interpreted ‘generate a user identifier’ as reading on “extract a user action or gesture,” and ‘responsive to user data to generate a control signal’ as reading on “generate a command based on the extracted user action or gesture.” Moreover, the limitation does not limit the number of sensors used to detect the user's action or gesture; the method can therefore use both camera and sensor data to generate a control signal.

With respect to applicant's argument on page 9 of the remarks:

“To cure the deficiencies of Xia, the Office proposes to add an AI algorithm from Al Jazaery. However, the AI algorithm from Al Jazaery is configured to produce user gestures from captured images. If the system in Xia is modified to not use the sensor system for determining the user gestures, as taught by Al Jazaery, then the system in Xia needs to be modified such that feature (2) is removed, i.e., its principle of operation is modified to rely on an AI algorithm instead of the sensor system 25 for obtaining the user gestures”

The Examiner respectfully disagrees. The problem, as understood from applicant's description, is the use of artificial intelligence for the analysis. Xia discloses processing captured images of the user's face to generate a user identifier, retrieving user data indicative of one or more user preferences, and then generating a control signal responsive to the user data, but lacks analyzing the images/videos using artificial intelligence. Al Jazaery [0030] discloses using a machine learning model to analyze images/video from cameras. It would have been obvious to one of ordinary skill in the art to try the machine learning model of Al Jazaery in the method of Xia to improve the usage efficiency and user experience when interacting with home appliances.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-6 and 14-16 are rejected under 35 U.S.C. 103 as being unpatentable over Xia (US20170254055A1, hereinafter Xia) in view of Al Jazaery et al. (US20200110928A1, hereinafter Al Jazaery).

Regarding Claim 1, Xia teaches a smart-valve system for controlling a fluid (see [0105]; Xia: “Referring to FIGS. 1 to 3, there is illustrated a fluid dispensing system 1.”), the smart-valve system comprising: a hybrid valve configured to control a flow of the fluid (see [0105]; Xia: “System 1 is particularly adapted to function as a kitchen or bathroom water dispenser to dispense water as a primary fluid and optionally other fluids on demand.”), wherein the hybrid valve includes electronics for controlling the flow of the fluid (see [0112]; Xia: “The actuators are responsive to the control signal and the local input signals to dispense water from fluid dispenser 3 with predefined characteristics… The predefined characteristics include but are not limited to one or more of a predefined temperature, a predefined flow rate, a predefined dispensing time, a predefined distribution profile and/or a predefined fluid quality.”), and the hybrid valve also includes a manual handle for controlling the flow of the fluid (see [0115]; Xia: “As illustrated in the inset of FIG. 1, display 31 and handle 9 are able to be manually or automatically pivotally swivelled with respect to base 5. In a manual mode of operation, this motion dispenses water at a predetermined temperature and flow rate in a similar manner to that of known mixer taps.”); a controller connected to the electronics and configured to control the electronics to close or open the hybrid valve (see [0112]; Xia: “The stored predefined gestures are associated with respective functional controls for selectively controlling a plurality of actuators within an actuator system 29.”); a camera oriented to capture visual data about a user (see [0120]; Xia: “Camera 17 captures an image of the user's face and the user is identified.”); and …configured to receive the visual data from the camera, extract a user action or gesture from the visual data (see [0109]; Xia: “Camera 17 is configured to identify a user of system 1 through facial recognition, iris recognition or other visual biometric identification.
Camera 17 is connected to a processor 19, which processes captured images of the user's face and generates a user identifier.”), generate a command based on the extracted user action or gesture (see [0109]; Xia: “A network communication device 21 is responsive to the user identifier to access a remote server 23 to retrieve user data indicative of one or more user preferences. Processor 19 is responsive to the user data to generate a control signal.”), and send the command to the hybrid valve to control the flow of the fluid (see [0120]; Xia: “The gesture is detected and, if matched with a predefined gesture, a corresponding local input signal and the control signal control the actuators to dispense water at a predefined temperature and flow rate.”).

However, Xia does not explicitly teach: an artificial intelligence, AI, algorithm…

Al Jazaery from the same or similar field of endeavor teaches: an artificial intelligence, AI, algorithm… (see [0050]; Al Jazaery: “the appliance control unit 107 further includes an image processing unit 115 which includes one or more machine learning models for analyzing the sequence of images (e.g., consecutive image frames of a video) from the one or more cameras 102, and provide motion gestures deduced from the image analysis performed on the images.” See [0026]: “As disclosed herein, using the camera makes it possible to control appliances with not only hands but also body language. It also makes it possible to control appliances, with not only hands, but also facial and head movement and expressions.” See [0006]: “This is also helpful in situations where the appliance is sensitive to disturbances cause by contact (e.g., a smart fish tank for sensitive or dangerous pets), and a user can control the appliance (e.g., setting internal environment, and release food or water to the pet, etc.) without direct contact with the appliance. This is also helpful in situations where the user does not want to touch the appliance's control panel because the user's hands are contaminated (e.g., the user's hands are web), and the user can control the appliance using motion gestures.”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of Xia to include Al Jazaery's feature of an artificial intelligence, AI, algorithm. Doing so would improve the usage efficiency and user experience when interacting with home appliances. (Al Jazaery, [0002])

Regarding Claim 2, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above. Al Jazaery further teaches wherein the user action or gesture is a hand orientation (see [0067]; Al Jazaery: “the motion gesture is a gesture based on movement of the user's body or certain parts of the body (e.g., user's hand, arm, head, etc.) independent of a contact surface (e.g., touchpad, touch-screen, mouse, joystick, etc.), a touch-less gesture, sign language gestures, etc.). In some embodiments, the first request was triggered by the user at the first home appliance by an input (e.g., a voice input or a specially designated “wake-up” gesture (e.g., wave hand from left to right and back to the left) or sound (e.g., three claps)).” See [0044]: “The one or more processing modules 114 utilize the data and models 116 to monitor presence of users and motion gestures performed by the users to determine a suitable control command and a suitable target appliance for the control command.”) or a facial expression (see [0026]; Al Jazaery: “It also makes it possible to control appliances, with not only hands, but also facial and head movement and expressions. This allows people who cannot move their hands to control appliances with motion gestures provided by head movement or face movement (e.g., smiling or make an angry face)”). The same motivation to combine Xia and Al Jazaery as set forth for Claim 1 applies equally to Claim 2.

Regarding Claim 4, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above. Xia further teaches wherein the controller hosts ... and also a database of profiles of users (see [0111]; Xia: “The predefined gestures are stored in a database 27, which is in communication with processor 19.”), and …is configured to match a facial expression of the user, captured with the camera, to a corresponding profile, and control the hybrid valve based on information stored in the profile (see [0120]; Xia: “Camera 17 captures an image of the user's face and the user is identified…The gesture is detected and, if matched with a predefined gesture, a corresponding local input signal and the control signal control the actuators to dispense water at a predefined temperature and flow rate.”).

However, Xia does not explicitly teach: the AI algorithm…

Al Jazaery from the same or similar field of endeavor teaches: the AI algorithm… (see [0050]; Al Jazaery: “the appliance control unit 107 further includes an image processing unit 115 which includes one or more machine learning models for analyzing the sequence of images (e.g., consecutive image frames of a video) from the one or more cameras 102, and provide motion gestures deduced from the image analysis performed on the images.” See [0026]: “As disclosed herein, using the camera makes it possible to control appliances with not only hands but also body language. It also makes it possible to control appliances, with not only hands, but also facial and head movement and expressions.” See [0006]: “This is also helpful in situations where the appliance is sensitive to disturbances cause by contact (e.g., a smart fish tank for sensitive or dangerous pets), and a user can control the appliance (e.g., setting internal environment, and release food or water to the pet, etc.) without direct contact with the appliance. This is also helpful in situations where the user does not want to touch the appliance's control panel because the user's hands are contaminated (e.g., the user's hands are web), and the user can control the appliance using motion gestures.”) The same motivation to combine Xia and Al Jazaery as set forth for Claim 1 applies equally to Claim 4.

Regarding Claim 5, the combination of Xia and Al Jazaery teaches all the limitations of claim 4 above. Xia further teaches wherein the information is related to at least one of a temperature of the fluid, a pressure of the fluid, a time to turn on the hybrid valve, a direction of the fluid, and a time to turn off the hybrid valve (see [0120]; Xia: “The temperature and flow rate are defined by the user preferences and the particular gesture performed by the user.” See [0112]: “The predefined characteristics include but are not limited to one or more of a predefined temperature, a predefined flow rate, a predefined dispensing time, a predefined distribution profile and/or a predefined fluid quality.”).

Regarding Claim 6, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above. Xia further teaches further comprising: a temperature sensor located next to the user (see [0139]; Xia: “display 33 or screen 45 is adapted to display the estimated temperature of the fluid being dispensed.”) and configured to supply a measured temperature… to adjust a temperature of a dispensed fluid.
(see [0127]; Xia: “Water will automatically be dispensed from outlet 49 at a temperature and flow determined by the user preferences or other default settings for a predetermined time or until the object is removed from the detection field.”)

However, Xia does not explicitly teach: …supply a measured temperature to the AI algorithm…

Al Jazaery from the same or similar field of endeavor teaches: …supply a measured temperature to the AI algorithm… (see [0042]; Al Jazaery: “a respective appliance of the one or more appliances 124 further includes sensors, which senses environment information of the respective appliance… In some embodiments, the sensors also provide information on the indoor environment, such as temperature, time of day, lighting, noise level, activity level of the room.” See [0091]: “appliance control unit 107, which controls the appliance 124, including but not limited to …image processing unit 115”. See [0050]: “the appliance control unit 107 further includes an image processing unit 115 which includes one or more machine learning models for analyzing the sequence of images (e.g., consecutive image frames of a video) from the one or more cameras 102, and provide motion gestures deduced from the image analysis performed on the images.”) The same motivation to combine Xia and Al Jazaery as set forth for Claim 1 applies equally to Claim 6.

Regarding Claim 14, the limitations of this claim are taught by the combination of Xia and Al Jazaery as discussed in connection with claim 1.

Regarding Claim 16, the combination of Xia and Al Jazaery teaches all the limitations of claim 14 above. Xia further teaches wherein the parameter is a temperature, a flow rate, a flow shape, or an on or off state of the hybrid valve (see [0120]; Xia: “The gesture is detected and, if matched with a predefined gesture, a corresponding local input signal and the control signal control the actuators to dispense water at a predefined temperature and flow rate. The temperature and flow rate are defined by the user preferences and the particular gesture performed by the user.” See [0174]: “At the second level 248, sensor 242 issues a control signal to a corresponding actuator to shut off the fluid flow from dispenser 3.”).

Claims 3 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery, further in view of Chew (US20140009378A1, hereinafter Chew).

Regarding Claim 3, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above. Al Jazaery further teaches …selecting a specific control action of the hybrid valve (see [0157]; Al Jazaery: “Control signal 187 is passed to a series of actuatable valves 188 and one or more fluid modifier devices 189. The valves 188 selectively direct the water through predetermined ones of the fluid modifier devices 189 based on the control signal.”), and …controlling the hybrid valve (see [0171]; Al Jazaery: “Actuator system 29 includes an actuator (not shown) to control the opening and closing of valve 228, which is disposed within an upper section of drainage conduit 204.”). The same motivation to combine Xia and Al Jazaery as set forth for Claim 1 applies equally to Claim 3.

However, the combination does not explicitly teach: includes a first gesture or action for logging into a corresponding profile, a second gesture or action for selecting a specific control action…, and a third gesture or action for controlling...

Chew from the same or similar field of endeavor teaches a first gesture or action for logging into a corresponding profile (see [0009]; Chew: “An embodiment includes a system, such as an entertainment system, recognizing a first image of a first user via a camera, selecting a corresponding first profile for the first user.”), a second gesture or action for selecting a specific control action… (see [0009]; Chew: “the profile may include various gesture signatures—indicators that help distinguish the first user's gestures from one another (e.g., a “thumbs up” sign from a “halt” open faced palm related sign) and possibly from gestures of another user. After loading the user profile the system may interpret the first user forming his fist with his thumb projecting upwards as acceptance of a condition/question presented to the user visually on a GUI.” See [0016]: “The initial prompt may include a question, statement, instruction and the like. For example, the prompt may include “Make a gesture indicating you accept or agree.” The initial prompt may be communicated to User1 orally (e.g., produced from a television or monitor speaker) or visually (e.g., displayed on a monitor). The initial image data may be an image of User1 captured by a camera coupled to the system. The initial gesture may be a “thumbs up” gesture that includes User1's right hand, clinched in a fist, with the thumb pointing up.”), and a third gesture or action for controlling... (see [0016]; Chew: “Thus, at a later time a software API for television viewing may prompt User1 “Do you want to record this program?” to which User1 can simply flash a “thumbs up” sign to indicate the user does in fact want the program recorded.”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of Xia and Al Jazaery to include Chew's features of a first gesture or action for logging into a corresponding profile, a second gesture or action for selecting a specific control action, and a third gesture or action for controlling. Doing so would account for such differences in gestures and thereby reduce the complexity of using such systems for various segments of society. (Chew, [0002])

Regarding Claim 15, the limitations of this claim are taught by the combination of Xia, Al Jazaery, and Chew as discussed in connection with claim 3.

Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery, further in view of Joo (KR101268311B1, hereinafter Joo; machine translation attached).

Regarding Claim 7, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above; however, it does not explicitly teach wherein the hybrid valve is an industrial valve in a plant, remotely located from the user so that the user does not have physical access to the hybrid valve.

Joo from the same or similar field of endeavor teaches wherein the hybrid valve is an industrial valve in a plant, remotely located from the user so that the user does not have physical access to the hybrid valve (see page 2, paragraph 4; Joo: “In general, in a special environment such as a power plant, a chemical complex, etc., a valve is frequently installed at a location remote from an inaccessible position or an operation position.
Thus, in order to operate a valve mounted at a remote location that cannot be reached by the operator to operate the handle of a valve located at a remote location using a chain or using a P.O.V (Power Operated Valve).”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of the combination of Xia and Al Jazaery to include Joo's feature of an industrial valve in a plant, remotely located from the user so that the user does not have physical access to the hybrid valve. Doing so would enable remote control of a stable valve actuator at low cost. (Joo, page 2, paragraph 8)

Claims 8 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery, further in view of Rexach et al. (US20190090056A1, hereinafter Rexach).

Regarding Claim 8, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above; however, it does not explicitly teach: further comprising: additional hybrid valves controlled by the same controller, wherein the AI algorithm is configured to individually control each hybrid valve to adjust a temperature of a dispensed fluid.

Rexach from the same or similar field of endeavor teaches further comprising: additional hybrid valves controlled by the same controller (see [0077]; Rexach: “FIG. 3 illustrates a communication network for the example sets of appliances of FIG. 1 and/or FIG. 2. The communication network may include a server 13, a network device 14, and a communication bus or local network 15. The communication bus or local network 15 may be connected to one or more of any combination of …the programmable (automated) shower 2, …the sink faucet (automated sink) 8”), wherein the AI algorithm is configured to individually control each hybrid valve to adjust a temperature of a dispensed fluid (see [0096]; Rexach: “The order of user preference may be set by a user input or a default order. The order of user preference may specify individual appliances. For example, the order of preference may specify the intelligent mirror, intelligent toilet, intelligent shower, light guides or other devices.” See [0392]: “The artificial intelligence device 800 may include a communication interface 806 that is configured to receive sensor data from an external appliance. The external appliance may be any of the appliances (e.g., intelligent bathroom device, intelligent kitchen devices, or host appliances) described herein.” See [0112]: “a temperature setting for the automated shower 2 may be indicative of user preferences and used to determine a temperature setting for the automated sink 8.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of Xia and Al Jazaery to include Rexach's features of additional hybrid valves controlled by the same controller, wherein the AI algorithm is configured to individually control each hybrid valve to adjust a temperature of a dispensed fluid. Doing so would improve the operation of the household appliances by increasing the data set from which personalized device functions can be selected and performed. (Rexach, [0057])

Regarding Claim 17, the combination of Xia and Al Jazaery teaches all the limitations of claim 14 above; however, it does not explicitly teach: further comprising: storing in the controller profiles associated with various users of the hybrid valve; and generating the command based on a given profile of the profiles, which is associated with a given user that is currently using the hybrid valve.

Rexach further teaches further comprising: storing in the controller profiles associated with various users of the hybrid valve (see [0312]; Rexach: “The external device 30, the network device 14, or server 13 may access a database according to the location where the sensor data was collected or other identifier to identify additional users or appliances in the geographic area where the sensor data was collected.”); and generating the command based on a given profile of the profiles, which is associated with a given user that is currently using the hybrid valve (see [0141]; Rexach: “the control system 301 (e.g., through processor 300) generates settings for controlling generating a command for a device function for a second appliance based on user configuration for the identity of the user.”). The same motivation to combine Xia, Al Jazaery, and Rexach as set forth for Claim 8 applies equally to Claim 17.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery, further in view of Pi (US20180151054A1, hereinafter Pi).

Regarding Claim 9, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above; however, it does not explicitly teach wherein the camera is oriented to capture both the user and the hybrid valve.

Pi from the same or similar field of endeavor teaches wherein the camera is oriented to capture both the user and the hybrid valve (see [0173]; Pi: “FIG. 1(d) may depict an exemplary embodiment a pre-rinse phase. User 950 may wet hands beneath washer 210. Such user 950 conduct may be recorded by camera 215.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of the combination of Xia and Al Jazaery to include Pi's feature of the camera being oriented to capture both the user and the hybrid valve. Doing so would ensure actual user compliance with proper hand washing techniques.
(Pi, [0004])

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery, further in view of Dong (CN111451032A, hereinafter Dong; machine translation attached).

Regarding Claim 10, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above. Xia further teaches wherein the hybrid valve is a water tap (see Fig. 1 and [0014]; Xia: “the dispenser is in fluid connection with at least one source of hot water and one source of cold water”), and the water tap has a movable part attached to a spout (see [0112]; Xia: “The actuators are responsive to the control signal and the local input signals to dispense water from fluid dispenser 3 with predefined characteristics. The actuators, which are disposed at various locations on fluid dispenser 3 and surrounding elements, are electromechanical in nature to translate the electrical control and local input signals to a mechanical action.”) [the actuator reads on ‘the movable part’], and …is configured to orient the movable part toward the user when detecting a given indicator in a profile of the user stored in the controller (see [0152]; Xia: “Based on input user gestures and/or predefined user preferences, the water is passed through one of a number (three illustrated) treatment paths. Each treatment path applies different levels of filtering and/or processing to the input water. The appropriate treatment path is selected by an actuator valve 166 responsive to the user gestures and/or user preferences”), wherein the indicator is associated with an age or disability of the user (see [0114]; Xia: “the content may include advertising relevant to the one or more user preferences such as the user's age, sex or location.”).

However, the combination does not explicitly teach: the AI algorithm…

Dong from the same or similar field of endeavor teaches the AI algorithm… (see page 3, paragraph 12; Dong: “The intelligent home control terminal packages the first water outlet parameter to a regulation and control instruction used for controlling the shower head to conduct water outlet parameter regulation after analyzing and determining the most suitable first water outlet parameter for bathing and showering by the user based on various characteristic information of the user and utilizing big data, and sends the regulation and control instruction to the shower head so as to control the shower head to monitor and collect the second water outlet parameter in real time, and the first water outlet parameter packaged in the regulation and control instruction is correspondingly adjusted”. See page 12, paragraph 4; Dong: “Referring to fig. 3, in the second embodiment of the shower head regulation method of the present invention, the step S30 of determining the first water outlet parameter suitable for the user to shower at the ambient temperature based on the characteristic information includes: step S31, searching the shower outlet water parameters used by the user sample of the age and the gender at the environmental temperature based on big data; step S32, determining the found shower water outlet parameter as the first water outlet parameter for the user to shower. The method comprises the steps of analyzing and searching a user sample with the same age and sex as a current user based on characteristic information (age and sex) of the user by utilizing big data analysis and adopting a machine learning method, selecting an appropriate shower water outlet parameter for comfortable shower under the condition that the temperature of the environment where the current user is located is the same as that of the environment where the current user is located, and determining the found appropriate shower water outlet parameter as a first water outlet parameter for the current user to shower.”) [That is, the AI/machine learning receives the user's age in order to regulate/control the shower head/faucet.]

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of the combination of Xia and Al Jazaery to include Dong's feature of the AI algorithm. Doing so would adjust the parameters to the accurate optimum range that fits each user. (Dong, page 5, paragraph 4)

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery, further in view of Daansen et al. (US6375038B1, hereinafter Daansen).

Regarding Claim 11, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above. Xia further teaches further comprising: a microphone configured to collect a voice of the user (see [0117]; Xia: “System 1 is further adapted to receive voice commands issued by the user through a microphone 41 disposed on handle 9.”) and provide this data … for controlling the hybrid valve (see [0117]; Xia: “The voice control signal is sent to appropriate actuators to carry out predefined functions.”).

However, Xia does not explicitly teach: …to the AI algorithm…; a speaker configured to provide verbal commands to the user.
Al Jazaery from the same or similar field of endeavor teaches: …to the AI algorithm… (see [0050]; Al Jazaery: “the appliance control unit 107 further includes an image processing unit 115 which includes one or more machine learning models for analyzing the sequence of images (e.g., consecutive image frames of a video) from the one or more cameras 102, and provide motion gestures deduced from the image analysis performed on the images.” See [0026]: “As disclosed herein, using the camera makes it possible to control appliances with not only hands but also body language. It also makes it possible to control appliances, with not only hands, but also facial and head movement and expressions.” See [0006]: “This is also helpful in situations where the appliance is sensitive to disturbances cause by contact (e.g., a smart fish tank for sensitive or dangerous pets), and a user can control the appliance (e.g., setting internal environment, and release food or water to the pet, etc.) without direct contact with the appliance. This is also helpful in situations where the user does not want to touch the appliance's control panel because the user's hands are contaminated (e.g., the user's hands are web), and the user can control the appliance using motion gestures.”)

The same motivation to combine the combination of Xia and Al Jazaery as set forth for Claim 1 equally applies to Claim 11.

However, it does not explicitly teach: a speaker configured to provide verbal commands to the user.

Daansen from the same or similar field of endeavor teaches: a speaker configured to provide verbal commands to the user. (see column 14, lines 29-32; Daansen: “The speaker 140 provides the voice commands and any other audio output.
In the preferred embodiment the audio output consists of voice instructions on washing hands.”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of the combination of Xia and Al Jazaery to include Daansen’s features of a speaker configured to provide verbal commands to the user. Doing so would assist the user in proper washing techniques for compliance with recommended guidelines and allow monitoring of the number of usages. (Daansen, column 1, lines 14-18)

Claim(s) 12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery further in view of Liu et al. (US20180293873A1, hereinafter Liu).

Regarding Claim 12, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above. Al Jazaery further teaches wherein the controller hosts the AI algorithm… (see [0050]; Al Jazaery: “the appliance control unit 107 further includes an image processing unit 115 which includes one or more machine learning models for analyzing the sequence of images (e.g., consecutive image frames of a video) from the one or more cameras 102, and provide motion gestures deduced from the image analysis performed on the images. In some embodiments, the image processing unit 115 optionally include some components locally at the appliance 124, and some components remotely at the server 108.”)

The same motivation to combine the combination of Xia and Al Jazaery as set forth for Claim 1 equally applies to Claim 12.

However, it does not explicitly teach: and the controller is integrated with the hybrid valve.

Liu from the same or similar field of endeavor teaches and the controller is integrated with the hybrid valve.
(see [0088]; Liu: “the stationary controller may be integrated with the electronics of the dispenser.”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of the combination of Xia and Al Jazaery to include Liu’s features of the controller being integrated with the hybrid valve. Doing so would instruct a user as to proper hand hygiene, be cost-effective, and include user-friendly technology with accessible interventions. (Liu, [0090])

Claim(s) 13 and 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery further in view of Hartley et al. (US20150199883A1, hereinafter Hartley).

Regarding Claim 13, the combination of Xia and Al Jazaery teaches all the limitations of claim 1 above. Al Jazaery further teaches further comprising: …the AI algorithm (see [0050]; Al Jazaery: “the appliance control unit 107 further includes an image processing unit 115 which includes one or more machine learning models for analyzing the sequence of images (e.g., consecutive image frames of a video) from the one or more cameras 102, and provide motion gestures deduced from the image analysis performed on the images.” See [0026]: “As disclosed herein, using the camera makes it possible to control appliances with not only hands but also body language. It also makes it possible to control appliances, with not only hands, but also facial and head movement and expressions.” See [0006]: “This is also helpful in situations where the appliance is sensitive to disturbances cause by contact (e.g., a smart fish tank for sensitive or dangerous pets), and a user can control the appliance (e.g., setting internal environment, and release food or water to the pet, etc.) without direct contact with the appliance.
This is also helpful in situations where the user does not want to touch the appliance's control panel because the user's hands are contaminated (e.g., the user's hands are web), and the user can control the appliance using motion gestures.”)

The same motivation to combine the combination of Xia and Al Jazaery as set forth for Claim 1 equally applies to Claim 13.

However, it does not explicitly teach: a germ detection sensor configured to detect a presence of a germ on the user's hands and to provide such information to the …algorithm, wherein the …algorithm is configured to warn the user about the presence of the germs.

Hartley from the same or similar field of endeavor teaches a germ detection sensor configured to detect a presence of a germ on the user's hands (see [0012]; Hartley: “The system may incorporate one or more UV lamps that are configured to illuminate a person's hands and illuminate the pathogens, such as bacteria, on the person's hands. A camera may be used to capture images of the illuminated bacteria for immediate or later image analysis to quantify the amount and/or type of bacteria present.” See [0040]: “The camera is configured to cooperate with the UV lamps to capture images while the UV lamps are active, thus providing a way to visually assess the quantity and type of pathogens contained on a person's hands. This can be used as a useful indicator to assess the level of contamination and types of pathogens that are being distributed throughout the facility. The system 500 may include the ability to parse the captured images to automatically determine the quantity and type of contaminants contained on a person's hands before they are sanitized.”) [That is, the camera is used to quantify the amount and/or type of bacteria present.] and to provide such information to the …algorithm (see [0024]; Hartley: “The sensor also sends usage information to the microcontroller 102 which can store this data in memory storage 112.
The sanitizing solution is preferably any suitable anti-bacterial solution or gel capable of sanitizing a user's hands.”), wherein the …algorithm is configured to warn the user about the presence of the germs. (see [0013]; Hartley: “According to another embodiment, a method for encouraging hand sanitation and tracking compliance is provided, which includes detecting the proximity of a person, providing a notification to the person to sanitize their hands, detecting whether sanitizer was dispensed, after a predetermined period of time providing an alarm warning if sanitizer was not dispensed, and storing information about whether sanitizer was dispensed.” See [0037]: “If the dispenser has not been used after a predetermined time, an audible and/or visual alert plays to warn the person that they need to sanitize their hands.”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of the combination of Xia and Al Jazaery to include Hartley’s features of a germ detection sensor configured to detect a presence of a germ on the user's hands and to provide such information to the algorithm, wherein the algorithm is configured to warn the user about the presence of the germs. Doing so would prompt users to maintain clean hands when working in environments where clean and sanitized hands are vital. (Hartley, [0002])

Regarding Claim 18, the limitations of this claim are taught by the combination of Xia, Al Jazaery, and Hartley as discussed in connection with claim 13.

Claim(s) 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery further in view of Deivasigamani et al. (US20110042470A1, hereinafter Deivasigamani).
Regarding Claim 19, the combination of Xia and Al Jazaery teaches all the limitations of claim 14 above; however, it does not explicitly teach further comprising: delaying a turning on state of the hybrid valve based on a profile of the user stored in the controller.

Deivasigamani from the same or similar field of endeavor teaches further comprising: delaying a turning on state of the hybrid valve based on a profile of the user stored in the controller. (see [0005]; Deivasigamani: “It is an object of the present invention to provide a control system capable of managing false triggering by filtering out such detections (i.e. the discarding of entities that are not direct hot water users such as pets, insects, and the like).”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of the combination of Xia and Al Jazaery to include Deivasigamani’s features of delaying a turning on state of the hybrid valve based on a profile of the user stored in the controller. Doing so would provide a safety feature that helps protect heat-sensitive people such as children, the elderly, and the like from potential water burns. (Deivasigamani, Abstract)

Claim(s) 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Xia in view of Al Jazaery further in view of Matuchniak et al. (US20200401963A1, hereinafter Matuchniak).

Regarding Claim 20, the combination of Xia and Al Jazaery teaches all the limitations of claim 14 above; however, it does not explicitly teach wherein the AI algorithm is configured to learn from each user to predict a next movement or a time until the movement takes place.

Matuchniak from the same or similar field of endeavor teaches wherein the AI algorithm is configured to learn from each user to predict a next movement or a time until the movement takes place.
(see [0104]; Matuchniak: “For example, where messages 103 or 104 are sent with common language that indicates classification as raising a particular issues, the machine learning modeling layer may be applied to model potential responses to survey broadcasts 184 based on simulated or predicted movement or responses of the affected organizational users 102 or predicted status of organizational facilities. Those potential responses may then be compared with actual responses 186, for example movement of affected organizational users 102 based on geo-location signals transmitted by their associated mobile devices 139, and further actions can be decided upon accordingly.” See [0061]: “The present invention may also model metadata 164 to assess a response to a specific event to evaluate compliance with organizational policies, organizational duty of care, and/or compliance with regulatory or governmental directives. For example, if the users 102 are required to wash their hands every 20 minutes, and maintain a specified distance from all other individuals in the organization 106, the metadata 164 may indicate whether each user 102 has visited a washing station within 20 minutes of their last visit, and whether they have come closer than the specified distance to any other user 102.”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the teaching of the combination of Xia and Al Jazaery to include Matuchniak’s features of learning from each user to predict a next movement or a time until the movement takes place. Doing so would help organizations manage their risks in response to unexpected events, such as environmental or weather issues, civil disruptions, loss of infrastructure, and public health and safety events such as pandemics, through the exchange of trusted messages and analysis of information contained in such messages.
(Matuchniak, [0011])

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Smith (US20200209897A1) discloses using the image as an input to a neural network or other artificial intelligence-type system; the image processing system 218 may then determine whether any objects are present in the image based on the outputs from the neural network. Wang (US20190162426A1) discloses performing a specific gesture (for example, opening the palm) to control the smart water heater to discharge a stream, and then adjusting the water amount by a subsequent input gesture.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to VI N TRAN whose telephone number is (571) 272-1108. The examiner can normally be reached Mon-Fri 9:00-5:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ROBERT FENNEMA, can be reached at (571) 272-2748. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/V.N.T./
Examiner, Art Unit 2117

/ROBERT E FENNEMA/
Supervisory Patent Examiner, Art Unit 2117

Prosecution Timeline

Feb 14, 2023: Application Filed
Jun 12, 2025: Non-Final Rejection — §103
Sep 17, 2025: Response Filed
Dec 06, 2025: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12528200: LIGHT FOR TEACH PENDANT AND/OR ROBOT (granted Jan 20, 2026; 2y 5m to grant)
Patent 12523972: Event Engine for Building Management System Using Distributed Devices and Blockchain Ledger (granted Jan 13, 2026; 2y 5m to grant)
Patent 12525808: TIME-SHIFTING OPTIMIZATIONS FOR RESOURCE GENERATION AND DISPATCH (granted Jan 13, 2026; 2y 5m to grant)
Patent 12494653: CONTROLLING A HYBRID POWER PLANT (granted Dec 09, 2025; 2y 5m to grant)
Patent 12467818: DETECTING GAS LEAKS FROM IMAGE DATA AND LEAK DETECTION MODELS (granted Nov 11, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 46%
With Interview: 83% (+36.3%)
Median Time to Grant: 4y 1m
PTA Risk: Moderate
Based on 99 resolved cases by this examiner. Grant probability derived from career allow rate.
