Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement filed on 1/18/2025 is acknowledged by the examiner.
Status of Claims
Amendment to the Claims was filed on 11/14/2024.
Claims 27-29, 35, 37-68, and 70-109 were canceled.
Claims 1-26, 30-34, 36, and 69 are currently pending.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-14, 18-25, 30-34, and 69 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Verma et al. US PG PUB 20230333663 (“Verma”).
Regarding Claim 1, Verma discloses a touchless plumbing fixture (105; para 0047 “FIG. 11 shows a control assembly 505 in a series configuration, according to an exemplary embodiment. In various embodiments, the control assembly 505 may be similar or equivalent to the control assembly 205, where elements 510-552 of the control assembly 505 are respectively equivalent to elements 210-252 of the control assembly 205. Accordingly, the control assembly 505 receives input from the control unit 510, which a user may interact with to change one or more operating conditions of the plumbing system 100.”) comprising: a discharge outlet (ann. fig. 11) with a passageway (552, see ann. fig. 11) to conduct water in fluid communication with a mixing valve (525, para 0036 “the valve and stepper motor 225 is a mixing valve configured to adjust a ratio of hot water to cold water”), the mixing valve (525) in fluid communication with a cold water source (517) and a hot water source (515)(para 0036 “the control assembly 205 may be configured such that hot and cold water (from hot and cold water inlets 215, 217) may exit the respective diverts 220 and 220 and mix first at the valve and stepper motor 225, wherein the valve and stepper motor 225 is a mixing valve configured to adjust a ratio of hot water to cold water”), the mixing valve (525) is operably coupled to one or more motors (525, 527); the mixing valve (525) to independently control water flow rate (valve and stepper motor 527 controls the water flow; para 0036 “A flow of the mixed water may then be subsequently regulated by the valve and stepper motor 227 in the same line”) and water temperature (valve and stepper motor 525 controls the water temperature, para 0036); and an electrically operable valve (solenoid valve 543, para 0048 and ann. fig. 11) in fluid communication with the passageway to conduct water, positioned between the mixing valve (525) and discharge outlet (ann. fig. 11)(see ann. fig.
11 for position of solenoid valve 543 and mixing valve 525 along the water flow lines); a sensor (584, para 0048 “one or more receiving devices 584 may be configured to receive one or more inputs from an input source (e.g., user, user device, other control device, etc.) and communicate the one or more inputs to the one or more controllers 580. In various embodiments, the one or more receiving devices 584 may include a plurality of sensors”) with a field of detection for detecting a user control object (hand, see figs. 14A-14F and flow chart fig. 13) in three dimensional sensory space (para 0052 “ Advantageously, by including multiple pairs of TOF sensors 587, 588, 590, and 591 arranged as shown in FIG. 12A, the control unit 510 can determine the location of the user’s hand along multiple orthogonal dimensions of three-dimensional space”); and a computer system (510, specifically mother board controller 580) operably coupled to the sensor (584)(para 0052 “ By combining measurements from the TOF sensors 587, 588, 590, and 591 and/or the PAJ sensor 595, the control unit 510 can also determine the location of the user’s hand along a third dimension orthogonal to both the first and second dimensions (e.g., perpendicular to the front surface of the control unit 510). This allows the control unit 510 to determine where and when the user’s hand is moving in three-dimensional space to enable a variety of different types of gestures (e.g., linear gestures, two-dimensional gestures, three-dimensional gestures, etc.) 
to be detected and used for control purposes”), to the one or more motors (525, 527), to the electrically operable valve (543), and one or more electronic components (sensors in sensor pad 584, display 599 on sensor pad 584, valves, motors, etc.), and comprising a searching mode that uses the sensor to capture an image (hand gesture), recognize a user's hands as a control object, find the control object, create a 3D model of the control object, apply reference points to the 3D model, and then determine an axial position for the reference points on the 3D model, and analyze the axial positions of the reference points to recognize a plurality of gestures as commands (para 0056, “If the one or more controllers 580 determine that motion was detected, the one or more controllers 580 may determine a gesture associated with the motion based on inputs received from the PAJ sensor 595 and/or the TOF sensors 587, 588, 590, 591 in an operation 615. The one or more controllers 580 may process the gesture based on at least one of a direction, a proximity, or a velocity associated with the gesture as determined by the TOF sensors 587, 588, 590, 591, and as a result, determine an operational function (e.g., a flow rate, a temperature, etc.) associated with the control assembly 505 that corresponds to the gesture (operation 620). In an operation 625, the one or more controllers 580 may then transmit one or more control signals to one or more components within the control assembly 505 (e.g., the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, the solenoid valve 543, etc.) to change or adjust an operational state thereof. Finally, after the one or more controllers 580 has changed or adjusted the operational state of one or more components within the control assembly 505, the control unit 510 may then deactivate in an operation 630 until another input is received by the PIR sensor 593.”).
VERMA – ANNOTATED FIGURE 11
Regarding Claim 2, Verma discloses the electrically operable valve (543) is a solenoid valve (para 0048).
Regarding Claim 3, Verma discloses the solenoid valve (543) enables or disables the flow of water through the plumbing fixture (105) independently (para 0036) of the setting of the mixing valve (525, para 0036).
Regarding Claim 4, Verma discloses the one or more electronic components are selected from a group consisting of one or more motors (525, 527, 543, para 0036), one or more sensors (584 with sensors 587, 588, 590, and 591, para 0052) with one or more fields of detection (3D, para 0052), one or more electrically operable valves (525, 527, 543), an electronic display (599), and combinations thereof.
Regarding Claim 5, Verma discloses the one or more electronic components further comprises an electronic display (599) that presents information about water temperature, water flow rate, an active state, a mode, or combinations thereof to the user (para 0055 “In various embodiments, as shown in FIG. 12B, the control unit 510 may also include a display (“screen”) 599 that is communicably coupled with the one or more controllers 580, where the screen 599 is configured to display (i.e., provide a visible representation or illustration of) an operational state associated with one or more components within the control assembly 505 and/or the plumbing assembly 105.”).
Regarding Claim 6, Verma discloses the computer system (510) further comprises an electronic control board (motherboard 580) that sends and receives signals to and from the one or more electronic components (para 0048, “As shown, the control unit 510 may be configured such it is communicably coupled to at least one of the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, or a solenoid valve 543. The control unit 510 may include one or more controllers 580 (shown as a motherboard in FIG. 11), which may be in communication with one or more processors and memories, where the one or more controllers 580 are configured to receive signals from at least one of an electrical input 584, a receiving device 585, or a power source 575 (shown as a battery in FIG. 11) and send one or more control signals to at least one of the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, or a solenoid valve 543 in response.”).
Regarding Claim 7, Verma discloses the user control object is a hand or a pair of hands (para 0050 “For example, measurements taken by the PAJ sensor 595 and the TOF sensors 587, 588 at different times can be used to determine the location of the user’s hand at each time. A change in location can then be divided by the amount of time elapsed to determine the velocity of the user’s hand”).
Regarding Claim 8, Verma discloses the plurality of gestures are selected from a group consisting of hand gestures, other body gestures, and various combinations thereof (para 0042 “To control water through the plumbing assembly 105, a user may place their hand 300 (or other body part) above the sensory surface 260, as shown in FIGS. 6A-6D.”).
Regarding Claim 9, Verma discloses the plurality of gestures are selected from a group consisting of a water temperature gesture, a water flow rate gesture, a gesture to activate and deactivate the electrically operable valve, and combinations thereof (para 0056 “ In an operation 625, the one or more controllers 580 may then transmit one or more control signals to one or more components within the control assembly 505 (e.g., the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, the solenoid valve 543, etc.) to change or adjust an operational state thereof.”).
Regarding Claim 10, Verma discloses a second sensor (593) with a secondary field of detection (in front of or near the control unit 510, para 0056; while para 0057 discloses the first sensors detect movement in axial directions “In other embodiments, such as shown in FIG. 14C, the gesture 705 may include a linear motion in a first direction and aligned with the first axis 503 of the control unit 510, which may be sensed by the TOF sensors 587, 588. Similarly, the gesture 705 may include a linear motion in a second direction and aligned with the first axis 503 of the control unit 510, as shown in FIG. 14D, where the TOF sensors 587, 588 may sense and characterize the gesture 705 based on its direction and velocity. In various embodiments, such as shown in FIG. 14E, the gesture 705 may include a linear motion in a first direction and aligned with the second axis 504 of the control unit 510, which may be sensed by the TOF sensors 590, 591. Similarly, the gesture 705 may include a linear motion in a second direction and aligned with the second axis 504 of the control unit 510, as shown in FIG. 14F, where the TOF sensors 590, 591 may sense and characterize the gesture 705 based on its direction and velocity.”).
Regarding Claim 11, Verma discloses the computer system is preconfigured to switch between an active mode (activate in an operational state 620) and a standby mode (deactivated in operation 630) and when the computer system (580) is in the standby mode and the second sensor (593) detects the control object in the secondary field of detection the computer system will switch to the active mode (para 0056, “In an operation 625, the one or more controllers 580 may then transmit one or more control signals to one or more components within the control assembly 505 (e.g., the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, the solenoid valve 543, etc.) to change or adjust an operational state thereof. Finally, after the one or more controllers 580 has changed or adjusted the operational state of one or more components within the control assembly 505, the control unit 510 may then deactivate in an operation 630 until another input is received by the PIR sensor 593.”).
Regarding Claim 12, Verma discloses when the second sensor (593) does not detect the user in the secondary field of detection after a predefined period of time, the computer system (580) will switch to the standby mode (para 0054 “The control unit 510 may also include one or more passive infrared (PIR) sensors 593, which may be configured to detect a proximity of a user’s hand in front of the control unit 510. Accordingly, the control unit 510 may be configured to activate in response to the one or more PIR sensors 593 detecting a presence of a user’s hand within a predetermined threshold.”).
Regarding Claim 13, Verma discloses the computer system (580) is preconfigured to switch between an active mode (operational 620) and a standby mode (deactivate 630) and when the computer system is in the standby mode and the sensor (593) detects the control object in the field of detection the computer system switches to the active mode (para 0054 “The control unit 510 may also include one or more passive infrared (PIR) sensors 593, which may be configured to detect a proximity of a user’s hand in front of the control unit 510. Accordingly, the control unit 510 may be configured to activate in response to the one or more PIR sensors 593 detecting a presence of a user’s hand within a predetermined threshold.”).
Regarding Claim 14, Verma discloses when the sensor (593) does not detect the user in the field of detection after a predefined period of time, the computer system switches to the standby mode (para 0056, “ If the one or more controllers 580 determine that no motion was detected in an operation 607, the operation 610 may repeat or the control unit 510 may deactivate until another input is received by the PIR sensor 593.”).
Regarding Claim 18, Verma discloses the mixing valve (525) is a mixing chamber (vertical chamber in the Y-shaped three-way connector).
Regarding Claim 19, Verma discloses a temperature sensor (535, thermocouple) is operably coupled to the computer system (510, mother board 580) that is preconfigured with an anti-freeze mode to prevent the water from freezing (para 0045).
Regarding Claim 20, Verma discloses water temperature is measured by output signals from the temperature sensor (535) while the computer system (580) is not in operation and when the water temperature falls below a preconfigured threshold then the computer system begins the anti-freeze mode and activates the electrically operable valve (para 0045 “ the control system 200 may be configured to operate within one or more predetermined temperature and/or flow rate ranges to prevent injury to a user (e.g., by limiting a maximum temperature of the water through the plumbing assembly 105), prevent water waste (e.g., by limiting a maximum flow rate through the plumbing assembly 105), prevent plumbing freezes (e.g., by maintaining a constant flow rate or initiating periodic water flow through the plumbing assembly 105).”).
Regarding Claim 21, Verma discloses while in the anti-freeze mode the computer system (580) continually monitors the water temperature (sensor 535) by measuring the output signals of the temperature sensor (535) and when the water temperature is above the preconfigured threshold then the computer system ends the anti-freeze mode and deactivates the electrically operable valve (para 0045, “ the control system 200 may be configured to operate within one or more predetermined temperature and/or flow rate ranges to prevent injury to a user (e.g., by limiting a maximum temperature of the water through the plumbing assembly 105), prevent water waste (e.g., by limiting a maximum flow rate through the plumbing assembly 105), prevent plumbing freezes (e.g., by maintaining a constant flow rate or initiating periodic water flow through the plumbing assembly 105).”).
Regarding Claim 22, Verma discloses the discharge outlet is a spout (ann. fig. 11).
Regarding Claim 23, Verma discloses the plumbing fixture is selected from a group consisting of sink, a shower, a tub, a fountain, and combinations thereof (para 0032).
Regarding Claim 24, Verma discloses the plumbing fixture is a faucet (ann. fig. 11 and para 0032).
Regarding Claim 25, Verma discloses the computer system (580) is configured to recognize user control objects (hands) and gestures by analyzing images captured by the sensor (TOF sensors 587, 588, 590, 591, and PAJ sensor 595) from a particular vantage point to computationally represent a portion of the user control object from the plurality of user control objects as one or more mathematically represented 3D surfaces (3D, para 0052)(para 0056, “If the one or more controllers 580 determine that motion was detected, the one or more controllers 580 may determine a gesture associated with the motion based on inputs received from the PAJ sensor 595 and/or the TOF sensors 587, 588, 590, 591 in an operation 615. The one or more controllers 580 may process the gesture based on at least one of a direction, a proximity, or a velocity associated with the gesture as determined by the TOF sensors 587, 588, 590, 591, and as a result, determine an operational function (e.g., a flow rate, a temperature, etc.) associated with the control assembly 505 that corresponds to the gesture (operation 620). In an operation 625, the one or more controllers 580 may then transmit one or more control signals to one or more components within the control assembly 505 (e.g., the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, the solenoid valve 543, etc.) to change or adjust an operational state thereof. Finally, after the one or more controllers 580 has changed or adjusted the operational state of one or more components within the control assembly 505, the control unit 510 may then deactivate in an operation 630 until another input is received by the PIR sensor 593.”).
Regarding Claim 30, Verma discloses the computer system continuously operates in searching mode until the axial position of the image matches a preconfigured gesture stored in the computer system and wherein, when the image that matches the preconfigured gesture becomes an active gesture the computer system switches to a matching mode (para 0056, “A method 600 carried out by the control unit 510 for controlling one or more operating states of the control assembly 505 is depicted in FIG. 13. In a first operation 605, the control unit 510 may initialize in response to the PIR sensor 593 detecting a user’s hand in front of or near the control unit 510. Responsive to initializing in the operation 605, the one or more controllers 580 may determine whether or not motion (i.e., of the user’s hand) was detected based on inputs received from one or more of the PAJ sensor 595 and/or the TOF sensors 587, 588, 590, 591 (operation 610). If the one or more controllers 580 determine that no motion was detected in an operation 607, the operation 610 may repeat or the control unit 510 may deactivate until another input is received by the PIR sensor 593. If the one or more controllers 580 determine that motion was detected, the one or more controllers 580 may determine a gesture associated with the motion based on inputs received from the PAJ sensor 595 and/or the TOF sensors 587, 588, 590, 591 in an operation 615. The one or more controllers 580 may process the gesture based on at least one of a direction, a proximity, or a velocity associated with the gesture as determined by the TOF sensors 587, 588, 590, 591, and as a result, determine an operational function (e.g., a flow rate, a temperature, etc.) associated with the control assembly 505 that corresponds to the gesture (operation 620).
In an operation 625, the one or more controllers 580 may then transmit one or more control signals to one or more components within the control assembly 505 (e.g., the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, the solenoid valve 543, etc.) to change or adjust an operational state thereof. Finally, after the one or more controllers 580 has changed or adjusted the operational state of one or more components within the control assembly 505, the control unit 510 may then deactivate in an operation 630 until another input is received by the PIR sensor 593”).
Regarding Claim 31, Verma discloses when in the matching mode, the computer system (580) interfaces with the sensor to continuously capture a new image, as a current image, after a predefined period of time and compares the current image with a previously captured image, as a prior image (para 0053, “ When a user’s hand is detected within the detection region, the control unit 510 may record the locations of the user’s hand over time and match the detected locations and times to a gesture stored in the database.”).
Regarding Claim 32, Verma discloses the current image is one most recently captured and the prior image is the one immediately preceding the current image and for each image, the computer system (580) finds a control object (user's hand), creates a 3D model of the control object, applies reference points to the 3D model, determines the axial position for the reference points on the 3D model, and confirms the axial position matches the active gesture (para 0053 “the control unit 510 is capable of detecting and distinguishing between gestures defined not only by the spatial locations of the user’s hand, but also by its velocity or other metrics that incorporate a time element (e.g., speed, acceleration, etc.). As such, gestures can be defined in up to four dimensions including the three spatial dimensions and a time dimension. For example, the control unit 510 can be configured to distinguish between a quick movement of the user’s hand and a slow movement of the user’s hand along the same path and may map quick movement to a first gesture and slow movement to a second gesture. The first gesture may trigger the control unit 510 to perform a first control action, whereas the second gesture may trigger the control unit 510 to perform a second control action. In some embodiments, the control unit 510 stores a database of various gestures, which can be defined or characterized by a time series of locations of the user’s hand in up to three dimensions to map out a path in up to three-dimensional space and may define the speed that the user’s hand moves along the path.”).
Regarding Claim 33, Verma discloses the computer system (580) determines the difference between the axial positions of the current image and the prior image and then correlates that difference into a change in the state of the plumbing fixture recognized by the active gesture (para 0053 “ the control unit 510 is capable of detecting and distinguishing between gestures defined not only by the spatial locations of the user’s hand, but also by its velocity or other metrics that incorporate a time element (e.g., speed, acceleration, etc.). As such, gestures can be defined in up to four dimensions including the three spatial dimensions and a time dimension. For example, the control unit 510 can be configured to distinguish between a quick movement of the user’s hand and a slow movement of the user’s hand along the same path and may map quick movement to a first gesture and slow movement to a second gesture. The first gesture may trigger the control unit 510 to perform a first control action, whereas the second gesture may trigger the control unit 510 to perform a second control action. In some embodiments, the control unit 510 stores a database of various gestures, which can be defined or characterized by a time series of locations of the user’s hand in up to three dimensions to map out a path in up to three-dimensional space and may define the speed that the user’s hand moves along the path.”).
Regarding Claim 34, Verma discloses the computer system continuously operates in the matching mode until parameters to end matching mode are met and then the computer system will switch to the searching mode (para 0056, “If the one or more controllers 580 determine that motion was detected, the one or more controllers 580 may determine a gesture associated with the motion based on inputs received from the PAJ sensor 595 and/or the TOF sensors 587, 588, 590, 591 in an operation 615. The one or more controllers 580 may process the gesture based on at least one of a direction, a proximity, or a velocity associated with the gesture as determined by the TOF sensors 587, 588, 590, 591, and as a result, determine an operational function (e.g., a flow rate, a temperature, etc.) associated with the control assembly 505 that corresponds to the gesture (operation 620). In an operation 625, the one or more controllers 580 may then transmit one or more control signals to one or more components within the control assembly 505 (e.g., the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, the solenoid valve 543, etc.) to change or adjust an operational state thereof. Finally, after the one or more controllers 580 has changed or adjusted the operational state of one or more components within the control assembly 505, the control unit 510 may then deactivate in an operation 630 until another input is received by the PIR sensor 593.”).
Regarding Claim 36, Verma discloses a system for touchless control of water flow rate and temperature comprising (105, para 0047, “FIG. 11 shows a control assembly 505 in a series configuration, according to an exemplary embodiment. In various embodiments, the control assembly 505 may be similar or equivalent to the control assembly 205, where elements 510-552 of the control assembly 505 are respectively equivalent to elements 210-252 of the control assembly 205. Accordingly, the control assembly 505 receives input from the control unit 510, which a user may interact with to change one or more operating conditions of the plumbing system 100.”): a discharge outlet (ann. fig. 11) with a passageway to conduct water (552, see ann. fig. 11) in fluid communication with a mixing valve (525, para 0036 “the valve and stepper motor 225 is a mixing valve configured to adjust a ratio of hot water to cold water”); the mixing valve (525) in fluid communication with a cold water source (517) and a hot water source (515)(para 0036 “the control assembly 205 may be configured such that hot and cold water (from hot and cold water inlets 215, 217)”) and operably coupled to one or more motors (525, 527) for independent control of water flow rate (527) and water temperature (525); an electrically operable valve (solenoid valve 543, para 0048 and ann. fig. 11) in fluid communication with the passageway to conduct water (ann. fig. 11), positioned between the mixing valve (525) and the discharge outlet (ann. fig.
11); a sensor (PAJ sensor 595, TOF sensors 587, 588, 590, 591) with a field of detection for detecting a user control object in three dimensional sensory space (3D, para 0052); and a computer system (580) operably coupled to the sensor (PAJ sensor 595, TOF sensors 587, 588, 590, 591), to the one or more motors (525, 527), to the electrically operable valve (543), and one or more electronic components and comprising a searching mode that uses the sensor to capture an image, find a control object, create a 3D model of the control object, apply reference points to the 3D model, and then determine an axial position for the reference points on the 3D model, and analyze the axial positions of the reference points to recognize a plurality of gestures as commands (para 0056, “If the one or more controllers 580 determine that motion was detected, the one or more controllers 580 may determine a gesture associated with the motion based on inputs received from the PAJ sensor 595 and/or the TOF sensors 587, 588, 590, 591 in an operation 615. The one or more controllers 580 may process the gesture based on at least one of a direction, a proximity, or a velocity associated with the gesture as determined by the TOF sensors 587, 588, 590, 591, and as a result, determine an operational function (e.g., a flow rate, a temperature, etc.) associated with the control assembly 505 that corresponds to the gesture (operation 620). In an operation 625, the one or more controllers 580 may then transmit one or more control signals to one or more components within the control assembly 505 (e.g., the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, the solenoid valve 543, etc.) to change or adjust an operational state thereof.
Finally, after the one or more controllers 580 has changed or adjusted the operational state of one or more components within the control assembly 505, the control unit 510 may then deactivate in an operation 630 until another input is received by the PIR sensor 593.”).
Regarding Claim 69, Verma discloses a touchless hand tracking faucet fixture (105; para 0047 “FIG. 11 shows a control assembly 505 in a series configuration, according to an exemplary embodiment. In various embodiments, the control assembly 505 may be similar or equivalent to the control assembly 205, where elements 510-552 of the control assembly 505 are respectively equivalent to elements 210-252 of the control assembly 205. Accordingly, the control assembly 505 receives input from the control unit 510, which a user may interact with to change one or more operating conditions of the plumbing system 100.”), comprising: a spout (ann. fig. 11) with a passageway to conduct water in fluid communication with a mixing valve (525); the mixing valve (525) in fluid communication with one or more water sources (para 0036 “the control assembly 205 may be configured such that hot and cold water (from hot and cold water inlets 215, 217) may exit the respective diverts 220 and 220 and mix first at the valve and stepper motor 225, wherein the valve and stepper motor 225 is a mixing valve configured to adjust a ratio of hot water to cold water”) and operably coupled to one or more motors to independently control water flow rate (valve and stepper motor 527 controls the water flow; para 0036 “A flow of the mixed water may then be subsequently regulated by the valve and stepper motor 227 in the same line”) and water temperature (valve and stepper motor 525 controls the water temperature, para 0036); an electrically operable valve (solenoid valve 543, para 0048 and ann. fig. 11) in fluid communication with the passageway to conduct water and positioned between the mixing valve (525) and the spout (ann. fig.
11); a sensor (PAJ sensor 595 and/or the TOF sensors 587, 588, 590, 591) with a field of detection for detecting a user control object in three-dimensional sensory space (3D, para 0052); and a computer system (580) operably coupled to the sensor, to the one or more motors (525, 527), and to the electrically operable valve (543), and one or more electronic components (para 0056, “In an operation 625, the one or more controllers 580 may then transmit one or more control signals to one or more components within the control assembly 505 (e.g., the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, the solenoid valve 543, etc.) to change or adjust an operational state thereof.”) and comprising a searching mode that uses the sensor to capture an image (hand gesture), recognize a user's hands as a control object, find the control object, create a 3D model of the control object, apply reference points to the 3D model, and then determine an axial position for the reference points on the 3D model, and analyze the axial positions of the reference points to recognize a plurality of gestures as commands (para 0056, “If the one or more controllers 580 determine that motion was detected, the one or more controllers 580 may determine a gesture associated with the motion based on inputs received from the PAJ sensor 595 and/or the TOF sensors 587, 588, 590, 591 in an operation 615. The one or more controllers 580 may process the gesture based on at least one of a direction, a proximity, or a velocity associated with the gesture as determined by the TOF sensors 587, 588, 590, 591, and as a result, determine an operational function (e.g., a flow rate, a temperature, etc.) associated with the control assembly 505 that corresponds to the gesture (operation 620).
In an operation 625, the one or more controllers 580 may then transmit one or more control signals to one or more components within the control assembly 505 (e.g., the first valve and stepper motor 525, the second valve and stepper motor 527, the thermocouple 535, the flowmeter 540, the solenoid valve 543, etc.) to change or adjust an operational state thereof. Finally, after the one or more controllers 580 has changed or adjusted the operational state of one or more components within the control assembly 505, the control unit 510 may then deactivate in an operation 630 until another input is received by the PIR sensor 593.”).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 15-17 are rejected under 35 U.S.C. 103 as being unpatentable over Verma et al. US PG PUB 20230333663 (“Verma”) in view of Rodenbeck US 10698429 (“Rodenbeck”).
Regarding Claim 15, Verma discloses the claimed invention, except the sensor is one or more digital cameras.
Rodenbeck teaches a sensor is a digital camera (digital image sensors, col. 36 lines 22-29).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have substituted the sensors, as disclosed by Verma, with digital camera sensors, as taught by Rodenbeck, for the purpose of using exemplary sensing or viewing devices.
Regarding Claim 16, Verma discloses the claimed invention, except the sensor is one or more infrared cameras.
Rodenbeck teaches a sensor that is an infrared camera (col. 36 lines 22-29).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have substituted the sensors, as disclosed by Verma, with an infrared camera, as taught by Rodenbeck, for the purpose of using an alternative means to detect the presence of an object in a region.
Regarding Claim 17, Verma discloses the claimed invention, except one or more electronic components is an audio sensor operably coupled to the computer system for detecting audio commands or an audio device for providing audio feedback of command recognition to the user.
Rodenbeck teaches one or more electronic components is an audio sensor operably coupled to the computer system for detecting audio commands or an audio device for providing audio feedback of command recognition to the user (col. 41 lines 46-53).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the user input devices and controller, as disclosed by Verma, by including an audio device (microphone, controller, etc.), as taught by Rodenbeck, for the purpose of allowing a user to use voice commands relating to tasks including “water on”, “water off”, “temperature increase”, “temperature decrease”, “wash dishes”, “wash hands”, “wash vegetables”, “cold water”, and “hot water”.
Claim 26 is rejected under 35 U.S.C. 103 as being unpatentable over Verma et al. US PG PUB 20230333663 (“Verma”) in view of Horowitz et al. US 10380795 (“Horowitz”).
Regarding Claim 26, Verma discloses each 3D surface (3D, para 0052) corresponding to a cross-section of the portion of the user control object (hands).
Verma discloses the claimed invention, except the portion of the user control object is recognized from a plurality of edge points of the portion of the user control object in the image, tangent lines extending from the sensor to at least two edge points of the plurality of edge points, a centerline corresponding to the tangent lines, or combinations thereof, to reconstruct, or shape fit, the user control object in 3D space.
Horowitz teaches the portion of the user control object is recognized from a plurality of edge points of the portion of the user control object in the image, tangent lines extending from the sensor to at least two edge points of the plurality of edge points, a centerline corresponding to the tangent lines, or combinations thereof, to reconstruct, or shape fit, the user control object in 3D space (col. 6 lines 33-61).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the 3D modeling, as disclosed by Verma, by using image data from digital image sensors to determine a distance between a user's hand and a digital sensor, as taught by Horowitz, for the purpose of building a 3D model of a moving object.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Daphne Barry whose telephone number is (571)272-9966 and fax number is (571) 273-9966. The examiner can normally be reached on Monday through Friday 9 AM-6 PM (eastern).
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisors, Kenneth Rinehart and Craig Schneider, can be reached at (571) 272-4881 and (571) 272-3607, respectively. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from Patent Center and the Private Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from Patent Center or Private PAIR. Status information for unpublished applications is available through Patent Center and Private PAIR to authorized users only. Should you have questions about access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.
/DAPHNE M BARRY/Primary Examiner, Art Unit 3753