DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
2. Applicant’s amendment filed on January 9, 2026, has been entered. Claims 1 and 8 have been amended. Claims 1-8 are pending in this application.
Response to Arguments
3. Applicant’s arguments with respect to claim(s) 1 and 8 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
4. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. Claim(s) 1-4, 6-8, and 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Yoganandan et al. (US 2021/0011604) in view of Rimon et al. (US 2011/0279397), and further in view of Ichihara (US 2016/0202840).
Regarding claim 1, Yoganandan discloses a touch device (Fig. 2; [0046], e.g., a touch system 200), comprising:
a touch pad ([0048]-[0049], e.g., the hover touch controller device 202), adapted to perform hover detection and touch detection on an object above/on the touch pad (e.g., perform hover detection by the proximity sensor and touch detection by the touch sensor);
a screen, adapted to display an operating interface ([0051]-[0054], e.g., the display 220 displays an interactive surface);
a control unit ([0049], [0055], e.g., the hover touch controller 214, the processor 206), electrically coupled to the touch pad, and configured to:
determine that the object is currently in a hover state or a touch state (Figs 4B and 4C; [0062]-[0064], e.g., a hover state or a touch state);
output hover detection information when it is determined that the object is currently in the hover state ([0055], [0064], e.g., output the three-dimensional position information, i.e., an (x, y, z) position for a hover point); and
output touch detection information when it is determined that the object is currently in the touch state ([0055], [0064], e.g., output the two-dimensional position information, i.e., an (x, y) position for a touch point); and
a processing unit ([0056], e.g., the processor 226), electrically coupled to the control unit and the screen, and configured to generate a position labeling pattern based on the hover detection information and display the position labeling pattern on the operating interface, and generate a touch input instruction adapted to control the operating interface based on the touch detection information (Figs 5-6; [0059], [0070]-[0072], e.g., displays the hover point 505 based on the hover detection information and displays the touch point 605 based on the touch detection information),
wherein a size of the position labeling pattern is associated with height information (Fig. 5; [0054], e.g., the radius of a circle associated with the cursor may correspond directly with the proximity of the interactive device to the touch surface and thereby its z value).
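For illustration only, the following minimal Python sketch (with assumed parameter names and values, not drawn from Yoganandan) shows one plausible reading of the cited teaching that the cursor radius corresponds to the z value, i.e., the circle shrinks as the object approaches the touch surface:

```python
# Illustrative sketch only; z_max, r_min, and r_max are assumed values.
def cursor_radius(z: float, z_max: float = 100.0,
                  r_min: float = 4.0, r_max: float = 40.0) -> float:
    """Return a circle radius that shrinks as the object approaches the pad."""
    z = max(0.0, min(z, z_max))                    # clamp z to the hover volume [0, z_max]
    return r_min + (r_max - r_min) * (z / z_max)   # radius scales directly with z
```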
Yoganandan further discloses wherein the position labeling pattern is associated with a color (Figs 4B-4C, 5 and 6; [0064], [0069]-[0072], e.g., the hover point 505 is represented by a green circle) and the touch pad comprises a capacitive sensor (see [0061]).
Yoganandan does not specifically disclose wherein the color of the position labeling pattern is associated with the height information, and wherein the touch pad comprises a plurality of sensing elements, wherein when a capacitance change value of one of the plurality of sensing elements with respect to its neighboring sensing elements is greater than a touch threshold, it is determined that the object is in the touch state, and wherein when a plurality of capacitance change values detected by the plurality of sensing elements are all less than the touch threshold, but one of the plurality of capacitance change values detected by at least one of the plurality of sensing elements is greater than a hover threshold, it is determined that the object is in the hover state.
However, Rimon discloses a touch device (Fig. 2A; [0062], e.g., the device 30 comprises a proximity sensor matrix 32) wherein a color of a position labeling pattern is associated with height information (Fig. 4; [0068]-[0069], [0071]-[0074], [0086], e.g., a color of a spot 403 is associated with height information of a fingertip from a sensing surface).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Rimon in the invention of Yoganandan for utilizing different colors (gray level or transparency level) to illustrate a position labeling pattern corresponding to different height values with respect to a sensing surface in order to provide improved targeting and controlling of a virtual feature or object (see [0086] of Rimon).
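For illustration only, the following minimal Python sketch (with assumed names and values, not drawn from Rimon) shows how a labeling pattern's color or transparency could be varied with the height information, consistent with the combination rationale above:

```python
# Illustrative sketch only; z_max and the gray/alpha mapping are assumptions.
def cursor_color(z: float, z_max: float = 100.0) -> tuple:
    """Return an (R, G, B, A) value that darkens and becomes more opaque near the pad."""
    z = max(0.0, min(z, z_max))
    alpha = int(255 * (1.0 - z / z_max))   # more opaque as the object approaches the surface
    gray = int(255 * (z / z_max))          # lighter gray level at greater heights
    return (gray, gray, gray, alpha)
```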
Yoganandan in view of Rimon does not specifically disclose wherein the touch pad comprises a plurality of sensing elements, wherein when a capacitance change value of one of the plurality of sensing elements with respect to its neighboring sensing elements is greater than a touch threshold, it is determined that the object is in the touch state, and wherein when a plurality of capacitance change values detected by the plurality of sensing elements are all less than the touch threshold, but one of the plurality of capacitance change values detected by at least one of the plurality of sensing elements is greater than a hover threshold, it is determined that the object is in the hover state.
However, Ichihara discloses a capacitive touch sensor (Fig. 1; [0022], e.g., 102) comprising a plurality of sensing elements (Figs 1-2; [0035], e.g., a plurality of sensing elements arranged in a two-dimensional array), wherein when a capacitance change value of one of the plurality of sensing elements with respect to its neighboring sensing elements is greater than a touch threshold, it is determined that an object is in a touch state (Figs 1-3; [0023], [0081], e.g., if the touch detection unit 106 detects a capacitance change value 104 of a sensing element with respect to its neighboring sensing elements is greater than the touch threshold 107, the finger is in the touch state), and wherein when a plurality of capacitance change values detected by the plurality of sensing elements are all less than the touch threshold, but one of the plurality of capacitance change values detected by at least one of the plurality of sensing elements is greater than a hover threshold, it is determined that the object is in the hover state ([0023], [0081], e.g., if the touch detection unit 106 detects a capacitance change value of at least one sensing element is greater than the proximity detection threshold 108, the finger is in the hover state).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Ichihara in the invention of Yoganandan in view of Rimon for detecting a change in capacitance of a sensing element with respect to its neighboring sensing elements and defining a touch threshold and a hover threshold in order to calculate coordinates of a touched position or a hovering position and to carry out a function corresponding to a user operation (see [0023] and [0025] of Ichihara).
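For illustration only, the following minimal Python sketch (with assumed names and threshold values, not drawn from Ichihara) shows the two-threshold determination mapped above, where each input value is a sensing element's capacitance change relative to its neighboring elements:

```python
# Illustrative sketch only; the threshold values are assumptions.
TOUCH_THRESHOLD = 100.0   # assumed capacitance-change threshold for a touch
HOVER_THRESHOLD = 20.0    # assumed lower threshold for a hover

def classify_state(capacitance_deltas) -> str:
    """Return 'touch', 'hover', or 'none' from per-element capacitance change values."""
    if any(delta > TOUCH_THRESHOLD for delta in capacitance_deltas):
        return "touch"    # at least one element exceeds the touch threshold
    if any(delta > HOVER_THRESHOLD for delta in capacitance_deltas):
        return "hover"    # all below the touch threshold, at least one above the hover threshold
    return "none"
```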
Regarding claim 2, Yoganandan further discloses the touch device according to claim 1, wherein the hover detection information comprises two-dimensional coordinate information and height information ([0049], [0064], e.g., provide three-dimensional (e.g., (x, y, z)) position information).
Regarding claim 3, Yoganandan further discloses the touch device according to claim 2, wherein a size of the position labeling pattern is associated with the height information (Fig. 5; [0054], e.g., the radius of a circle associated with the cursor may correspond directly with the proximity of the interactive device to the touch surface and thereby its z value).
Regarding claim 4, Yoganandan further discloses the touch device according to claim 1, wherein the touch detection information comprises two-dimensional coordinate information ([0048], [0064], e.g., provide two-dimensional (e.g., (x, y)) position information).
Regarding claim 6, Yoganandan further discloses the touch device according to claim 1, wherein the object is a finger ([0048]-[0049], e.g., the user’s finger).
Regarding claim 7, Yoganandan further discloses the touch device according to claim 1, wherein the touch pad is a capacitive touch pad ([0061], e.g., a projected capacitive sensor).
Regarding claim 8, Yoganandan discloses a touch method, applicable to a touch device (Fig. 2; [0046], e.g., a touch system 200), wherein the touch device comprises a touch pad ([0048]-[0049], e.g., the hover touch controller device 202) and a screen ([0051]-[0054], e.g., the display 220), the touch pad is adapted to perform hover detection and touch detection on an object above/on the touch pad (e.g., perform hover detection by the proximity sensor and touch detection by the touch sensor), and the screen is adapted to display an operating interface (e.g., displays interactive objects); and the touch method comprises:
determining that the object is currently in a hover state or a touch state (Figs 4B and 4C; [0062]-[0064], e.g., a hover state or a touch state);
outputting hover detection information when it is determined that the object is currently in the hover state ([0055], [0064], e.g., output the three-dimensional position information (x, y, z) for a hover point), and generating a position labeling pattern based on the hover detection information and displaying the position labeling pattern on the operating interface (Figs 5-6; [0059], [0070]-[0072], e.g., displays the hover point 505 based on the hover detection information); and
outputting touch detection information when it is determined that the object is currently in the touch state ([0055], [0064], e.g., output the two-dimensional position information (x, y) for a touch point), and generating a touch input instruction adapted to control the operating interface based on the touch detection information (Figs 5-6; [0059], [0070]-[0072], e.g., displays the touch point 605 based on the touch detection information),
wherein a size of the position labeling pattern is associated with height information (Fig. 5; [0054], e.g., the radius of a circle associated with the cursor may correspond directly with the proximity of the interactive device to the touch surface and thereby its z value).
Yoganandan further discloses wherein the position labeling pattern is associated with a color (Figs 4B-4C, 5 and 6; [0064], [0069]-[0072], e.g., the hover point 505 is represented by a green circle) and the touch pad comprises a capacitive sensor (see [0061]).
Yoganandan does not specifically disclose wherein the color of the position labeling pattern is associated with the height information, and wherein the touch pad comprises a plurality of sensing elements, wherein when a capacitance change value of one of the plurality of sensing elements with respect to its neighboring sensing elements is greater than a touch threshold, it is determined that the object is in the touch state, and wherein when a plurality of capacitance change values detected by the plurality of sensing elements are all less than the touch threshold, but one of the plurality of capacitance change values detected by at least one of the plurality of sensing elements is greater than a hover threshold, it is determined that the object is in the hover state.
However, Rimon discloses a touch device (Fig. 2A; [0062], e.g., the device 30 comprises a proximity sensor matrix 32) wherein a color of a position labeling pattern is associated with height information (Fig. 4; [0068]-[0069], [0071]-[0074], [0086], e.g., a color of a spot 403 is associated with height information of a fingertip from a sensing surface).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Rimon in the invention of Yoganandan for utilizing different colors (gray level or transparency level) to illustrate a position labeling pattern corresponding to different height values with respect to a sensing surface in order to provide improved targeting and controlling of a virtual feature or object (see [0086] of Rimon).
Yoganandan in view of Rimon does not specifically disclose wherein the touch pad comprises a plurality of sensing elements, wherein when a capacitance change value of one of the plurality of sensing elements with respect to its neighboring sensing elements is greater than a touch threshold, it is determined that the object is in the touch state, and wherein when a plurality of capacitance change values detected by the plurality of sensing elements are all less than the touch threshold, but one of the plurality of capacitance change values detected by at least one of the plurality of sensing elements is greater than a hover threshold, it is determined that the object is in the hover state.
However, Ichihara discloses a capacitive touch sensor (Fig. 1; [0022], e.g., 102) comprising a plurality of sensing elements (Figs 1-2; [0035], e.g., a plurality of sensing elements arranged in a two-dimensional array), wherein when a capacitance change value of one of the plurality of sensing elements with respect to its neighboring sensing elements is greater than a touch threshold, it is determined that an object is in a touch state (Figs 1-3; [0023], [0081], e.g., if the touch detection unit 106 detects a capacitance change value 104 of a sensing element with respect to its neighboring sensing elements is greater than the touch threshold 107, the finger is in the touch state), and wherein when a plurality of capacitance change values detected by the plurality of sensing elements are all less than the touch threshold, but one of the plurality of capacitance change values detected by at least one of the plurality of sensing elements is greater than a hover threshold, it is determined that the object is in the hover state ([0023], [0081], e.g., if the touch detection unit 106 detects a capacitance change value of at least one sensing element is greater than the proximity detection threshold 108, the finger is in the hover state).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Ichihara in the invention of Yoganandan in view of Rimon for detecting a change in capacitance of a sensing element with respect to its neighboring sensing elements and defining a touch threshold and a hover threshold in order to calculate coordinates of a touched position or a hovering position and to carry out a function corresponding to a user operation (see [0023] and [0025] of Ichihara).
Regarding claim 11, Yoganandan further discloses the touch method according to claim 8, wherein the step of determining that the object is currently in the hover state or the touch state comprises:
obtaining previous state information of the object (Fig. 4B; [0064]-[0065], e.g., obtain hover (x, y, z) information of the object);
determining a previous state of the object based on the previous state information (e.g., determine that the object is in the hover state when the object 420 has entered the hover volume 405 (z=100)); and
determining, when the previous state is the hover state, whether the object currently enters the touch state (Figs 4B-4C; [0064]-[0065], e.g., determine whether the object is in the touch state 410 (z=0)), and further determining, when it is determined that the object currently does not enter the touch state, whether the object is currently in the hover state ([0065], e.g., determine whether the object is currently in the hover state by determining whether the “HOVER_EXIT” state occurs).
Regarding claim 12, Yoganandan further discloses the touch method according to claim 8, wherein the step of determining that the object is currently in the hover state or the touch state comprises:
obtaining previous state information of the object (Fig. 4C; [0064]-[0065], e.g., obtain previous state information (x, y, z) of the object);
determining a previous state of the object based on the previous state information (e.g., determine that the object is in the touch state (z=0)); and
determining, when the previous state is the touch state, whether the object currently exits the touch state, and determining, when it is determined that the object currently exits the touch state, that the object is currently in the hover state ([0065], e.g., determining that the object currently exits the touch state and is currently in the hover state when the “TOUCH_UP” state occurs).
Regarding claim 13, Yoganandan further discloses the touch method according to claim 8, wherein the step of determining that the object is currently in the hover state or the touch state comprises:
obtaining previous state information of the object (Fig. 4A; [0064], e.g., there is no hover (no z data), no touch, and no output);
determining a previous state of the object based on the previous state information (e.g., determine that the object is outside of the hover volume 405); and
determining, when the previous state is an object-free state, whether the object is currently in the hover state (Fig. 4B; [0064]-[0065], e.g., determine that the object is in the hover state when the object 420 has entered the hover volume 405 (z=100)).
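For illustration only, the following minimal Python sketch (with assumed state and parameter names, not drawn from Yoganandan) traces the previous-state-driven determination recited in claims 11-13: from an object-free or hover state the method checks for entry into the touch state, and from the touch state an exit returns the object to the hover state:

```python
# Illustrative sketch only; the state names and the z conventions (touch at z=0,
# hover volume up to z=100) are assumptions modeled on the cited figures.
def next_state(previous_state: str, z, touch_height: float = 0.0,
               hover_height: float = 100.0) -> str:
    """Determine the current state from the previous state and the reported z value."""
    if z is None:                                        # no object detected above the pad
        return "none"
    if previous_state in ("none", "hover"):
        if z <= touch_height:                            # object reaches the surface (TOUCH_DOWN)
            return "touch"
        return "hover" if z <= hover_height else "none"  # inside or outside the hover volume
    if previous_state == "touch":
        if z <= touch_height:
            return "touch"
        return "hover" if z <= hover_height else "none"  # TOUCH_UP returns the object to hover
    return "none"
```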
6. Claim(s) 5 and 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Yoganandan et al. (US 2021/0011604) in view of Rimon et al. (US 2011/0279397) and Ichihara (US 2016/0202840), and further in view of Bertrand (US 2015/0091862).
Regarding claim 5, Yoganandan further discloses the touch device according to claim 1, wherein the touch pad is adapted to be operated in a high-sensitivity detection mode, wherein the high-sensitivity detection mode is that both hover detection and touch detection are performed on the object (Figs 4B and 4C; [0062]-[0064], e.g., detect both hover state and touch state).
Yoganandan in view of Rimon and Ichihara does not specifically disclose wherein the touch pad is adapted to be operated in a normal detection mode, wherein the normal detection mode is that only touch detection is performed on the object.
However, Bertrand discloses a touch device (Fig. 5; [0048], [0051], e.g., device 30) wherein a touch pad (e.g., touch sensor 38) is adapted to be operated in a normal detection mode, wherein the normal detection mode is that only touch detection is performed on the object ([0079], e.g., mode 4 only provides touch data).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Bertrand in the invention of Yoganandan in view of Rimon and Ichihara for performing only touch detection on an object by operating a touch pad in a normal detection mode so that the touch pad is able to operate in a high-sensitivity detection mode or it is only able to operate in the normal detection mode depending upon circumstances (see [0060] of Bertrand).
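For illustration only, the following minimal Python sketch (with assumed mode and field names, not drawn from Bertrand) contrasts a normal detection mode, in which only touch data are reported, with a high-sensitivity detection mode, in which both hover and touch data are reported:

```python
# Illustrative sketch only; mode names and the report format are assumptions.
from enum import Enum

class DetectionMode(Enum):
    NORMAL = "normal"                    # touch detection only
    HIGH_SENSITIVITY = "high"            # both hover detection and touch detection

def report(mode: DetectionMode, state: str, x: float, y: float, z: float):
    """Return the data a touch pad would output for the given state and mode."""
    if state == "touch":
        return {"type": "touch", "x": x, "y": y}
    if state == "hover" and mode is DetectionMode.HIGH_SENSITIVITY:
        return {"type": "hover", "x": x, "y": y, "z": z}
    return None                          # hover events are not reported in the normal mode
```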
Regarding claim 9, Yoganandan further discloses the touch method according to claim 8, wherein the touch pad is adapted to be operated in a high-sensitivity detection mode, wherein the high-sensitivity detection mode is that both hover detection and touch detection are performed on the object (Figs 4B and 4C; [0062]-[0064], e.g., detect both hover state and touch state).
Yoganandan in view of Rimon and Ichihara does not specifically disclose wherein the touch pad is adapted to be operated in a normal detection mode, wherein the normal detection mode is that only touch detection is performed on the object.
However, Bertrand discloses a touch device (Fig. 5; [0048], [0051], e.g., device 30) wherein a touch pad (e.g., touch sensor 38) is adapted to be operated in a normal detection mode, wherein the normal detection mode is that only touch detection is performed on the object ([0079], e.g., mode 4 only provides touch data).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Bertrand in the invention of Yoganandan in view of Rimon and Ichihara for performing only touch detection on an object by operating a touch pad in a normal detection mode so that the touch pad is able to operate in a high-sensitivity detection mode or it is only able to operate in the normal detection mode depending upon circumstances (see [0060] of Bertrand).
Regarding claim 10, Yoganandan further discloses the touch method according to claim 9, further comprising: determining whether the touch pad is operated in the high-sensitivity detection mode (Figs 4B and 4C; [0064], e.g., the touch pad is determined to be operated in the high-sensitivity detection mode by setting different signal thresholds (z=100, z=0) for hover detection and touch detection).
Conclusion
7. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HONG ZHOU whose telephone number is (571)270-5372. The examiner can normally be reached from 9:00 AM to 5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, BENJAMIN C LEE can be reached on 571-272-2963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HONG ZHOU/Primary Examiner, Art Unit 2629