DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-20 are pending in the instant application.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Claim Objections
Claims 10, 16 and 20 are objected to because of the following informalities:
Claims 10 and 20 recite “N milliseconds interval”. When using a variable “N”, the variable should be defined in the claim, for example: “N is a natural number” or “N is a positive number greater than 1”.
Claim 16, line 5, recites “the change in the second distance”. To correct the antecedent basis issue, the examiner suggests “a change in the second distance”.
Appropriate correction is required.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim 1 is rejected under 35 U.S.C. 102(a)(1) as being anticipated by Gao et al. (US 20150242053 A1, hereinafter referenced as Gao).
Regarding Claim 1, Gao teaches a method for detecting touch coordinates (see abstract, para. [0010]-[0011]. A method of correcting the position of a touch input), comprising:
measuring raw data for a plurality of touch nodes (see Fig. 2, Fig. 6, para. [0050]-[0051]. The touch panel 205 includes a plurality of touch sensors 206 arranged in a grid of columns (that is, aligned with the vertical orientation of FIG. 2) and rows (that is, aligned with the horizontal orientation of FIG. 2). As illustrated in FIG. 2, in this example there are ten actuated touch sensors 207 by this touch input, and each of these touch sensors 207 provides information (for example, one or more signals) indicative of its actuation) on a display panel (see Fig. 1, para. [0043]-[0046], para. [0048]. As illustrated in the embodiment of FIG. 1, the user interface 122 may include a display 140 and a touch screen subsystem 150);
generating touch coordinates based on the raw data (see para. [0046], para. [0051]. The touch detection module 152 may include instructions that when executed scan the area of the touch panel 142 for touch events and provide the coordinates of touch events to the processing module 154. Each of these touch sensors 207 provides information (for example, one or more signals) indicative of its actuation. The information may include the location of the touch sensor (for example, an x,y location) and the strength (e.g., amplitude or magnitude) of the touch input); and
correcting the touch coordinates (see Para. [0055]-[0061]. Adjusting the touch position by biasing actuated touch sensors near the edge of a touch panel to increase their strength information may result in improved accuracy of the touch position estimation both near the edge of the touch panel. To correct the touch position centroid estimation when incomplete touch sensor information is available, the bias may be mitigated or removed in accordance with a bias model), wherein the correcting the touch coordinates includes:
extending a first distance between a first reference line and the touch coordinates to a second distance (see Figs. 6-7, para. [0055]-[0060]. The area 605 indicates a bias region that extends from the columns of border sensors into the touch sensors 206 a certain distance. The first region 604 corresponds to incomplete touch sensor information for a touch input made near the edge of the touch panel 205, as indicated by the activated sensors 207. In this bias model 606, bias increases linearly the closer the touch input is to the edge of the touch panel 205. Accordingly, a determined centroid near the edge of the touch panel 205 can be adjusted, in either a horizontal or lateral direction (x-direction) of the touch panel 205 and/or a vertical or longitudinal direction (y-direction) of the touch panel 205 to mitigate or remove the bias resulting in an adjusted centroid position that more accurately determines the true centroid position of a touch input. As shown in FIG. 7, the bias region 605 extends from a border sensor 210a to some distance towards the interior of the touch panel 205. In some embodiments, the bias region 605 may extend 1 mm to 3 mm towards the interior of the touch panel 205 from the column of border sensors 210a); and
correcting the touch coordinates using the second distance (see para. [0055]-[0060], para. [0067], Figs. 6-7. To correct the touch position centroid estimation when incomplete touch sensor information is available, the bias may be mitigated or removed in accordance with a bias model. FIG. 7 illustrates one example of an embodiment of a bias model 606 that may be used to correct a touch position estimation that is based on a centroid of a touch input when incomplete touch sensor information is available. The first region 604 corresponds to incomplete touch sensor information for a touch input made near the edge of the touch panel 205, as indicated by the activated sensors 207. In this bias model 606, bias increases linearly the closer the touch input is to the edge of the touch panel 205. Accordingly, a determined centroid near the edge of the touch panel 205 can be adjusted, in either a horizontal or lateral direction (x-direction) of the touch panel 205 and/or a vertical or longitudinal direction (y-direction) of the touch panel 205 to mitigate or remove the bias resulting in an adjusted centroid position that more accurately determines the true centroid position of a touch input. While the illustration in FIG. 7 illustrates a bias model 606 to correct either a horizontal or a vertical x or y position of a determined (estimated) centroid of a touch input, a similar bias removal (or mitigation) process may be used to improve the accuracy of the centroid estimation of the touch position in both the horizontal and vertical directions x and y directions); and
wherein the second distance is calculated as a value corresponding to the first distance based on an offset value calculated according to a variation value of the raw data (see Figs. 6-7, para. [0058]-[0060]. Note that the bias model may be a two dimensional function that jointly models the bias in the x and y directions. Once a bias model is defined and identified, if an estimated position is within the bias region 605, the estimated position may be adjusted according to the bias model. This may also compensate for the bias and improve the touch position accuracy everywhere on the touch panel by defining where on a touch panel to implement a bias removal process. In some embodiments, both an original x coordinate and an original y coordinate of the original touch input may be used to estimate the bias and provide an improved estimated centroid position. In some embodiments, an original x coordinate or an original y coordinate of the original touch input may be used to estimate the bias and provide an improved estimated centroid position. As shown in FIG. 7, the bias region 605 extends from a border sensor 210a to some distance towards the interior of the touch panel 205. In some embodiments, the bias region 605 may extend 1 mm to 3 mm towards the interior of the touch panel 205 from the column of border sensors 210a. The width of the bias region 605 may be determined by simulation. In some embodiments, the bias model 606 shown in FIG. 7 is determined initially offline using measurements, calculations or numeric models based on expected finger size or the expected shape of the touch input. 
Once the bias model 606 is numerically calculated or estimated, it can be used to determine the bias for an estimated touch position and the estimated centroid of the touch position in both the x and y directions for touch inputs along the edge of the sensor and across the entire surface of the touch panel 205 may be improved by subtracting the bias from the estimated centroid position).
[Image: media_image1.png, 757 × 574, greyscale]
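For illustration of the correction mapped above, the linearly increasing edge bias described in the cited passages of Gao (bias largest at the panel edge, falling to zero at the interior boundary of the 1 mm to 3 mm bias region, subtracted from the estimated centroid) can be sketched as follows. The function name, the maximum bias magnitude, and the exact linear falloff are illustrative assumptions, not Gao's disclosure; Gao determines the bias model offline by simulation or measurement.

```python
def correct_edge_bias(centroid_x, panel_width, bias_region_mm=2.0, max_bias_mm=0.5):
    """Subtract a linearly decaying edge bias from an estimated centroid.

    Incomplete sensor information near a panel edge shifts the estimated
    centroid toward the interior; the correction pushes it back toward the
    nearer edge. All distances are in millimeters. max_bias_mm is an
    illustrative assumption.
    """
    dist_from_edge = min(centroid_x, panel_width - centroid_x)
    if dist_from_edge >= bias_region_mm:
        return centroid_x  # outside the bias region: no correction
    # Bias grows linearly as the touch approaches the edge
    bias = max_bias_mm * (1.0 - dist_from_edge / bias_region_mm)
    # Shift the centroid back toward whichever edge it is nearest
    if centroid_x < panel_width / 2:
        return centroid_x - bias
    return centroid_x + bias
```

A centroid estimated 1 mm from the left edge of a 100 mm panel would be moved 0.25 mm toward that edge; a centroid 5 mm from the edge lies outside the bias region and is left unchanged.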
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Gao (US 20150242053 A1) in view of Hwang et al. (US 20140160043 A1, hereinafter referenced as Hwang).
Regarding Claim 11, Gao teaches a display device (see Fig. 1), comprising:
a display panel (see Fig. 1, para. [0043]-[0046], para. [0048]. As illustrated in the embodiment of FIG. 1, the user interface 122 may include a display 140 and a touch screen subsystem 150) including a plurality of touch nodes for detecting touch input (see Fig. 2, Fig. 6, para. [0050]-[0051]. The touch panel 205 includes a plurality of touch sensors 206 arranged in a grid of columns (that is, aligned with the vertical orientation of FIG. 2) and rows (that is, aligned with the horizontal orientation of FIG. 2). As illustrated in FIG. 2, in this example there are ten actuated touch sensors 207 by this touch input);
a touch “controller” (see Fig. 1, touch screen subsystem 150, processing module 154, para. [0046]-[0049]. The processing module 154 may, in some embodiments, include instructions that when executed act as a touch screen controller (TSC)) configured to generate sensing values for the plurality of touch nodes as raw data according to reaction signals (see Figs. 1-6, para. [0050]-[0051]. As illustrated in FIG. 2, in this example there are ten actuated touch sensors 207 by this touch input, and each of these touch sensors 207 provides information (for example, one or more signals) indicative of its actuation); and
a microcontroller (see para. [0046]-[0049], para. [0073]. The processing module 154 can be a processor specifically configured for use with the touch screen subsystem 150 The processor may be any conventional processor, controller, microcontroller) configured to set at least one area on the display panel, to generate touch coordinates for touch nodes whose touch detection value included in the raw data (see Figs. 1-6, para. [0046]-[0047], para. [0051], para. [0056], para. [0065], Fig. 6. The touch detection module 152 may include instructions that when executed scan the area of the touch panel 142 for touch events and provide the coordinates of touch events to the processing module 154. Each of these touch sensors 207 provides information (for example, one or more signals) indicative of its actuation. The information may include the location of the touch sensor (for example, an x,y location) and the strength (e.g., amplitude or magnitude) of the touch input), to extend a first distance between a first reference line determining the at least one area and the touch coordinates to a second distance (see Figs. 6-7, para. [0055]-[0060]. The area 605 indicates a bias region that extends from the columns of border sensors into the touch sensors 206 a certain distance. The first region 604 corresponds to incomplete touch sensor information for a touch input made near the edge of the touch panel 205, as indicated by the activated sensors 207. In this bias model 606, bias increases linearly the closer the touch input is to the edge of the touch panel 205. Accordingly, a determined centroid near the edge of the touch panel 205 can be adjusted, in either a horizontal or lateral direction (x-direction) of the touch panel 205 and/or a vertical or longitudinal direction (y-direction) of the touch panel 205 to mitigate or remove the bias resulting in an adjusted centroid position that more accurately determines the true centroid position of a touch input. As shown in FIG. 
7, the bias region 605 extends from a border sensor 210a to some distance towards the interior of the touch panel 205. In some embodiments, the bias region 605 may extend 1 mm to 3 mm towards the interior of the touch panel 205 from the column of border sensors 210a), and to correct the touch coordinates using the second distance (see para. [0055]-[0061], para. [0067], Figs. 6-7. To correct the touch position centroid estimation when incomplete touch sensor information is available, the bias may be mitigated or removed in accordance with a bias model. FIG. 7 illustrates one example of an embodiment of a bias model 606 that may be used to correct a touch position estimation that is based on a centroid of a touch input when incomplete touch sensor information is available. The first region 604 corresponds to incomplete touch sensor information for a touch input made near the edge of the touch panel 205, as indicated by the activated sensors 207. In this bias model 606, bias increases linearly the closer the touch input is to the edge of the touch panel 205. Accordingly, a determined centroid near the edge of the touch panel 205 can be adjusted, in either a horizontal or lateral direction (x-direction) of the touch panel 205 and/or a vertical or longitudinal direction (y-direction) of the touch panel 205 to mitigate or remove the bias resulting in an adjusted centroid position that more accurately determines the true centroid position of a touch input. While the illustration in FIG. 7 illustrates a bias model 606 to correct either a horizontal or a vertical x or y position of a determined (estimated) centroid of a touch input, a similar bias removal (or mitigation) process may be used to improve the accuracy of the centroid estimation of the touch position in both the horizontal and vertical directions x and y directions),
wherein the second distance is calculated as a value corresponding to the first distance based on an offset value calculated according to a variation value of the raw data (see Figs. 6-7, para. [0058]-[0060]. Note that the bias model may be a two dimensional function that jointly models the bias in the x and y directions. Once a bias model is defined and identified, if an estimated position is within the bias region 605, the estimated position may be adjusted according to the bias model. This may also compensate for the bias and improve the touch position accuracy everywhere on the touch panel by defining where on a touch panel to implement a bias removal process. In some embodiments, both an original x coordinate and an original y coordinate of the original touch input may be used to estimate the bias and provide an improved estimated centroid position. In some embodiments, an original x coordinate or an original y coordinate of the original touch input may be used to estimate the bias and provide an improved estimated centroid position. As shown in FIG. 7, the bias region 605 extends from a border sensor 210a to some distance towards the interior of the touch panel 205. In some embodiments, the bias region 605 may extend 1 mm to 3 mm towards the interior of the touch panel 205 from the column of border sensors 210a. The width of the bias region 605 may be determined by simulation. In some embodiments, the bias model 606 shown in FIG. 7 is determined initially offline using measurements, calculations or numeric models based on expected finger size or the expected shape of the touch input. 
Once the bias model 606 is numerically calculated or estimated, it can be used to determine the bias for an estimated touch position and the estimated centroid of the touch position in both the x and y directions for touch inputs along the edge of the sensor and across the entire surface of the touch panel 205 may be improved by subtracting the bias from the estimated centroid position).
[Image: media_image1.png, 757 × 574, greyscale]
Gao does not explicitly disclose the touch controller corresponds to a touch driver configured to supply touch driving signals to the plurality of touch nodes and to generate sensing values according to reaction signals generated in response to the touch driving signals; and generate touch coordinates for touch nodes whose touch detection value included in the raw data is equal to or greater than a threshold value preset for each of the at least one area.
However, Hwang teaches the touch “controller” corresponds to a touch driver configured to supply touch driving signals to the plurality of touch nodes and to generate sensing values for the plurality of touch nodes as raw data according to reaction signals generated in response to the touch driving signals (see Fig. 1, para. [0031], para. [0033]-[0037]. The touch sensing circuit 30 includes a Tx driver 32, an Rx sensing unit 34. The touch screen driving circuit supplies a driving signal to the touch sensors Cts and senses a change amount of charges of the touch sensors Cts. The touch sensing circuit 30 includes a Tx driver 32, an Rx sensing unit 34, a timing generation unit 38, etc. The touch sensing circuit 30 applies the driving signal to the touch sensors Cts through the Tx lines Tx1 to Txj using the Tx driver 32 and senses the change amount of charges of the touch sensors Cts in synchronization with the driving signal through the Rx lines Rx1 to Rxi and the Rx sensing unit 34, thereby outputting touch raw data. The touch sensing circuit 30 may be integrated into one readout integrated circuit (ROIC)); and a microcontroller configured to set at least one area on the display panel, to generate touch coordinates for touch nodes whose touch detection value included in the raw data is equal to or greater than a threshold value preset for each of the at least one area (see para. [0031], para. [0037]. The algorithm execution unit 36 performs the previously determined touch coordinate algorithm and compares the touch raw data received from the touch sensing circuit 30 with a previously determined threshold value. The touch coordinate algorithm decides the touch raw data equal to or greater than the threshold value as data of a touch input area and calculates coordinates of each touch input area. The algorithm execution unit 36 may be implemented as a microcontroller unit (MCU)).
Gao and Hwang are both related to touch screens; thus, one of ordinary skill in the art, before the effective filing date of the claimed invention, would have recognized the obviousness of modifying the touch circuit of Gao to include a touch driver and to generate touch coordinates for touch nodes whose touch detection value included in the raw data is equal to or greater than a threshold value, since doing so would aid in determining coordinates by using only relevant touch sensing data (touch raw data equal to or greater than the threshold value) as data of a touch input area.
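For illustration of the thresholding behavior attributed to Hwang's algorithm execution unit 36 (raw data equal to or greater than a previously determined threshold is treated as touch-input data, from which coordinates are calculated), a minimal sketch follows. The weighted-centroid coordinate computation and the data layout are illustrative assumptions; Hwang's cited paragraphs describe only the threshold comparison and coordinate calculation in general terms.

```python
def touch_coordinates(raw, threshold):
    """Compute touch coordinates from nodes at or above a preset threshold.

    raw: dict mapping (x, y) node positions to sensed values (raw data).
    Returns a strength-weighted centroid of the qualifying nodes, or None
    when no node meets the threshold (no touch input detected).
    """
    # Keep only raw data equal to or greater than the threshold
    touched = {pos: v for pos, v in raw.items() if v >= threshold}
    if not touched:
        return None
    total = sum(touched.values())
    x = sum(px * v for (px, _), v in touched.items()) / total
    y = sum(py * v for (_, py), v in touched.items()) / total
    return (x, y)
```

Filtering before the centroid computation uses only relevant touch sensing data, consistent with the stated motivation to combine.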
Allowable Subject Matter
Claims 2-9, 12-15 and 17-19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Claims 10, 16 and 20 would be allowable if rewritten to overcome the objections set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IVELISSE MARTINEZ QUILES whose telephone number is (571)270-7618. The examiner can normally be reached Monday through Friday, 1:00 PM to 5:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Temesghen Ghebretinsae can be reached at 571-272-3017. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/IM/Examiner, Art Unit 2626
/TEMESGHEN GHEBRETINSAE/Supervisory Patent Examiner, Art Unit 2626 3/23/26