Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 09/01/2023 and 10/13/2023 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Drawings
2. The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Claim 1 recites "one or more control circuits," but Fig. 2, Item 106 shows only one control circuit. The drawings must be corrected to show the "one or more control circuits," or the feature(s) must be canceled from the claim(s). No new matter should be entered.
Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Status
Claims 1-25 are pending.
Claim Rejections - 35 USC § 102
3. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
3 Claims 1-8, 12, and 15-25 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Poupyrev et al. (US 20180310659 A1).
4. Regarding claim 1, Poupyrev discloses a sensor system, comprising:
a flexible substrate having a first surface (FIGS. 1-5, Item 102 or 220; FIG. 3 shows top surface 102-top in Paragraphs [0065] and [0086]) and an opposite, second surface (FIG. 3 shows bottom surface 102-bottom in Paragraph [0065]), the flexible substrate defining a longitudinal direction and a lateral direction (FIGS. 1-5, Item 102);
a plurality of first sensing lines (FIGS. 1-5, Item 202; conductive threads 202-1 and 202-2 in Paragraph [0065]) extending in the longitudinal direction of the flexible substrate (FIGS. 1-5, Item 102) and being substantially parallel (threads 202-3 and 202-4 are substantially parallel with threads 202-1 and 202-2 in Paragraphs [0065] and [0087]) and spaced apart in the lateral direction, wherein each of the plurality of first sensing lines (threads 202-1 and 202-2 in Paragraph [0065]) is coupled to the first surface of the flexible substrate (FIG. 3, top surface 102-top in Paragraph [0065]) within a sensing region having a length in the longitudinal direction;
a plurality of second sensing lines (FIGS. 1-5, Item 202; threads 202-3 and 202-4 in Paragraph [0065]) extending in the longitudinal direction of the flexible substrate and substantially parallel (Paragraphs [0065] and [0087]) with the plurality of first sensing lines (threads 202-1 and 202-2 in Paragraph [0065]), wherein at least a first portion of each of the plurality of second sensing lines is exposed on the first surface of the flexible substrate (FIG. 3, top surface 102-top in Paragraph [0065]) within the sensing region and at least a second portion of each of the plurality of second sensing lines is exposed on the second surface of the flexible substrate (FIGS. 1-5, Item 102) within the sensing region; and
one or more control circuits (FIGS. 1-5, Item 210, 212, or 106; microprocessor 212 may analyze the touch-input data to generate one or more control signals, which may then be communicated to computing device 106 (e.g., a smart phone); Paragraph [0063]) configured to:
detect, based on signals (sensing circuitry 210 uses the change in capacitance to identify the presence of the object, detecting which horizontal conductive thread 202 and which vertical conductive thread 202 is touched by detecting changes in capacitance of each respective conductive thread 202; Paragraph [0074]) from at least two of the plurality of first sensing lines (threads 202-1 and 202-2 in Paragraph [0065]) and at least two of the plurality of second sensing lines (threads 202-3 and 202-4 in Paragraph [0065]), that a user has made a swipe gesture along the first surface in the longitudinal direction (touch-input to conductive thread 202 may be used to generate touch data usable to control computing device 106; for example, the touch-input can be used to determine various gestures, such as single-finger touches (e.g., touches, taps, and holds), multi-finger touches (e.g., two-finger touches, two-finger taps, two-finger holds, and pinches), and single-finger and multi-finger swipes (e.g., swipe up, swipe down, swipe left, swipe right); Paragraph [0061]).
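The grid-based swipe determination quoted above (capacitance changes locating touches on crossing conductive threads, Paragraphs [0061] and [0074]) can be illustrated with a minimal sketch. This sketch is not part of Poupyrev's disclosure or the claims of record; the function name, thread indexing, and travel threshold are assumptions added for illustration only.

```python
def classify_swipe(touch_points, min_travel=2):
    """Classify a swipe from an ordered sequence of touch positions
    on a conductive-thread grid.

    touch_points: list of (x, y) grid coordinates, where x indexes
    vertical threads (lateral direction) and y indexes horizontal
    threads (longitudinal direction).
    Returns "swipe up", "swipe down", "swipe left", "swipe right",
    or None if the touch travels fewer than min_travel threads.
    """
    if len(touch_points) < 2:
        return None
    # Net displacement between first and last detected intersections.
    dx = touch_points[-1][0] - touch_points[0][0]
    dy = touch_points[-1][1] - touch_points[0][1]
    if max(abs(dx), abs(dy)) < min_travel:
        return None  # too little travel to call it a swipe
    # The dominant axis decides longitudinal vs. lateral.
    if abs(dy) >= abs(dx):
        return "swipe down" if dy > 0 else "swipe up"
    return "swipe right" if dx > 0 else "swipe left"
```

For example, a touch crossing successive horizontal threads at a fixed lateral position, `classify_swipe([(1, 0), (1, 2), (1, 4)])`, is classified as a longitudinal swipe.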
5. Regarding claim 2, Poupyrev discloses the sensor system of claim 1, wherein the signals from at least two of the plurality of first sensing lines (threads 202-1 and 202-2 in Paragraph [0065]) and at least two of the plurality of second sensing lines (threads 202-3 and 202-4 in Paragraph [0065]) describe a user touch input directed to the sensing region of the flexible substrate (when an object, such as a user's finger, touches conductive thread 202, the position of the touch can be determined by sensing circuitry 210 by detecting a change in capacitance on the grid or array of conductive thread 202; Paragraph [0061]).
6. Regarding claim 3, Poupyrev discloses the sensor system of claim 2, wherein the one or more control circuits (FIGS. 1-5, Item 210, 212, or 106; Paragraph [0061]) are configured to detect, based on signals from at least two of the plurality of first sensing lines (threads 202-1 and 202-2 in Paragraph [0065]) and at least two of the plurality of second sensing lines (threads 202-3 and 202-4 in Paragraph [0065]), that the user has made the swipe gesture along the first surface in the longitudinal direction (the touch-input can be used to determine various gestures, such as single-finger touches, multi-finger touches, and single-finger and multi-finger swipes (e.g., swipe up, swipe down, swipe left, swipe right); Paragraph [0061]) by:
determining, based on the signals, a movement direction of the user touch input, the movement direction having a longitudinal component with respect to the longitudinal direction and a lateral component with respect to the lateral direction (in operation, sensing circuitry 210 can determine positions of touch-input on the grid of conductive thread 202 using self-capacitance sensing or projective capacitive sensing; Paragraph [0072]).
7. Regarding claim 4, Poupyrev discloses the sensor system of claim 3, wherein the one or more control circuits (sensing circuitry 210 can determine positions of touch-input on the grid of conductive thread 202 using self-capacitance sensing or projective capacitive sensing; Paragraph [0072]) are configured to determine, based on the signals, the movement direction by:
determining whether the movement direction matches a predetermined movement pattern associated with at least one of a lateral swipe gesture or a longitudinal swipe gesture (computing device 106 can determine a dynamic direction of a touch or swipe (e.g., up, down, left, right), a frequency of the touch, and, with regard to holds, an area of the grid of conductive thread 202; Paragraph [0080]).
8. Regarding claim 5, Poupyrev discloses the sensor system of claim 4, wherein the one or more control circuits (Paragraph [0072]) are configured to determine that the movement direction matches the predetermined movement pattern associated with at least one of the lateral swipe gesture or the longitudinal swipe gesture (computing device 106 can determine a dynamic direction of a touch or swipe (e.g., up, down, left, right) and a frequency of the touch; Paragraph [0080]) by:
identifying a first lateral swipe gesture based at least in part on the movement direction indicating that the user touch input crosses multiple sensing lines in a first lateral direction (computing device 106 can be implemented to recognize a number of the touches, swipes, or holds (e.g., a single tap, a double tap, or a triple tap); Paragraph [0080]); and
identifying a second lateral swipe gesture based at least in part on the movement direction indicating that the user touch input crosses multiple sensing lines in a second lateral direction (computing device 106 can be implemented to recognize a variety of different types of gestures, such as touches, taps, swipes, holds, and covers made to interactive textile 102; to recognize the various different types of gestures, the computing device can be configured to determine a duration of the touch, swipe, or hold (e.g., one second or two seconds); Paragraph [0080]).
9. Regarding claim 6, Poupyrev discloses the sensor system of claim 5, wherein the one or more control circuits (Paragraph [0072]) are configured to determine that the movement direction matches the predetermined movement pattern associated with at least one of the lateral swipe gesture or the longitudinal swipe gesture (computing device 106 can determine a dynamic direction of a touch or swipe (e.g., up, down, left, right); Paragraph [0080]) by:
identifying a first longitudinal swipe gesture based at least in part on the movement direction indicating that the user touch input moves along at least one sensing line in a first longitudinal direction (Paragraph [0080]); and
identifying a second longitudinal swipe gesture based at least in part on the movement direction indicating that the user touch input moves along at least one sensing line in a second longitudinal direction (with regard to holds, computing device 106 can also determine an area of the grid of conductive thread 202; Paragraph [0080]).
10. Regarding claim 7, Poupyrev discloses the sensor system of claim 3, wherein the flexible substrate (FIGS. 1-5, Item 102 or 220; Paragraphs [0065] and [0086]) includes a first sensing subregion (threads 202-1 and 202-2 in Paragraph [0065]) and a second sensing subregion (threads 202-3 and 202-4 in Paragraph [0065]).
11. Regarding claim 8, Poupyrev discloses the sensor system of claim 7, wherein a respective second sensing line (threads 202-3 and 202-4 in Paragraph [0065]) is coupled to the first surface (FIGS. 1-5, Item 102, top surface) of the flexible substrate at the first sensing subregion (threads 202-1 and 202-2 in Paragraph [0065]) of the flexible substrate and coupled to the second surface (FIGS. 1-5, Item 102, bottom surface) of the flexible substrate (Paragraphs [0065] and [0086]) at the second sensing subregion (threads 202-3 and 202-4 in Paragraph [0065]) of the flexible substrate.
12. Regarding claim 12, Poupyrev discloses the sensor system of claim 7, wherein the one or more control circuits (microprocessor 212 may analyze the touch-input data to generate one or more control signals, which may then be communicated to computing device 106 (e.g., a smart phone); Paragraph [0063]) are configured to determine, based on the signals, the movement direction of the user touch input by:
detecting movement from a first position at a first time step to a second position at a second time step, the first position including a first longitudinal position and a first lateral position, the second position including a second longitudinal position and a second lateral position (to recognize the various different types of gestures, the computing device can be configured to determine a duration of the touch, swipe, or hold (e.g., one second or two seconds); Paragraph [0080]); and
determining whether a longitudinal gesture or a lateral gesture has been performed based on the movement from the first position to the second position (computing device 106 can determine a dynamic direction of a touch or swipe (e.g., up, down, left, right), a frequency of the touch, and, with regard to holds, an area of the grid of conductive thread 202; Paragraph [0080]).
13. Regarding claim 15, Poupyrev discloses the sensor system of claim 3, wherein the plurality of first sensing lines (threads 202-1 and 202-2 in Paragraph [0065]) are coupled to the first surface of the flexible substrate (FIG. 3, top surface 102-top in Paragraph [0065]) along the length of the sensing region; and
the one or more control circuits (microprocessor 212 may analyze the touch-input data to generate one or more control signals, which may then be communicated to computing device 106 (e.g., a smart phone); Paragraph [0063]) are configured to determine whether the user touch input is associated with the first surface (102-top) or the second surface (102-bottom) based at least in part on the signals generated in response to the user touch input by the plurality of second sensing lines (threads 202-3 and 202-4 in Paragraph [0065]).
14. Regarding claim 16, Poupyrev discloses the sensor system of claim 3, wherein the one or more control circuits (microprocessor 212 may analyze the touch-input data to generate one or more control signals, which may then be communicated to computing device 106 (e.g., a smart phone); Paragraph [0063]) are configured to:
determine, based on the movement direction of the user touch input, that the user touch input is associated with one or more gestures (computing device 106 can determine a dynamic direction of a touch or swipe (e.g., up, down, left, right), a frequency of the touch, and, with regard to holds, an area of the grid of conductive thread 202; Paragraph [0080]); and
initiate one or more actions based at least in part on determining that the user touch input is associated with one or more gestures (control signals may be communicated to computing device 106 (e.g., a smart phone) via the network interface 216 to cause the computing device 106 to initiate a particular functionality; Paragraph [0063]).
15. Regarding claim 17, Poupyrev discloses the sensor system of claim 16, wherein at least one action of the one or more actions includes switching the sensor system from a first user context to a second user context (computing device 106 can determine a dynamic direction of a touch or swipe (e.g., up, down, left, right), a frequency of the touch, and, with regard to holds, an area of the grid of conductive thread 202; Paragraph [0080]).
16. Regarding claim 18, Poupyrev discloses the sensor system of claim 1, wherein the sensor system is integrated with a wearable device (computing devices can be wearable (e.g., computing spectacles and smart watches), non-wearable but mobile (e.g., laptops and tablets), or relatively immobile (e.g., desktops and servers); Paragraph [0055]).
17. Regarding claim 19, Poupyrev discloses the sensor system of claim 1, wherein the sensor system is a capacitive touch sensor (the conductive threads 202 may form a single series of parallel threads; for instance, in one embodiment, the capacitive touch sensor; Paragraph [0069]).
18. Regarding claim 20, Poupyrev discloses the sensor system of claim 1, wherein the sensing region of the flexible substrate (FIGS. 1-5, Item 102 or 220; Paragraphs [0065] and [0086]) is divided in the longitudinal direction into a plurality of sensing subregions (threads 202-1 and 202-2 in Paragraph [0065]), the plurality of sensing subregions being free of overlap with each other in the longitudinal direction, and wherein only a single respective sensing portion of the plurality of second sensing lines (threads 202-3 and 202-4 in Paragraph [0065]) is exposed within each respective sensing subregion (Paragraph [0065]).
19. Regarding claim 21, Poupyrev discloses the sensor system of claim 20, wherein each sensing portion of the plurality of second sensing lines (threads 202-3 and 202-4 in Paragraph [0065]) is free of overlap in the longitudinal direction with sensing portions of neighboring second sensing lines of the plurality of second sensing lines (Paragraph [0065]).
20. Regarding claim 22, Poupyrev discloses the sensor system of claim 1, wherein the plurality of first sensing lines (threads 202-1 and 202-2 in Paragraph [0065]) are approximately equally spaced apart in the lateral direction.
21. Regarding claim 23, Poupyrev discloses the sensor system of claim 1, wherein each of the plurality of second sensing lines (threads 202-3 and 202-4 in Paragraph [0065]) is arranged between a respective pair of first sensing lines such that the first sensing lines (threads 202-1 and 202-2 in Paragraph [0065]) alternate with the second sensing lines (threads 202-3 and 202-4 in Paragraph [0065]) in the lateral direction.
22. Regarding claim 24, Poupyrev discloses a computer-implemented method of determining a user gesture, comprising:
detecting, from at least two of a plurality of first sensing lines (threads 202-1 and 202-2 in Paragraph [0065]) and at least two of a plurality of second sensing lines (threads 202-3 and 202-4 in Paragraph [0065]), signals describing a user touch input directed to a sensing region of a flexible substrate, the flexible substrate (FIGS. 1-5, Item 102 or 220) having a first surface (FIG. 3, top surface 102-top in Paragraphs [0065] and [0086]) and an opposite, second surface (FIG. 3, bottom surface 102-bottom in Paragraphs [0065] and [0086]), the flexible substrate defining a longitudinal direction and a lateral direction (multiple vertical and horizontal conductive threads 202; for example, a single touch with a single finger may generate the coordinates X1,Y1 and X2,Y1; Paragraph [0076]), the plurality of first sensing lines extending in the longitudinal direction and being substantially parallel (threads 202-3 and 202-4 are substantially parallel with threads 202-1 and 202-2 in Paragraphs [0065] and [0087]) and spaced apart in the lateral direction, wherein each of the plurality of first sensing lines is coupled to the first surface of the flexible substrate (FIG. 3, top surface 102-top) within a sensing region having a length in the longitudinal direction, the plurality of second sensing lines extend in the longitudinal direction of the flexible substrate (FIGS. 1-5, Item 102) and substantially parallel with the plurality of first sensing lines, wherein at least a portion of each of the plurality of second sensing lines is exposed on the first surface (102-top) of the flexible substrate within the sensing region and at least a second portion of each of the plurality of second sensing lines is exposed on the second surface (102-bottom) of the flexible substrate within the sensing region; and
determining, based on the signals, that a user has made a swipe gesture (the touch-input can be used to determine various gestures, such as single-finger touches (e.g., touches, taps, and holds), multi-finger touches (e.g., two-finger touches, two-finger taps, two-finger holds, and pinches), and single-finger and multi-finger swipes (e.g., swipe up, swipe down, swipe left, swipe right); Paragraph [0061]) along the first surface in the longitudinal direction (sensing circuitry 210 uses the change in capacitance to identify the presence of the object, detecting which horizontal conductive thread 202 and which vertical conductive thread 202 is touched by detecting changes in capacitance of each respective conductive thread 202; Paragraph [0074]).
23. Regarding claim 25, Poupyrev discloses an interactive object, comprising:
a touch sensor (FIGS. 1-5, Item 100 includes an interactive textile 102, which is shown as being integrated within various interactive objects 104; interactive textile 102 is a textile that is configured to sense multi-touch-input; Paragraph [0052]) comprising a plurality of conductive sensing lines (conductive threads 202 in Paragraph [0065]) integrated with a flexible substrate (FIGS. 1-5, Item 102 or 220; Paragraphs [0065] and [0086]), the plurality of conductive sensing lines comprising a first conductive sensing line (threads 202-1 and 202-2 in Paragraph [0065]) coupled to a first surface (FIG. 3, top surface 102-top in Paragraphs [0065] and [0086]) of the flexible substrate at a first sensing subregion and a second sensing subregion of the flexible substrate and a second conductive sensing line (threads 202-3 and 202-4 in Paragraph [0065]) coupled to the first surface (102-top) of the flexible substrate at the first sensing subregion and a second surface (102-bottom) of the flexible substrate at the second sensing subregion; and
one or more control circuits (microprocessor 212 may analyze the touch-input data to generate one or more control signals, which may then be communicated to computing device 106 (e.g., a smart phone); Paragraph [0063]) configured to:
obtain touch data associated with a touch input to the touch sensor (sensing circuitry 210 uses the change in capacitance to identify the presence of the object, detecting which horizontal conductive thread 202 and which vertical conductive thread 202 is touched by detecting changes in capacitance of each respective conductive thread 202; Paragraph [0074]), the touch data indicative of a respective response to the touch input by the plurality of conductive sensing lines (Paragraph [0065]); and
determine whether the touch input is associated with the first sensing subregion or the second sensing subregion of the flexible substrate (Paragraphs [0065] and [0086]) based at least in part on the respective response to the touch input by the plurality of conductive sensing lines (the touch-input can be used to determine various gestures, such as single-finger touches, multi-finger touches, and single-finger and multi-finger swipes (e.g., swipe up, swipe down, swipe left, swipe right); Paragraph [0061]).
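The subregion determination recited in claim 25 (resolving which longitudinal sensing subregion a touch falls in) can be illustrated with a minimal sketch. This sketch is not part of the record; the function name, boundary representation, and values are hypothetical assumptions for illustration only.

```python
def subregion_of(y, boundaries):
    """Return the index of the sensing subregion containing
    longitudinal position y.

    boundaries: ascending upper edges of each non-overlapping
    subregion along the sensing-region length (hypothetical values).
    """
    for i, upper in enumerate(boundaries):
        if y < upper:
            return i
    # Position beyond the last boundary falls in a final subregion.
    return len(boundaries)
```

For example, with hypothetical boundaries `[2.0, 5.0, 8.0]`, a touch at longitudinal position 3.5 resolves to subregion index 1, i.e., the second subregion.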
Allowable Subject Matter
Claims 9-11 and 13-14 would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), 2nd paragraph, set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
The following is an examiner’s statement of reasons for allowance:
Regarding claim 9, the prior art of record, taken alone or in combination, fails to teach or suggest "the sensor system of claim 8, wherein the longitudinal component is based, at least in part, on one or more signals from the respective second sensing line for at least one of the first sensing subregion or the second sensing subregion of the flexible substrate," in combination with all the other elements of claim 9.
Claims 10-11 would also be allowable as they further limit claim 9.
Regarding claim 13, the prior art of record, taken alone or in combination, fails to teach or suggest "the sensor system of claim 7, wherein: a respective first sensing line is coupled to the first surface of the flexible substrate at the first sensing subregion, the second sensing subregion, and the third sensing subregion of the flexible substrate; and
a respective second sensing line is coupled to the second surface of the flexible substrate at the first sensing subregion and the third sensing subregion of the flexible substrate and is coupled to the first surface at the second sensing subregion of the flexible substrate," in combination with all the other elements of claim 13.
Regarding claim 14, the prior art of record, taken alone or in combination, fails to teach or suggest "the sensor system of claim 7, wherein: a respective first sensing line is coupled to the first surface at the first sensing subregion and the second sensing subregion and is coupled to the second surface at the third sensing subregion of the flexible substrate; and
a respective second sensing line is coupled to the first surface of the flexible substrate at the first sensing subregion and the third sensing subregion and is coupled to the second surface of the flexible substrate at the second sensing subregion," in combination with all the other elements of claim 14.
24. Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled "Comments on Statement of Reasons for Allowance."
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRENT J ANDREWS whose telephone number is (571)272-6101. The examiner can normally be reached 10am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Judy Nguyen can be reached at (571)272-2258. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BRENT J ANDREWS/Examiner, Art Unit 2858
/JUDY NGUYEN/Supervisory Patent Examiner, Art Unit 2858