Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1 and 10 are amended. Claims 2, 8, 11, 14, and 18-19 are cancelled. Claims 1, 3-7, 9-10, 12-13, 15-17, and 20 are currently under examination.
Response to Arguments
Applicant's arguments filed December 19, 2025, have been fully considered but are not persuasive. With respect to the 112(a) rejection, Applicant asserts that support is provided by paragraph 10, “For example, palm rejection is performed by detecting the force applied by a palm in addition to capacitive touch sensing so that a light palm press can be more positively distinguished from a heavy thumb touch intended at a click,” and by paragraphs 11 and 44. The Office disagrees. The Office submits that the claim limitations are directed to figures 9A and 9B, where step 174, with reference to paragraph 46, teaches “At step 174, the force detection sensor processing resource applies both the force detection and the touch area detection to determine if the level of the force applied at the touch detection surface indicates a click input,” which is not the same as the claim limitation reciting “the amount of pressure detected by the force detection sensor for the touch corresponds to a weight of the end user thumb”. The specification does not mention “weight” with respect to determining a level of force applied to the touch detection surface. The weight of a thumb is simply the resting weight of the thumb and is NOT a level of force applied at a touch detection surface for indicating a click input as the specification describes. Therefore, the 112(a) rejection is maintained.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1, 10, and 17 and their dependent claims 3-7, 9, 12-13, 15-16, and 20 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. The claim limitations, “the amount of pressure detected by the force detection sensor for the touch corresponds to a weight of the end user thumb/the amount of pressures detected by the force detection sensor correspond to a weight of the end user thumb”, are not described in the specification. Paragraph 10 of the specification only mentions, “For example, palm rejection is performed by detecting the force applied by a palm in addition to capacitive touch sensing so that a light palm press can be more positively distinguished from a heavy thumb touch intended at a click … the touch detection surface supports a variety of other interfaces, such as gaming and utility interfaces and a scale to weigh objects using the force detection sensor”, where “a heavy thumb touch intended at a click” is a level of force, not a weight. Paragraph 11 merely mentions, “The improved confidence aids in the use of the touch detection surface for touch function row inputs where a palm may rest with varied weight and an end user may interact through proximity as opposed to touch at the touch function row touch areas”; and paragraph 55 refers to figure 18 and describes a process for measuring an object’s weight at the palm rest by launching a weigh scale application.
Nothing in the specification definitively and specifically indicates “the amount of pressures detected by the force detection sensor correspond to a weight of the end user thumb”. The specification does not mention “weight” with respect to determining a level of force applied to the touch detection surface. The weight of a thumb is simply the resting weight of the thumb and is NOT a level of force applied at a touch detection surface for indicating a click input as the specification describes. Therefore, the 112(a) rejection applies. For purposes of examination, “the amount of pressure detected by the force detection sensor for the touch corresponds to a weight of the end user thumb” will be interpreted as “the amount of pressure detected by the force detection sensor for the touch corresponds to a level of force of the end user thumb”.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 9-10, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang (Pub. No.: US 2019/0361543 A1) in view of Burks (Pub. No.: US 2018/0217714 A1) and further in view of Jung (Pub. No.: US 2024/0329783 A1).
With respect to Claim 1, Zhang teaches an information handling system (fig. 1, item 100: computing device; ¶33) comprising: a housing (fig. 1, item 103 is disposed in a housing/frame and item 110: bottom housing; ¶42, “The cover 112 may be attached to a bottom housing component 110 to define substantially all of the enclosure of the base portion 104”); a processor (fig. 9, item 902; ¶123; ¶125) disposed in the housing and operable to process information; a memory (fig. 9, item 904; ¶124; ¶126) disposed in the housing and interfaced with the processor, the memory operable to store the information; a main display (fig. 1, item 103; fig. 9, item 920) coupled to the housing and interfaced with the processor, the main display operable to present the information as visual images (¶34; ¶124); a keyboard (fig. 1, item 114) coupled to the housing and operable to accept end user inputs as mechanical key presses (¶36); a palm rest capacitive touch detection surface (fig. 1, item 109; fig. 2, item 203; fig. 8A; fig. 
9, item 906; ¶44; ¶55, “In some cases, the touch- and/or force-sensing components 203 include electrode layers that detect changes in capacitance due to the application of a touch and/or force input on the cover 112”; ¶111; ¶127) coupled to the housing adjacent the keyboard and interfaced with the processor, the palm rest capacitive touch detection surface operable to detect touches as inputs (¶45); a force detection sensor disposed at the palm rest capacitive touch detection surface and operable to detect forces associated with touches at the palm rest capacitive touch detection surface (¶44, “The touch-sensitive input regions 109, which may be part of the dynamic display interface, may also be associated with and/or at least partially defined by underlying displays that display graphical outputs within the touch-sensitive input regions 109, as well as touch- and/or force-sensing systems that detect touch and/or force inputs applied to the cover 112”; ¶55-56); and a non-transient memory (fig. 9, item 904 is a memory that is a non-transient memory; ¶124; ¶126) storing instructions that when executed cause: presentation of a user interface at the main display (¶124) to define an active area of less than all of the palm rest capacitive touch detection surface (figs. 3A-3B, item 316 or 320; ¶71; ¶73, “The first region 316 may define a trackpad region that is active when the device 300 is in a first mode of operation”; ¶74-75, “When the device 300 is in a second mode of operation, such as when a second application program is active, the device 300 may activate an expanded trackpad region to provide a larger area (or a smaller or differently shaped area) in which a user may provide trackpad inputs”; fig. 8A; ¶110); and configuration of the active area of less than all of the palm rest capacitive touch detection surface when selected at the user interface (fig. 
3B, region of item 320 is less than 318; ¶75, “When the device 300 is in a second mode of operation, such as when a second application program is active, the device 300 may activate an expanded trackpad region to provide a larger area (or a smaller or differently shaped area) in which a user may provide trackpad inputs”- when the user selects a second application program); configuration of the force detection sensor to the active area of less than all of the palm rest capacitive touch detection surface when selected at the user interface (¶76; ¶78 – due to the mode of operation); detection of an amount of force associated with the touches with the force detection sensor (¶44, “The touch-sensitive input regions 109, which may be part of the dynamic display interface, may also be associated with and/or at least partially defined by underlying displays that display graphical outputs within the touch-sensitive input regions 109, as well as touch- and/or force-sensing systems that detect touch and/or force inputs applied to the cover 112”; ¶55; ¶56, “The force sensing components may include a strain sensor, capacitive gap sensor, or other force sensitive structure that is configured to produce an electrical response that corresponds to an amount of applied force associated with a touch input”); and discarding at least some of the touches as unintended inputs (¶76; ¶78, “For example, when a user presses in the trackpad region with sufficient force (e.g., when the force exceeds a threshold), a haptic output may be produced via the cover 312 to inform the user that the input has been detected and/or registered. When the device 300 is in the first mode of operation, the haptic output may be produced when an input having an applied force that exceeds a threshold force is detected within the first region 316. 
In the first mode of operation, if an input exceeding the threshold force is detected outside the first region 316, the input may not be registered and the device 300 may produce no haptic output.” – ignoring an input corresponds to discarding it; note that the force of each touch is compared against the threshold force, and that detection and discarding depend on area).
Zhang does not mention discarding at least some of the touches as unintended inputs by comparing a surface area size of the touches and the amount of force of the touches.
Burks teaches an information handling system (fig. 1, item 100; ¶5) comprising: a housing (fig. 1); a processor (fig. 2, item 260; ¶6, processing resource; ¶12-13) disposed in the housing and operable to process information; a memory (fig. 2, item 264; ¶13) disposed in the housing and interfaced with the processor, the memory operable to store the information; a main display (fig. 1, item 102; ¶6) coupled to the housing and interfaced with the processor; a keyboard (fig. 1, item 108; ¶6) coupled to the housing and operable to accept end user inputs; a palm rest capacitive touch detection surface (fig. 1, item 112 comprising item 110; ¶7, “The touchpad 110 can be provided in or near the handrest area 112”) coupled to the housing adjacent the keyboard and interfaced with the processor, the palm rest capacitive touch detection surface operable to detect touches as inputs (¶7); and a non-transient memory (fig. 2, item 264; ¶13-14) storing instructions that when executed cause: detection of an amount of force (¶33, “monitoring can include monitoring information representative of a frequency, a location on a touchpad, a duration, and/or an amount of force, among other information associated with a provided input to a touchpad”); and discarding at least some of the touches as unintended inputs by comparing a surface area size of the touches and the amount of force of the touches (¶29, “the compare module 280 can, in some examples, compare a size of a provided input (e.g., a percentage of a touchpad contacted unintentionally by a hand of a user while typing on a keyboard) to a user specific size threshold (e.g., an average size of an area of a touchpad contacted by a tap and/or a click of the touchpad by the user), among other types of comparisons of provided inputs to specific thresholds set by an electronic device based on provided inputs to a touchpad of the electronic device”; ¶33, “monitoring can include monitoring information representative of a frequency, a location on
a touchpad, a duration, and/or an amount of force, among other information associated with a provided input to a touchpad”).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date to modify the information handling system of Zhang, such that the instructions comprise: discarding at least some of the touches as unintended inputs by comparing a surface area size of the touches and the amount of force of the touches, as taught by Burks so as to mitigate unintended inputs (¶36-37).
Zhang and Burks combined do not explicitly teach accepting at least some of the touches as inputs when an area of a touch corresponds to a palm size and the amount of pressure detected by the force detection sensor for the touch corresponds to a weight of the end user thumb.
Jung teaches an information handling system (fig. 1) that: accepts at least some touches as inputs when an area of a touch corresponds to a palm size (¶96; ¶211-212; ¶229; ¶230; ¶245; ¶273, “the size of the user's hand, and the input pattern, so that the present invention may be utilized in various electronic devices 10 and can satisfy the requirements of various users since the user can select a desired input method”) and the amount of pressure detected by a force detection sensor for the touch corresponds to a weight of the end user thumb (fig. 27, the right thumb comes into contact (push) to provide an input as the number 5; ¶111, “when the finger comes into contact (touch) therewith with a set pressure or less, detect the finger input signal in the second finger input mode when the finger comes in contact (push) therewith with the set pressure or more, and detect a position of the finger through the finger input signal” – since the pressure performs a contact push, the contact push is a level of force; ¶126; ¶183; ¶254).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date to modify the combined information handling system of Zhang and Burks, such that the stored instructions when executed cause: accepting at least some of the touches as inputs when an area of a touch corresponds to a palm size and the amount of pressure detected by the force detection sensor for the touch corresponds to a weight of the end user thumb, as taught by Jung so as to prevent input errors (¶97; ¶171).
With respect to Claim 9, claim 1 is incorporated. Zhang teaches wherein the instructions compare the area and force expected for a palm resting on the palm rest capacitive touch detection surface (¶78, “In the first mode of operation, if an input exceeding the threshold force is detected outside the first region 316, the input may not be registered and the device 300 may produce no haptic output. In the second mode of operation, on the other hand, an input may be registered and a haptic output may be produced when an input having an applied force that exceeds a threshold force is detected within the expanded input region 320 (which includes and/or encompasses the area of the first region 316)”).
With respect to Claim 10, Zhang teaches a method for managing touch inputs at an information handling system (fig. 9, item 904; ¶124, “the device 900 includes one or more processing units 902 that are configured to access a memory 904 having instructions stored thereon. The instructions or computer programs may be configured to perform one or more of the operations or functions described with respect to the device 900” – the instructions = method), the method comprising: presenting a user interface at a main display (figs. 8A-8C); defining an active area of less than all of a palm rest capacitive touch detection surface (figs. 3A to 3B, items 316 or 320: active areas; ¶74-75); configuring the active area of the palm rest capacitive touch detection surface that detects touches to less than all of the palm rest capacitive touch detection surface when selected at the user interface (fig. 3B, region of item 320 is less than 318; ¶75, “When the device 300 is in a second mode of operation, such as when a second application program is active, the device 300 may activate an expanded trackpad region to provide a larger area (or a smaller or differently shaped area) in which a user may provide trackpad inputs”- when the user selects a second application program); including a force detection sensor at the palm rest capacitive touch detection surface, the force detection sensor detecting an amount of forces associated with touches at the palm rest capacitive touch detection surface (¶44, “The touch-sensitive input regions 109, which may be part of the dynamic display interface, may also be associated with and/or at least partially defined by underlying displays that display graphical outputs within the touch-sensitive input regions 109, as well as touch- and/or force-sensing systems that detect touch and/or force inputs applied to the cover 112”; ¶55; ¶56, “The force sensing components may include a strain sensor, capacitive gap sensor, or other force sensitive structure that 
is configured to produce an electrical response that corresponds to an amount of applied force associated with a touch input”); and configuring the active area of the palm rest capacitive touch detection surface that detects the amount of force to less than all of the force detection sensor when selected at the user interface (¶76; ¶78); detecting touches with the palm rest touch detection surface (¶44, “The touch-sensitive input regions 109, which may be part of the dynamic display interface, may also be associated with and/or at least partially defined by underlying displays that display graphical outputs within the touch-sensitive input regions 109); detecting the amount of force associated with the touches with the force detection sensor (¶44, “as well as touch- and/or force-sensing systems that detect touch and/or force inputs applied to the cover 112”; ¶55; ¶56, “The force sensing components may include a strain sensor, capacitive gap sensor, or other force sensitive structure that is configured to produce an electrical response that corresponds to an amount of applied force associated with a touch input”); and discarding at least some of the touches as unintended inputs (¶76; ¶78, “For example, when a user presses in the trackpad region with sufficient force (e.g., when the force exceeds a threshold), a haptic output may be produced via the cover 312 to inform the user that the input has been detected and/or registered. When the device 300 is in the first mode of operation, the haptic output may be produced when an input having an applied force that exceeds a threshold force is detected within the first region 316. 
In the first mode of operation, if an input exceeding the threshold force is detected outside the first region 316, the input may not be registered and the device 300 may produce no haptic output.” – ignoring an input corresponds to discarding it; note that the force of each touch is compared against the threshold force, and that detection and discarding depend on area).
Zhang does not mention discarding at least some of the touches as unintended inputs by comparing a surface area size of the touches and the amount of force of the touches.
Burks teaches an information handling system (fig. 1, item 100; ¶5) comprising: a housing (fig. 1); a processor (fig. 2, item 260; ¶6, processing resource; ¶12-13) disposed in the housing and operable to process information; a memory (fig. 2, item 264; ¶13) disposed in the housing and interfaced with the processor, the memory operable to store the information; a main display (fig. 1, item 102; ¶6) coupled to the housing and interfaced with the processor; a keyboard (fig. 1, item 108; ¶6) coupled to the housing and operable to accept end user inputs; a palm rest capacitive touch detection surface (fig. 1, item 112 comprising item 110; ¶7, “The touchpad 110 can be provided in or near the handrest area 112”) coupled to the housing adjacent the keyboard and interfaced with the processor, the palm rest capacitive touch detection surface operable to detect touches as inputs (¶7); and a non-transient memory (fig. 2, item 264; ¶13-14) storing instructions that when executed cause: detection of an amount of force (¶33, “monitoring can include monitoring information representative of a frequency, a location on a touchpad, a duration, and/or an amount of force, among other information associated with a provided input to a touchpad”); and discarding at least some of the touches as unintended inputs by comparing a surface area size of the touches and the amount of force of the touches (¶29, “the compare module 280 can, in some examples, compare a size of a provided input (e.g., a percentage of a touchpad contacted unintentionally by a hand of a user while typing on a keyboard) to a user specific size threshold (e.g., an average size of an area of a touchpad contacted by a tap and/or a click of the touchpad by the user), among other types of comparisons of provided inputs to specific thresholds set by an electronic device based on provided inputs to a touchpad of the electronic device”; ¶33, “monitoring can include monitoring information representative of a frequency, a location on
a touchpad, a duration, and/or an amount of force, among other information associated with a provided input to a touchpad”).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date to modify the method of Zhang, such that the method comprises: discarding at least some of the touches as unintended inputs by comparing a surface area size of the touches and the amount of force of the touches, as taught by Burks so as to mitigate unintended inputs (¶36-37).
Zhang and Burks combined do not explicitly teach accepting at least some of the touches as inputs when an area of a touch corresponds to a palm size and the amount of pressure detected by the force detection sensor for the touch corresponds to a weight of the end user thumb.
Jung teaches an information handling system (fig. 1) and method (fig. 17; ¶186-188) that: accepts at least some touches as inputs when an area of a touch corresponds to a palm size (¶96; ¶211-212; ¶229; ¶230; ¶245; ¶273, “the size of the user's hand, and the input pattern, so that the present invention may be utilized in various electronic devices 10 and can satisfy the requirements of various users since the user can select a desired input method”) and the amount of pressure detected by a force detection sensor for the touch corresponds to a weight of the end user thumb (fig. 27, the right thumb comes into contact (push) to provide an input as the number 5; ¶111, “when the finger comes into contact (touch) therewith with a set pressure or less, detect the finger input signal in the second finger input mode when the finger comes in contact (push) therewith with the set pressure or more, and detect a position of the finger through the finger input signal” – since the pressure performs a contact push, the contact push is a level of force; ¶126; ¶183; ¶254).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date to modify the combined method of Zhang and Burks, such that the method comprises: accepting at least some of the touches as inputs when an area of a touch corresponds to a palm size and the amount of pressure detected by the force detection sensor for the touch corresponds to a weight of the end user thumb, as taught by Jung so as to prevent input errors (¶97; ¶171).
With respect to Claim 17, Zhang teaches a system (¶30, “operating system”) for managing touch inputs, the system comprising: a processing resource (fig. 9, item 902; ¶123; ¶125) operable to execute instructions to process information; a display (fig. 1, item 103; fig. 9, item 920) operable to present the information as visual images (¶34; ¶124); a keyboard (fig. 1, item 114) having mechanical keys that depress to accept inputs (¶36); a palm rest (fig. 1; figs. 3A to 3B; ¶31, “the dynamic display interface may include a display and associated touch (and/or force) sensing system positioned in a palm rest region of the laptop (e.g., below the keyboard)”; ¶45, “As shown in FIG. 1, the device 100 includes a first touch-sensitive input region 109-1 that corresponds to a palm rest region that is below the keyboard 114 (e.g., it may be positioned along a bottom side of the keyboard 114, as shown in FIG. 1)”) coupled adjacent the keyboard; a capacitive touch detection surface (fig. 1, item 109-1; fig.
9, item 906; ¶45; ¶46-47; ¶127) coupled at the palm rest and operable to detect end user touches; a force detection sensor disposed at the capacitive touch detection surface (¶44, “The touch-sensitive input regions 109, which may be part of the dynamic display interface, may also be associated with and/or at least partially defined by underlying displays that display graphical outputs within the touch-sensitive input regions 109, as well as touch- and/or force-sensing systems that detect touch and/or force inputs applied to the cover 112”), the force detection sensor operable to detect an amount of pressures applied by the end user touches (¶55; ¶56, “The force sensing components may include a strain sensor, capacitive gap sensor, or other force sensitive structure that is configured to produce an electrical response that corresponds to an amount of applied force associated with a touch input” – pressure is defined as an amount of force applied to a surface); and a non-transient memory (fig. 9, item 904 is a memory that is a non-transient memory; ¶124; ¶126) storing instructions that when executed on the processing resource cause: presentation of a user interface at the display (¶124) to define an active area of less than all of the capacitive touch detection surface (figs. 3A-3B, item 316 or 320; ¶71; ¶73, “The first region 316 may define a trackpad region that is active when the device 300 is in a first mode of operation”; ¶74-75); configuration of the force detection sensor to the active area of less than all of the capacitive touch detection surface when selected at the user interface (¶76; ¶78); configuration of the capacitive touch detection surface to the active area of less than all of the capacitive touch detection surface when selected at the user interface (fig. 
3B, region of item 320 is less than 318; ¶75, “When the device 300 is in a second mode of operation, such as when a second application program is active, the device 300 may activate an expanded trackpad region to provide a larger area (or a smaller or differently shaped area) in which a user may provide trackpad inputs”- when the user selects a second application program).
Although Zhang mentions a palm rest region, Zhang does not teach wherein the instructions further comprise: identification of at least some of the touches as having a surface area shape of an end user palm; rejection of the touches having the surface area shape of the end user palm when the amount of pressures detected by the force detection sensor correspond to a weight of the end user palm.
Burks teaches an information handling system (fig. 1, item 100; ¶5) comprising: a housing (fig. 1); a processor (fig. 2, item 260; ¶6, processing resource; ¶12-13) disposed in the housing and operable to process information; a memory (fig. 2, item 264; ¶13) disposed in the housing and interfaced with the processor, the memory operable to store the information; a main display (fig. 1, item 102; ¶6) coupled to the housing and interfaced with the processor; a keyboard (fig. 1, item 108; ¶6) coupled to the housing and operable to accept end user inputs; a palm rest capacitive touch detection surface (fig. 1, item 112 comprising item 110; ¶7, “The touchpad 110 can be provided in or near the handrest area 112”) coupled to the housing adjacent the keyboard and interfaced with the processor, the palm rest capacitive touch detection surface operable to detect touches as inputs (¶7); and a non-transient memory (fig. 2, item 264; ¶13-14) storing instructions that when executed cause: identification of at least some of the touches as having a surface area shape of an end user palm (¶27-29); rejection of the touches having the surface area shape of the end user palm when the amount of pressures detected by the force detection sensor correspond to a weight of the end user palm (¶27-29; ¶33, “monitoring can include monitoring information representative of a frequency, a location on a touchpad, a duration, and/or an amount of force, among other information associated with a provided input to a touchpad” – pressure is the amount of force applied to a surface).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date to modify the information handling system of Zhang, wherein the instructions further comprise: identification of at least some of the touches as having a surface area shape of an end user palm; rejection of the touches having the surface area shape of the end user palm when the amount of pressures detected by the force detection sensor correspond to a weight of the end user palm, as taught by Burks so as to mitigate unintended inputs (¶36-37).
Zhang and Burks combined do not explicitly teach acceptance of the end user touches having the surface area shape of the end user palm when the amount of pressures detected by the force detection sensor correspond to a weight of the end user thumb.
Jung teaches an information handling system (fig. 1) that: accepts end user touches as inputs having the surface area shape of the end user palm (¶96, “the upper surface of the palm detection unit 260 may be applied in various shapes so as to correspond to the shape of a user's hand”; ¶211-212; ¶229; ¶230; ¶245) when the amount of pressures detected by the force detection sensor correspond to a weight of the end user thumb (fig. 27, the right thumb comes into contact (push) to provide an input as the number 5; ¶111, “when the finger comes into contact (touch) therewith with a set pressure or less, detect the finger input signal in the second finger input mode when the finger comes in contact (push) therewith with the set pressure or more, and detect a position of the finger through the finger input signal” – since the pressure performs a contact push, the contact push is a level of force; ¶126; ¶183; ¶254).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined system of Zhang and Burks, such that the stored instructions when executed cause: accepting at least some of the touches as inputs when an area of a touch corresponds to a palm size and the amount of pressure detected by the force detection sensor for the touch corresponds to a weight of the end user thumb, as taught by Jung so as to prevent input errors (¶97; ¶171).
Claims 3-4 and 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang, Burks, and Jung as applied to claims 1 and 10 above, and further in view of Ligtenberg et al. (Pub. No.: US 2018/0218859 A1) hereinafter referred to as Ligtenberg.
With respect to Claim 3, claim 1 is incorporated, Zhang teaches further comprising: a haptic device (fig. 2A, item 206; fig. 9, item 912) at the palm rest capacitive touch detection surface and operable to generate localized vibrations as haptic feedback at the palm rest capacitive touch detection surface (¶62-63); and instructions that cause configuration of the haptic device to the active area of less than all of the palm rest capacitive touch detection surface when selected at the user interface (¶78).
Although Zhang teaches a haptic device, Zhang, Burks, and Jung combined do not teach plural haptic devices disposed in a spaced manner; nor does Zhang teach instructions that cause configuration of the plural haptic devices to the active area of less than all of the capacitive touch area when selected at the user interface.
Ligtenberg teaches an information handling system (fig. 1A, item 100; ¶124; fig. 16F, item 1600: computing device; fig. 50, item 5000; ¶619) comprising: a housing (fig. 16A, item 1602: top case and 1604: bottom case; ¶334); a processor (fig. 50, item 5002; ¶151, “The base portion 104 may also include components 208 within the interior volume, such as processors, memory devices, circuit boards, input/output devices, haptic actuators, wired and/or wireless communication devices, communication ports, disk drives, and the like”; ¶376; ¶619-621) disposed in the housing and operable to process information; a memory (fig. 50, item 5004; ¶622) disposed in the housing and interfaced with the processor, the memory operable to store the information; a main display (fig. 1A, item 102; fig. 16F, item 1603; fig. 50, item 5020; ¶125; ¶334; ¶620; ¶622) coupled to the housing and interfaced with the processor, the main display operable to present the information as visual images (¶125); a keyboard (fig. 16F, item 1614 comprising: 1616 and 1618) coupled to the housing and operable to accept end user inputs as mechanical key presses (¶343-344); a palm rest capacitive touch detection surface (fig. 16F, item 1610; fig. 50, item 5006; ¶336-337; ¶623; ¶625) coupled to the housing adjacent the keyboard and interfaced with the processor, the palm rest capacitive touch detection surface operable to detect touches as inputs (¶336); and a non-transient memory (fig. 50, item 5004; ¶620; ¶622) storing instructions that when executed cause: presentation of a user interface at the main display (¶620, “the instructions may be configured to control or coordinate the operation of one or more displays 5020, one or more touch sensors 5006, one or more force sensors 5008, one or more communication channels 5010, and/or one or more haptic feedback devices 5012”) and definition of an active area of the palm rest capacitive touch detection surface (¶337, “As one example, a display underlying the trackpad region 1610 may produce an image of a border (e.g., representing or replicating an image of a trackpad) that indicates where a user may provide touch inputs. As another example, the display may produce an image of a slider that a user can select and/or move to change a volume setting of the computing device 1600. These are merely some examples, and numerous other images and objects can be displayed, and inputs to the trackpad region 1610 may affect numerous settings and operations of the computing device 1600”); and configuration of the active area of the capacitive touch detection surface when selected at the user interface (¶337, “A display may be used, for example, to display input areas, buttons, keys, or other affordances. As one example, a display underlying the trackpad region 1610 may produce an image of a border (e.g., representing or replicating an image of a trackpad) that indicates where a user may provide touch inputs”); further comprising: plural haptic devices (fig. 30A, items 3022 and 3024) disposed in a spaced manner at the palm rest capacitive touch detection surface (¶452) and operable to generate localized vibrations as haptic feedback at the palm rest capacitive touch detection surface (fig. 10; ¶240, “Reinforcing members may also be included (or strategically omitted) to create haptic or tactile feedback regions, such as by isolating haptic outputs from a particular haptic actuator or device to a localized region that is less than the entire top case of a device”; ¶434; ¶453; ¶454, “The haptic output produced by the electromagnetic actuators 3022, 3024 may be detectable at any location in the trackpad region”); and instructions that cause configuration of the plural haptic devices to the active area of less than all of the palm rest capacitive touch detection surface when selected at the user interface (¶452, “The electromagnetic actuators 3022, 3024, 3026, 3028 may include more than one type of actuator, each type configured to produce a different type of haptic feedback in response to a different event or action” – the plural haptic devices are disposed in the active area = trackpad region but occupy an overlay of less than all of the capacitive touch area; ¶453, “The haptic output produced by the electromagnetic actuators 3022, 3024 may be detectable at any location in the trackpad region” – when selected at the user interface).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined information handling system of Zhang, Burks, and Jung, such that the haptic device is replaced with plural haptic devices disposed in a spaced manner and to comprise instructions that cause configuration of the plural haptic devices to the active area of less than all of the palm rest capacitive touch detection surface when selected at the user interface, as taught by Ligtenberg so as to provide localized feedback (¶240).
With respect to Claim 4, claim 3 is incorporated, Zhang teaches further comprising: a secondary display coupled to the palm rest capacitive touch detection surface to present visual images at the palm rest capacitive touch detection surface (¶75, “a display corresponding to the first touch-sensitive input region 309-1 (e.g., the display 204-1, FIG. 2A) may display a graphical output 319 that at least partially surrounds the first region 316 and defines an expanded input region 320 that includes the first region 316 and at least a portion of the second region 318. The graphical output 319 may be a border (e.g., one or more lines), a colored area, an image, or any other suitable graphical output that visually distinguishes the expanded input region 320”); and instructions that present the active area of less than all of the palm rest capacitive touch detection surface at the secondary display when selected at the user interface (fig. 3B, area enclosed by item 319 is less than 318 when selected by the user to be in a second mode of operation; ¶75).
With respect to Claim 12, claim 10 is incorporated, Zhang teaches further comprising: a haptic device (fig. 2A, item 206; fig. 9, item 912) at the palm rest capacitive touch detection surface, the haptic device generating localized vibrations as haptic feedback at the palm rest capacitive touch detection surface (¶62-63; ¶130, “haptic outputs may be local or global”); and configuring the active area of the palm rest touch detection surface that generates haptic feedback to the haptic device when selected at the user interface (¶78, either the first region 316 or the expanded input region 320, both of which are less in area than region 318).
Although Zhang teaches a haptic device, Zhang, Burks, and Jung combined do not teach plural haptic devices disposed in a spaced manner; nor does Zhang teach configuring the active area of the palm rest that generates haptic feedback to less than all of the plural haptic devices when selected at the user interface.
Ligtenberg teaches an information handling system (fig. 1A, item 100; ¶124; fig. 16F, item 1600: computing device; fig. 50, item 5000; ¶619) comprising: a housing (fig. 16A, item 1602: top case and 1604: bottom case; ¶334); a processor (fig. 50, item 5002; ¶151, “The base portion 104 may also include components 208 within the interior volume, such as processors, memory devices, circuit boards, input/output devices, haptic actuators, wired and/or wireless communication devices, communication ports, disk drives, and the like”; ¶376; ¶619-621) disposed in the housing and operable to process information/a method; a memory (fig. 50, item 5004; ¶622) disposed in the housing and interfaced with the processor, the memory operable to store the information/method; a main display (fig. 1A, item 102; fig. 16F, item 1603; fig. 50, item 5020; ¶125; ¶334; ¶620; ¶622) coupled to the housing and interfaced with the processor, the main display operable to present the information as visual images (¶125); a keyboard (fig. 16F, item 1614 comprising: 1616 and 1618) coupled to the housing and operable to accept end user inputs as mechanical key presses (¶343-344); a palm rest capacitive touch detection surface (fig. 16F, item 1610; fig. 50, item 5006; ¶336-337; ¶623; ¶625) coupled to the housing adjacent the keyboard and interfaced with the processor, the palm rest capacitive touch detection surface operable to detect touches as inputs (¶336); and a non-transient memory (fig. 50, item 5004; ¶620; ¶622) storing instructions that when executed cause: presentation of a user interface at the main display (¶620, “the instructions may be configured to control or coordinate the operation of one or more displays 5020, one or more touch sensors 5006, one or more force sensors 5008, one or more communication channels 5010, and/or one or more haptic feedback devices 5012”) and definition of an active area of the palm rest capacitive touch detection surface (¶337, “As one example, a display underlying the trackpad region 1610 may produce an image of a border (e.g., representing or replicating an image of a trackpad) that indicates where a user may provide touch inputs. As another example, the display may produce an image of a slider that a user can select and/or move to change a volume setting of the computing device 1600. These are merely some examples, and numerous other images and objects can be displayed, and inputs to the trackpad region 1610 may affect numerous settings and operations of the computing device 1600”); and configuration of the active area of the palm rest capacitive touch detection surface when selected at the user interface (¶337, “A display may be used, for example, to display input areas, buttons, keys, or other affordances. As one example, a display underlying the trackpad region 1610 may produce an image of a border (e.g., representing or replicating an image of a trackpad) that indicates where a user may provide touch inputs”); further comprising: plural haptic devices (fig. 30A, items 3022 and 3024) disposed in a spaced manner at the palm rest capacitive touch detection surface (¶452) and operable to generate localized vibrations as haptic feedback at the palm rest capacitive touch detection surface (fig. 10; ¶240, “Reinforcing members may also be included (or strategically omitted) to create haptic or tactile feedback regions, such as by isolating haptic outputs from a particular haptic actuator or device to a localized region that is less than the entire top case of a device”; ¶434; ¶453; ¶454, “The haptic output produced by the electromagnetic actuators 3022, 3024 may be detectable at any location in the trackpad region”); and instructions that cause configuration of the active area of the palm rest capacitive touch detection surface that generates haptic feedback to less than all of the plural haptic devices when selected at the user interface (¶15; ¶136; ¶427; ¶452, “The electromagnetic actuators 3022, 3024, 3026, 3028 may include more than one type of actuator, each type configured to produce a different type of haptic feedback in response to a different event or action” – the plural haptic devices are disposed in the active area = trackpad region but occupy an overlay of less than all of the capacitive touch area; ¶453, “The haptic output produced by the electromagnetic actuators 3022, 3024 may be detectable at any location in the trackpad region” – when selected at the user interface).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined method of Zhang, Burks, and Jung, such that the haptic device is replaced with plural haptic devices disposed in a spaced manner and to configure the active area of the palm rest capacitive touch detection surface that generates haptic feedback to less than all of the plural haptic devices when selected at the user interface, as taught by Ligtenberg so as to provide localized feedback (¶240).
With respect to Claim 13, claim 12 is incorporated, Zhang teaches further comprising: coupling a secondary display to the palm rest capacitive touch detection surface to present visual images at the palm rest capacitive touch surface (¶75, “a display corresponding to the first touch-sensitive input region 309-1 (e.g., the display 204-1, FIG. 2A) may display a graphical output 319 that at least partially surrounds the first region 316 and defines an expanded input region 320 that includes the first region 316 and at least a portion of the second region 318. The graphical output 319 may be a border (e.g., one or more lines), a colored area, an image, or any other suitable graphical output that visually distinguishes the expanded input region 320”); and presenting the active area of less than all of the palm rest capacitive touch detection surface at the secondary display when selected at the user interface (fig. 3B, area enclosed by item 319 is less than 318 when selected by the user to be in a second mode of operation; ¶75).
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang, Burks, Jung, and Ligtenberg as applied to claim 4 above, and further in view of Berger et al. (Pub. No.: US 2018/0129413 A1) hereinafter referred to as Berger.
With respect to Claim 5, claim 4 is incorporated, Zhang teaches wherein the instructions further comprise: executing a gaming application that presents a game figure at the main display (¶77, “For example, a large trackpad region may be advantageous when controlling graphics programs and games, where users manipulate graphical outputs displayed on a primary display (e.g., moving an animated character around a virtual world, translating or rotating graphical objects in a computer aided drafting program, or the like)”); executing inputs at the palm rest capacitive touch detection surface to interact with the game figure (¶77); and communicating inputs made at the palm rest capacitive touch detection surface to the gaming figure (¶77).
Zhang, Burks, Jung, and Ligtenberg combined do not mention executing inputs at the palm rest capacitive touch detection surface corresponds to an interactive user interface at the secondary display having input icons and communicating inputs made at the secondary display from the palm rest capacitive touch detection surface to the gaming figure.
Berger teaches an information handling system (figs. 1 or 2) comprising: a processor (figs. 1 or 2, item 150 or 250 respectively; ¶40; ¶42) operable to process information/instructions; a capacitive touch detection surface (figs. 1 or 2, item 120 or 220; ¶40; ¶42) operable to detect touches as inputs and such that a secondary display (figs. 1 or 2, item 140 or 240; ¶40; ¶42) is coupled to the capacitive touch detection surface to present visual images at the capacitive touch detection surface; wherein the instructions further comprise: executing an interactive user interface at the secondary display having input icons to interact with a game figure and a gaming application (fig. 7: interactive user interface comprising input icons = keys 710, 720, 730, 740, and 750 , gaming application = Minecraft; ¶50); and communicating inputs made at the secondary display from the capacitive touch detection surface to the gaming figure (¶50).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined information handling system of Zhang, Burks, Jung, and Ligtenberg, such that executing inputs at the palm rest capacitive touch detection surface corresponds to an interactive user interface at the secondary display having input icons to communicate inputs made at the secondary display from the palm rest capacitive touch detection surface to the gaming figure, as taught by Berger so as to provide an operation specific user interface for ease of use (¶51).
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang, Burks, Jung, and Ligtenberg as applied to claim 4 above, and further in view of Zarraga et al. (Pub. No.: US 2020/0218418 A1) hereinafter referred to as Zarraga.
With respect to Claim 6, claim 4 is incorporated, Zhang teaches wherein the instructions further comprise: presenting a graphical output at the secondary display (fig. 8A, item 822 or item 824; ¶111, “a first graphical output 822 associated with the first application program may be displayed along the palm rest region of the device 800 (e.g., the first touch-sensitive input region 109-1, FIG. 1). The first graphical output 822 may be any suitable graphical output, and a user may interact with the first graphical output 822 to control one or more aspects of the active application program” or ¶113, “an additional graphical output 824 displayed along the palm rest region of the device 800. The additional graphical output 824 may include an image curve, histogram, or other graphical representation of a property of the image that is being viewed and/or edited. The additional graphical output 824 also includes a slider bar 826 (or any other suitable affordance or graphical output) with which a user may interact to modify the property of the image that is represented by the image curve or histogram” respectively); and detecting touches at the palm rest capacitive touch detection surface as inputs to the application program (¶111; ¶113).
Zhang, Burks, Jung, and Ligtenberg combined do not teach the graphical output is a musical keyboard, nor does Zhang mention that the application program is directed to the musical keyboard.
Zarraga teaches an information handling system (fig. 1; ¶24) comprising: a processor (¶24) operable to process information/instructions; a palm rest capacitive touch detection surface (fig. 15, item 1510; ¶62) operable to detect touches as inputs; and a secondary display (fig. 15, item 1510; ¶62, “The touch surface 1510 may also comprise a display of its own”) coupled to the palm rest capacitive touch detection surface to present visual images at the palm rest capacitive touch detection surface; wherein the instructions further comprise: presenting a musical keyboard at the secondary display (¶62, “the touch surface 1510 has specific controls (e.g., a keyboard, piano) in which the system is context aware”), and detecting touches at the palm rest capacitive touch detection surface as inputs to the musical keyboard (¶62, “When a user is in a piano or music application, the system displays piano keys on the display 1500 or display integrated into the touch surface 1510”).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined information handling system of Zhang, Burks, Jung, and Ligtenberg, such that the graphical output is a musical keyboard and the application program is directed to the musical keyboard, as taught by Zarraga so as to allow for application specific user interfaces for a variety of applications.
Claims 7 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang, Burks, and Jung, as applied to claims 1 and 10 above, and further in view of Hamburger et al. (Pub. No.: US 2021/0165545 A1) hereinafter referred to as Hamburger.
With respect to Claim 7, claim 1 is incorporated, Zhang, Burks, and Jung combined do not mention wherein the instructions further comprise: detecting a weight of an object with the force detection sensor; and presenting the weight at the main display.
Hamburger teaches an information handling system (fig. 1, item 100; ¶64), the system comprising: a processor operable to process information (¶23), a display (fig. 1, item 102; ¶64) operable to present the information as visual images; and a touch detection surface (fig. 1, item 102) operable to detect touches as inputs (¶17; ¶64); wherein the system further: detects a weight of an object with the touch detection surface (¶65), the touch detection surface comprising a force detection sensor (¶17); and presents the weight at the display (¶65).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined information handling system of Zhang, Burks, and Jung, wherein the instructions further comprise: detecting a weight of an object with the force detection sensor; and presenting the weight at the display, as taught by Hamburger so as to allow for a variety of applications for a touch detection surface.
Although Hamburger only mentions a single display and not both a main display and a secondary display, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined information handling system of Zhang, Burks, Jung, and Hamburger such that the weight is presented at either the main display or the secondary display as only a finite number of options are available and easily configurable via software.
With respect to Claim 16, claim 10 is incorporated, Zhang, Burks, and Jung combined do not mention further comprising: placing an object on the palm rest capacitive touch detection surface; detecting a weight of the object with the force detection sensor; and presenting the weight at the main display.
Hamburger teaches an information handling system (fig. 1, item 100; ¶64), the system comprising: a processor operable to process information (¶23), a display (fig. 1, item 102; ¶64) operable to present the information as visual images; and a touch detection surface (fig. 1, item 102) operable to detect touches as inputs (¶17; ¶64); wherein the system further: detects a weight of an object with the touch detection surface when the object is placed thereon (¶65), the touch detection surface comprising a force detection sensor (¶17); and presents the weight at the display (¶65).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined method of Zhang, Burks, and Jung, such that the touch detection surface corresponds to the palm rest capacitive touch detection surface and the method further comprises: detecting a weight of an object with the force detection sensor; and presenting the weight at the display, as taught by Hamburger so as to allow for a variety of applications for a touch detection surface.
Although Hamburger only mentions a single display and not both a main display and a secondary display, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined information handling system of Zhang, Burks, Jung, and Hamburger such that the weight is presented at either the main display or the secondary display as only a finite number of options are available and easily configurable via software.
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang, Burks, and Jung as applied to claim 10 above, and further in view of Hotelling (Patent No.: US 7,561,146 B1) and Shirley.
With respect to Claim 15, claim 10 is incorporated. Although Zhang mentions a palm rest region, Zhang, Burks, and Jung combined do not teach further comprising: comparing the touches with an area of a palm; and discarding the touches when the touches have the area of the palm.
Hotelling teaches a system (fig. 2; column 4, lines 11-13) comprising a processor (fig. 10, item 302; column 9, lines 24-33 and lines 48-67) operable to process information/instructions; a main display (fig. 2, item 212; column 4, lines 17-34) operable to present the information as visual images; a keyboard (fig. 2, item 222; column 4, lines 24 and 34-37) operable to accept end user inputs; a palm rest capacitive touch detection surface (fig. 2, item 224; column 4, lines 34-37) operable to detect touches as inputs; and a non-transient memory (fig. 10, item 304; column 10, lines 1-21) storing instructions/a method (fig. 11) that when executed cause: comparison of the touches with an area of a palm (fig. 11, item 408; column 11, lines 1-35); and discarding the touches when the touches have the area of the palm (fig. 11, item 414; column 11, lines 36-44).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined method of Zhang, Burks, and Jung, to further comprise: comparing the touches with an area of a palm; and discarding the touches when the touches have the area of the palm, as taught by Hotelling, so as to reject unintentional inputs from interfering with inputs.
Zhang, Burks, Jung, and Hotelling combined do not teach the method further comprising: comparing the touches with force of the palm resting on the palm rest capacitive touch detection surface; and discarding the touches when the touches have the area and the force of the palm.
Shirley teaches a method (fig. 5; ¶66-67) for managing touch inputs at an information handling system (fig. 4, item 100; ¶55-56), the method comprising: detecting touches at a touch detection surface (fig. 5, item 502); detecting force associated with the touches with a force detection sensor (fig. 5, item 502); discarding at least some of the touches as unintended inputs by comparing an area of the touches and a force of the touches (fig. 5, item 506; ¶30; ¶69); comparing the touches with an area of a palm (¶30, “touch sensing software of system 100 (described in more detail below) may be configured to recognize unintended touch region 110 as a palm touch based on the size and/or shape of the detected touch area on the screen, or characteristics of screen deformation (e.g., slope)”); comparing touches with force of the palm resting on the touch detection surface (¶27, “A stronger force applied to touch screen 102 by a physical touch results in a broader (e.g., wider) deformation of touch screen 102 than from a weaker force touch. The broader area of deformation may extend beyond the surface area of detected physical contact with touch screen 102 (e.g., beyond physical contact between a palm of a hand and touch screen 102)”; ¶29; ¶30, “touch sensing software of system 100 (described in more detail below) may be configured to recognize unintended touch region 110 as a palm touch based on the size and/or shape of the detected touch area on the screen, or characteristics of screen deformation (e.g., slope)”); and discarding the touches when the touches have the area and the force of the palm (¶30, “the magnitudes of various touch intensity signals received via one or more of touch sensors 114 may provide information indicating a three-dimensional (3D) form of unintended touch region 110, where the 3D form extends into the depth of touch screen 102 (see FIGS. 2A and 2B and FIGS. 3A and 3B). The 3D form of unintended touch region 110 may correspond, to some degree, to the 3D form of a palm and/or arm that touches and deforms touch screen 102. The width of the 3D form may depend on the amount of force applied (i.e., palm strength) to the surface of touch screen 102”).
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined method of Zhang, Burks, Jung, and Hotelling, such that the method further comprises: comparing the touches with force of the palm resting on the palm rest capacitive touch detection surface; and discarding the touches when the touches have the area and the force of the palm, as taught by Shirley so as to reduce detection of ghost touch signal around the area of a detected palm touch and reduce impairments in the performance of the touch screen (¶22).
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang, Burks, and Jung as applied to claim 17 above, and further in view of Ligtenberg.
With respect to Claim 20, claim 17 is incorporated, Zhang teaches further comprising: plural haptic devices (fig. 2A, item 206; fig. 9, item 912; ¶124; ¶130, “one or more haptic actuator(s)”) at the capacitive touch detection surface and operable to generate localized vibrations as haptic feedback at the capacitive touch detection surface (¶62-63; ¶130, “haptic outputs may be local or global”).
Although Zhang teaches plural haptic devices, Zhang, Burks, and Jung combined do not mention plural haptic devices disposed in a spaced manner; nor do Zhang and Burks teach instructions that cause configuration of the plural haptic devices to the active area of less than all of the capacitive touch area when selected at the user interface.
Ligtenberg teaches an information handling system (fig. 1A, item 100; ¶124; fig. 16F, item 1600: computing device; fig. 50, item 5000; ¶619) comprising: a housing (fig. 16A, item 1602: top case and 1604: bottom case; ¶334); a processor (fig. 50, item 5002; ¶151, “The base portion 104 may also include components 208 within the interior volume, such as processors, memory devices, circuit boards, input/output devices, haptic actuators, wired and/or wireless communication devices, communication ports, disk drives, and the like”; ¶376; ¶619-621) disposed in the housing and operable to process information/a method; a memory (fig. 50, item 5004; ¶622) disposed in the housing and interfaced with the processor, the memory operable to store the information/method; a main display (fig. 1A, item 102; fig. 16F, item 1603; fig. 50, item 5020; ¶125; ¶334; ¶620; ¶622) coupled to the housing and interfaced with the processor, the main display operable to present the information as visual images (¶125); a keyboard (fig. 16F, item 1614 comprising: 1616 and 1618) coupled to the housing and operable to accept end user inputs as mechanical key presses (¶343-344); a capacitive touch detection surface (fig. 16F, item 1610; fig. 50, item 5006; ¶336-337; ¶623; ¶625) coupled to the housing adjacent the keyboard and interfaced with the processor, the capacitive touch detection surface operable to detect touches as inputs (¶336); and a non-transient memory (fig. 50, item 5004; ¶620; ¶622) storing instructions that when executed cause: presentation of a user interface at the main display (¶620, “the instructions may be configured to control or coordinate the operation of one or more displays 5020, one or more touch sensors 5006, one or more force sensors 5008, one or more communication channels 5010, and/or one or more haptic feedback devices 5012”) and to define an active area of the capacitive touch detection surface (¶337, “As one example, a display underlying the trackpad region 1610 may produce an image of a border (e.g., representing or replicating an image of a trackpad) that indicates where a user may provide touch inputs. As another example, the display may produce an image of a slider that a user can select and/or move to change a volume setting of the computing device 1600. These are merely some examples, and numerous other images and objects can be displayed, and inputs to the trackpad region 1610 may affect numerous settings and operations of the computing device 1600”); and configuration of the active area of the capacitive touch detection surface when selected at the user interface (¶337, “A display may be used, for example, to display input areas, buttons, keys, or other affordances. As one example, a display underlying the trackpad region 1610 may produce an image of a border (e.g., representing or replicating an image of a trackpad) that indicates where a user may provide touch inputs”); further comprising: plural haptic devices (fig. 30A, items 3022 and 3024) disposed in a spaced manner at the capacitive touch detection surface (¶452) and operable to generate localized vibrations as haptic feedback at the capacitive touch detection surface (fig. 10; ¶240, “Reinforcing members may also be included (or strategically omitted) to create haptic or tactile feedback regions, such as by isolating haptic outputs from a particular haptic actuator or device to a localized region that is less than the entire top case of a device”; ¶434; ¶453; ¶454, “The haptic output produced by the electromagnetic actuators 3022, 3024 may be detectable at any location in the trackpad region”); and instructions that cause configuration of the plural haptic devices to the active area of less than all of the capacitive touch detection surface when selected at the user interface (¶15, “a first haptic actuator configured to produce a first haptic output at a first area of the keyboard region, and a second haptic actuator configured to produce a second haptic output at a second area of the keyboard region that is different from the first area”; ¶136; ¶427; ¶452, “The electromagnetic actuators 3022, 3024, 3026, 3028 may include more than one type of actuator, each type configured to produce a different type of haptic feedback in response to a different event or action” – the plural haptic devices are disposed in the active area = trackpad region but occupy an overlay of less than all of the capacitive touch area; ¶453, “The haptic output produced by the electromagnetic actuators 3022, 3024 may be detectable at any location in the trackpad region” – when selected at the user interface).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the combined system of Zhang, Burks, and Jung such that plural haptic devices are disposed in a spaced manner, and to further comprise instructions that cause configuration of the plural haptic devices to the active area of less than all of the capacitive touch detection surface when selected at the user interface, as taught by Ligtenberg, so as to provide localized feedback (¶240).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DONNA V Bocar, whose telephone number is (571) 272-0955. The examiner can normally be reached Monday through Friday, 8:30 am to 5:00 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amr A Awad, can be reached at (571) 272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DONNA V Bocar/Examiner, Art Unit 2621