DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Chen, U.S. Patent Publication No. 2012/0218185 (hereinafter Chen), in view of Zuber et al., U.S. Patent Publication No. 2018/0088686 (hereinafter Zuber), and further in view of Metelius, U.S. Patent Publication No. 2021/0373757 (hereinafter Metelius).
Consider claim 1, Chen teaches an input device, comprising: a housing having a curved top surface and an interior surface (Figure 5, element 3), the interior surface defining an internal volume (Figure 5); a touch sensor assembly including an array of capacitive sensing elements disposed against the interior surface (Figure 5 and [0037], 341-345); an orientation sensor disposed in the internal volume (Figure 5, displacement sensing element 32); and a force sensor assembly configured to detect a direction of a force exerted on the curved top surface of the housing ([0035] and figure 7, pressing signal PS1 and PS2).
Chen does not appear to specifically disclose an angular orientation sensor, or a force sensor that detects a direction and a magnitude of a force.
However, in a related field of endeavor, Zuber teaches a domed input assembly (abstract) and further teaches an angular orientation sensor ([0062], inertial measurement unit (“IMU”) sensor input component(s) 210 of assembly 200 may be operative to detect physical rotation of dome structure 201d with respect to base structure 201b; [0060] and [0036], rotate housing 45° CW, and angular rate sensors as motion sensors) and a force sensor that detects a direction and a magnitude of a force ([0052], physical use may be detected as force pressing (and corresponding magnitude) by a portion of user U (e.g., a fingertip or multiple fingertips) downward (and corresponding direction) into surface 201s of assembly 200; [0036], pressure sensor).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to provide the angular orientation sensor and force sensors as taught by Zuber, with the benefit that appropriate control data may be determined based on the detected physical use, as suggested in [0052]. Furthermore, each input component can be configured to provide one or more dedicated control functions for making selections or issuing commands associated with operating the device, as suggested in [0036].
Chen does not appear to specifically disclose detect a tilt direction and a magnitude of a force exerted on the housing at least while the force is applied to a single contact region on the housing, the tilt direction of the force being non-orthogonal to a bottom surface of the housing configured to be parallel to a support surface of the housing.
However, in a related field of endeavor, Metelius teaches a manually interactive device in figure 1, and further teaches detect a tilt direction and a magnitude of a force exerted on the housing by a tilted finger at least while the force is applied to a single contact region on the housing by the tilted finger (Figures 8-10 and [0051], see finger), the tilt direction of the force being non-orthogonal to a bottom surface of the housing configured to be parallel to a support surface of the housing (Figures 9-10, see direction of the finger).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to detect a tilt direction as taught by Metelius in order to provide a virtual joystick and/or derive functional information based on the direction in which the finger has tilted, as suggested in [0055].
Consider claim 2, Chen, Zuber and Metelius teach all the limitations of claim 1. In addition, Chen teaches the touch sensor assembly is configured to: detect a first hand position of a user touching the housing based on a first set of capacitive sensing elements detecting contact between the hand and the housing (Figure 8 and [0042]); and detect a second hand position of a user touching the housing based on a second set of capacitive sensing elements detecting contact between the hand and the housing (Figure 9 and [0044-0045]); and the orientation sensor detects an orientation of the input device in response to detecting the first hand position (Figure 8 and [0042], displacement); and the force sensor assembly detects the direction of the force exerted on the housing in response to detecting the second hand position (Figure 9 and [0045], clicking and pressing signal). Furthermore, Metelius teaches contact at single contact region and detects the tilt direction (Figures 8-10 and [0051], see finger, see motivation to combine above).
Chen does not appear to specifically disclose an angular orientation sensor that detects rotation of the input device.
However, in a related field of endeavor, Zuber teaches an input assembly (e.g., a mouse) in [0030] and further teaches an angular orientation sensor that detects rotation of the input device ([0054], Table 1: R9-R10, rotation of assembly; [0062], inertial measurement unit (“IMU”) sensor input component(s) 210 of assembly 200 may be operative to detect physical rotation of dome structure 201d with respect to base structure 201b; [0060] and [0036], rotate housing 45° CW, and angular rate sensors as motion sensors).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to detect rotation of the input device as taught by Zuber in order to make the cursor brighter or darker, as suggested in Table 1.
Consider claim 3, Chen, Zuber and Metelius teach all the limitations of claim 2. In addition, Chen teaches the touch sensor assembly includes two sensor elements disposed on the interior surface (Figure 5, 341-345).
Consider claim 4, Chen, Zuber and Metelius teach all the limitations of claim 1. In addition, Chen teaches wherein the force sensor assembly includes two force sensors (Figure 7, sensors corresponding to PS1/F1 and PS2/F2).
Consider claim 5, Chen, Zuber and Metelius teach all the limitations of claim 1.
Chen does not appear to specifically disclose wherein the orientation sensor includes an inertial measurement unit (IMU).
However, Zuber teaches wherein the orientation sensor includes an inertial measurement unit (IMU) ([0036], angular rate or inertial sensor; [0062], inertial measurement unit (“IMU”) sensor input component(s) 210 of assembly 200 may be operative to detect physical rotation of dome structure 201d with respect to base structure 201b; [0060] and [0036], rotate housing 45° CW, and angular rate sensors as motion sensors).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to provide an IMU as taught by Zuber in order to provide dedicated control functions for making selections or issuing commands associated with operating the device, as taught by Zuber in [0036].
Consider claim 6, Chen, Zuber and Metelius teach all the limitations of claim 1.
Chen does not appear to specifically disclose a feedback module.
However, Zuber teaches a feedback module in [0062].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to provide a feedback module as taught by Zuber, with the benefit that any suitable haptic and/or audible and/or visual feedback may be provided by any suitable output component(s) 212 of assembly 200 to help user U confidently interact with system 1, as suggested in [0062].
Consider claim 7, Chen, Zuber and Metelius teach all the limitations of claim 6. In addition, Zuber teaches wherein the feedback module includes at least one of: a haptic mechanism; a light; or a speaker ([0062], see motivation to combine in claim 6).
Consider claim 8, Chen, Zuber and Metelius teach all the limitations of claim 6. In addition, Chen teaches the housing is circular about a central axis (Figures 5-6).
Consider claim 9, Chen, Zuber and Metelius teach all the limitations of claim 6. In addition, Chen teaches wherein: the input device further comprises a processor electrically coupled to the touch sensor assembly (Figure 5 and [0037], control unit 33); and the processor is configured to determine an intended orientation of the housing based on a hand position of a user detected by the touch sensor assembly (Figures 8-9, [0042] and [0044]).
Claim(s) 10-11 and 13-14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Zuber in view of Metelius and further in view of Rosenfeld et al., U.S. Patent Publication No. 2010/0245246 (hereinafter Rosenfeld).
Consider claim 10, Zuber teaches a mouse input system (Figure 1a), comprising: a housing including a base portion and a grip portion coupled to the base portion (Figure 1H, 201d and 201b); a plurality of touch sensors disposed on the grip portion ([0048], touch sensors that may be integrated into and/or under and/or about exterior surface 201s); a force sensor disposed on the base portion (Figure 1H, [0052], [0061], force sensors 210s); a processor in electrical communication with the plurality of touch sensors and the force sensor (Figure 1, [0036] and [0041], processor 202 and input component 210); and a memory component storing electronic instructions that, when executed by the processor, cause the processor to (Figure 1, memory 204): detect, using the plurality of touch sensors, that the grip portion is contacted by at least one finger in only a single contact region on the grip portion ([0062], single). In addition, Zuber teaches a force sensor (e.g., any pressure sensors, strain gauges) in [0036].
Zuber does not appear to specifically disclose detect, using the force sensor, a tilt direction of the at least one finger relative to the single contact region while the housing remains substantially stationary.
However, Metelius teaches detect, using the force sensor, a tilt direction of the at least one finger relative to the single contact region while the housing remains substantially stationary (Figures 8-10 and [0051], see finger).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to detect a tilt direction as taught by Metelius in order to provide a virtual joystick and/or derive functional information based on the direction in which the finger has tilted, as suggested in [0055].
Zuber does not appear to specifically disclose detecting the contact region in response to compression of the housing.
However, in a related field of endeavor, Rosenfeld teaches a mouse comprising a touch sensor in figure 1 and further teaches contact region in response to compression of the housing ([0045], slight deformation of the mouse and further refers to pressure sensors).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to compress or deform the mouse as taught by Rosenfeld, with the benefit that pressure sensing may allow the computer mouse controller to detect pressure signals that correspond to an actuation of a "left click", "right click", and other such "virtual button" actions that are actuated by mechanical buttons on some mice.
Consider claim 11, Zuber, Metelius and Rosenfeld teach all the limitations of claim 10. In addition, Zuber teaches wherein the plurality of touch sensors is configured to detect multiple positions of a hand contacting the grip portion ([0048], orientation of one or more body parts (e.g. hand)).
Consider claim 13, Zuber, Metelius and Rosenfeld teach all the limitations of claim 11. In addition, Zuber teaches an emitter (Figure 1, communications component) electrically coupled to the processor (Figure 1, 202, 204), wherein the emitter: in response to the plurality of touch sensors detecting a first hand position of the at least one finger, send a first signal including first information regarding a direction ([0052], force pressing by a portion of user U); and in response to the plurality of touch sensors detecting a second hand position of the at least one finger contacting the grip portion, send a second signal including second information regarding an angular orientation of the mouse ([0054], figure 1d, table 1: R9-R10 rotation of assembly). In addition, Metelius teaches a tilt direction (Figures 8-10 and [0051], see finger, see motivation to combine above).
Consider claim 14, Zuber, Metelius and Rosenfeld teach all the limitations of claim 13. In addition, Zuber teaches an orientation sensor electrically coupled to the processor ([0062], IMU).
Claim(s) 15 and 18-21 is/are rejected under 35 U.S.C. 103 as being unpatentable over Zuber in view of Metelius.
Consider claim 15, Zuber teaches a mouse, comprising (Figure 1a): a housing defining an exterior grip portion and an internal volume (Figure 1H, 201d), the exterior grip portion having a rounded upper input surface (Figures 1A-H, 201s); a sensor assembly disposed in the internal volume ([0048], touch sensors that may be integrated into and/or under and/or about exterior surface 201s); and an emitter electrically coupled to the sensor assembly (Figure 1 and [0044], communications component); wherein: in response to the sensor assembly detecting a first touch input on the housing (Figures 1A-H, 201s), the emitter sends a first signal including information regarding an angular position of the exterior grip portion while the housing rotates relative to a support surface, the first touch input including touch input at a set of touch input locations on the housing ([0054], Figure 1D, Table 1: R9-R10, rotation of assembly; [0060], For example, user U may physically rotate assembly 200 in the direction of arrow CW by 45° about axis IA on surface 5 from the orientation of FIG. 1D to the orientation of FIG. 2E… and such detected 45° CW rotation may be operative to generate particular command data 99 for controlling an interface of device 100 in a particular manner (e.g., control data based on the action of Rule R9 (e.g., make cursor 112c brighter))); and in response to the sensor assembly detecting a second touch input on the rounded upper input surface of the housing, the emitter sends a second signal including information regarding a direction of a force exerted on the housing from the second touch input while the housing is stationary relative to a support surface ([0052], force pressing by a portion of user U; for example, R5-R8 in Table 1 and corresponding control action; see also Figures 1D, 1E and [0057] for Rule R5). Furthermore, Zuber teaches single- or multi-finger clicks in [0062].
Zuber does not appear to specifically disclose the second touch input including a tilting touch input at a single location on the housing.
However, Metelius teaches the second touch input including a tilting touch input at a single location on the housing (Figures 8-10 and [0051], see finger).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to detect a tilt direction as taught by Metelius in order to provide a virtual joystick and/or derive functional information based on the direction in which the finger has tilted, as suggested in [0055].
Consider claim 18, Zuber and Metelius teach all the limitations of claim 15. In addition, Zuber teaches wherein the sensor assembly comprises: a force vector sensor (Figure 1H, [0052], [0061], force sensors 210s); a touch sensor array ([0048], touch sensors that may be integrated into and/or under and/or about exterior surface 201s); and an angular sensor ([0062], IMU).
Consider claim 19, Zuber and Metelius teach all the limitations of claim 18. In addition, Zuber teaches wherein the touch sensor array includes a plurality of capacitive sensing elements configured to detect the first touch input and the second touch input ([0062], capacitive touch sensor).
Consider claim 20, Zuber and Metelius teach all the limitations of claim 18. In addition, Zuber teaches wherein the force vector sensor includes a first force sensor disposed at a first location within the internal volume (Figure 1g, left 210s, 201d and 201b) and a second force sensor disposed at a second location within the internal volume (Figure 1g, right 210s, 201d and 201b).
Consider claim 21, Zuber and Metelius teach all the limitations of claim 10. In addition, Zuber teaches wherein the housing is circular, and wherein the single contact region is located at a circular housing ([0062], single).
Zuber does not appear to specifically disclose that the single contact region is located at a center of the housing.
However, Metelius teaches that the single contact region is located at a center of the housing (Figure 8).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to provide a center contact region, with the benefit that a process may be initiated by an initial manual interaction involving a relatively hard finger press, in the direction of arrow 801, orthogonal to the x-y plane, as suggested in [0053] and Figure 8. In addition, a virtual joystick can be provided, as suggested in [0057].
Response to Arguments
Applicant's arguments filed 02/03/2026 have been fully considered but they are not persuasive.
On page 10 of Applicant’s response, Applicant argues that “Metelius only detects force or pressure orthogonal to a bottom surface of a housing. Specifically, Metelius explains that force or pressure is measured in the "Z-dimension," which is perpendicular to the bottom surface of the device with screen 102. See, e.g., FIG. 8 and paras. [0027] and [0051]. The Z dimension is orthogonal to the input surface (i.e., the X and y dimensions, as shown in Metelius FIG. 8).” The Office respectfully disagrees for the following reasons.
Metelius teaches in [0051] that it is therefore possible to detect that a finger has…tilted in a particular direction. Figure 8 shows an orthogonal direction, but Figure 9 teaches a tilted (non-orthogonal) direction. [0055] teaches that the invention creates the possibility of providing a virtual joystick such that, having positioned a finger and applied pressure, as described with reference to FIG. 8, a finger may be tilted in the direction of arrow 901. The invention then seeks to derive functional information based on the direction in which the finger has tilted. This may create a movement, and the speed of this movement may be determined by the extent to which the device has been pushed, which is similar to Applicant’s publication [0098].
On page 10, Applicant argues that “Even if the Metelius reference teaches tilt sensing in the manner alleged by the Examiner, its alleged tilt sensing is not completed using a curved top surface of a housing. Rather, Metelius only discloses its alleged tilt sensing in connection with a flat”. The Office respectfully disagrees for the following reasons.
Examiner is not relying on Metelius for the purpose of providing a flat surface. As detailed above, Metelius is relied upon for its teaching of a tilted (non-orthogonal) direction.
On page 11, Applicant argues that “the Chen mouse 3 is domed and intended to be grasped with a full hand (as shown, for example, in FIGS. 8-9) and therefore does not facilitate input from tilted finger.” The Office respectfully disagrees for the following reasons.
Examiner is not relying on Chen for the purpose of a tilted gesture. As detailed above, Metelius teaches a tilted direction. Furthermore, Chen teaches that inputs can be provided to a flat surface or a curved surface in Figures 8-10.
On pages 11-12, Applicant argues “force sensing in response to compression of a housing in Zuber”. The Office respectfully disagrees for the following reasons.
Rosenfeld teaches this limitation in [0045], as detailed above. Consequently, these arguments have been considered but are not persuasive.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERTO W FLORES whose telephone number is (571)272-5512. The examiner can normally be reached Monday-Friday, 7am-4pm, EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, AMR A AWAD can be reached at (571)272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ROBERTO W FLORES/ Primary Examiner, Art Unit 2621