Prosecution Insights
Last updated: April 19, 2026
Application No. 18/289,969

Wearable Electronic Device, Electronic Device System and Methods Thereof

Status: Non-Final OA (§103)
Filed: Nov 08, 2023
Examiner: MICHAUD, ROBERT J
Art Unit: 2622
Tech Center: 2600 — Communications
Assignee: Telefonaktiebolaget LM Ericsson (publ)
OA Round: 5 (Non-Final)

Grant Probability: 83% (Favorable)
Projected OA Rounds: 5-6
Projected Time to Grant: 2y 2m
Grant Probability With Interview: 96%

Examiner Intelligence

Career Allow Rate: 83% — above average (494 granted / 593 resolved; +21.3% vs TC avg)
Interview Lift: +12.6% (moderate lift among resolved cases with an interview)
Avg Prosecution: 2y 2m — fast prosecutor
Career History: 614 total applications across all art units; 21 currently pending

Statute-Specific Performance

§101: 2.1% (-37.9% vs TC avg)
§103: 52.5% (+12.5% vs TC avg)
§102: 27.4% (-12.6% vs TC avg)
§112: 12.2% (-27.8% vs TC avg)

Tech Center averages are estimates. Based on career data from 593 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 11/03/25 have been fully considered but are not persuasive with regard to the prior art. However, in this response the Office has attempted to clarify the motivation to combine and to further identify and clarify the positions of the previous Office Action, in the hope of moving prosecution of the application forward. The applicant noted in said arguments that the cited prior art, Erivantcev, US Patent Application (2017/0308165), hereinafter "Erivantcev", Pelis et al., US Patent Application (2019/0250733), hereinafter "Pelis", and Speck et al., US Patent Application (2017/0083115), hereinafter "Speck", in combination does not teach claim 21 as previously amended and presented. The applicant continues: "The Office Action's analysis of claim 21 admits that the combination of Erivantcev and Pelis fails to teach that touch sensor data (from a touch sensor on a ring-shaped device) initiates a scrolling through items of a display while the scrolling is 'continued based on the motion sensor data' from a motion sensor on a wearable electronic device. (Office Action p. 14.) To be clear, the Office Action contends that Pelis discloses a 'page scroll,' controlled by the touch of a finger on a touch sensor, but admits that the combination of Erivantcev and Pelis does not disclose the specific combination of operations, controlled by the specific combination of sensor data, specified by the claim."

Response: The Office does not agree with the applicant's characterization of what the Office Action did or did not admit, and states that the document stands for itself and should not be paraphrased in a way that skews a response.
However, in an attempt to clarify the Office Action, the Office has redrafted the document with the applicant's concerns in mind, providing further clarification and justification of the use of the selected prior art and of the motivation to combine the prior art as used in the Office Action. These changes should moot the concerns the applicant expressed in its remarks.

The applicant further argues that: "Speck's paragraph 0109 simply teaches that a device might include 'at least one accelerometer and/or a gyroscope, by means of which spatial changes of position of the input device can be recorded.' No specific operations related to the use of this accelerometer or gyroscope are discussed."

Response: Since the inclusion of this paragraph in the Office Action has confused the applicant, the Office clarifies that paragraph 0109 was cited simply to show that Speck and Erivantcev are analogous art, as both use motion sensors embedded in a wearable to track movement (recording spatial changes of position). Since Erivantcev already disclosed motion sensors, the citation to paragraph 0109 has been removed from the Office Action.

Next, the applicant argues: "Speck's paragraph 0116 talks about a 'gradation of commands,' which may involve either 'the recording unit or via the motion sensors.' Here, Speck says that 'when there is a manipulation movement of one or more finger a virtual sensor is slowly scrolled, while when swiping with the entire hand, that is recorded with a motion sensor, a faster scrolling or browsing is triggered.' This also fails to disclose or suggest, whether considered alone or in combination with Erivantcev and/or Pelis, that scrolling is initiated based on touch sensor data (from a touch sensor on a ring-shaped device) and continued based on motion sensor data from a separate motion sensor on a wearable electronic device."
Response: As the Office Action stated, Speck is used to show that a motion of a hand is detected, tracked, and recorded, then converted to a gesture (i.e., a gesture to increase the scrolling speed of an already scrolling display), whereupon Speck increases the scrolling on a display. Speck is cited in the Office Action as "when swiping with the entire hand, that is recorded with a motion sensor, a faster scrolling or browsing is triggered. [Speck para 0116]". That a faster scrolling is triggered means, under the broadest reasonable interpretation ("BRI"), that scrolling was already occurring but a second gesture is recognized to increase the speed of scrolling. Speck, the Office finds, teaches the limitation "which is continued based on the motion sensor data" as presently presented. Erivantcev teaches a ring device with a touch sensor which recognizes gestures; Pelis teaches that a swipe gesture across a display will make a menu initiate scrolling; and, as discussed, Speck teaches a second gesture recorded by a motion sensor which triggers a faster scrolling operation.

Additionally, the applicant argues that "Nothing in Speck suggests using data from both a touch sensor, on a ring-shaped device, and a motion sensor on a separate, wearable electronic device, to first initiate the scrolling (based on the touch sensor data) and continue it (based on the motion sensor data)."

Response: The Office notes that the claim does not require two separate wearable devices, and the Office is not attempting to use Speck to disclose what the applicant is discussing. Speck simply teaches that a gesture motion captured by a motion sensor is recognized (which Erivantcev teaches) and said gesture of the hand makes a scrolling display scroll faster.
Lastly, the applicant argues: "Second, while Speck suggests that one can do either slow scrolling, using a finger motion, and fast scrolling, using a whole hand motion, Speck does not discuss at all how these two inputs might be combined and, again, fails to suggest that the scrolling is initiated based on data from one sensor, on one device, and then continued, based on data from another sensor, on another device."

Response: Input data such as gesture recognition, whether captured on a touch sensor or made in free space, is well-known technology. The sensors and processing devices used in the prior art produce predictable results which can be used as input signals to control an interface. It has been ruled that one of ordinary skill can combine such technology. Further, as noted in the previous Office Action, even the applicant discusses in its specification that its invention is given a scroll command and, after the display scrolling has become still, is given a second command detected by a motion sensor which continues the scrolling. The applicant states at paragraph [0236]: "After this the user may either start from the beginning of that song list with another tap or start scrolling through the songs by a swipe down/up on the electronic ring-shaped device 120." The applicant states at paragraph [0237]: "In an example there is a starting point of the arm where the content of the display 115, 135 is still. If the arm moves slightly to the left, the scrolling continues as long as the arm is in that position, and to stop the scrolling the arm returns to the starting position which is detected by the motion sensor 114 of the wearable electronic device 110." The Office finds that the applicant may continue scrolling even after the content of the display has stopped scrolling (i.e., is still) by initiating a second gesture for scrolling (i.e., moving the arm) which continues the scrolling.
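The two-phase interaction the parties are debating, where a touch gesture on the ring initiates scrolling and a subsequent arm movement sensed by the wearable continues it, can be sketched in code. The following is a hypothetical illustration of one possible reading of applicant's paragraphs [0236]-[0237]; the class, method names, and threshold value are invented for illustration and do not appear in the record.

```python
class ScrollController:
    """Hypothetical sketch of the claimed two-phase scroll control:
    touch sensor data (ring-shaped device) initiates scrolling, and
    motion sensor data (wearable electronic device) continues it."""

    def __init__(self):
        self.active = False  # True once a scroll session has been initiated

    def on_touch_swipe(self):
        # Touch sensor data from the ring initiates the scrolling
        # (cf. applicant's paragraph [0236]: swipe down/up on the ring).
        self.active = True
        return "scroll"

    def on_motion(self, arm_offset, threshold=0.1):
        # Motion sensor data from the wearable continues the scrolling
        # while the arm is held away from its starting position; when the
        # arm returns to the start, the display becomes still
        # (cf. applicant's paragraph [0237]).
        if self.active and abs(arm_offset) > threshold:
            return "scroll"
        return "still"
```

Under this reading, arm motion alone does nothing before a touch gesture has initiated a scroll session, which is the sequencing the applicant argues the combined references fail to teach.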
And the combined art as cited does discuss how "faster scrolling" is triggered, which under BRI indicates scrolling is continued by a second gesture detected on a second sensor causing the scrolling to continue. Combining the well-known technology and its teachings is within the skill of one of ordinary skill in the art, and as such the applicant's arguments are not persuasive. In review of the applicant's specification, the applicant discusses continuing to scroll as an action initiated by a second gesture after the contents of the display have become "still" from the initial touch sensor data initiating a scroll. The applicant states at paragraph [0236]: "After this the user may either start from the beginning of that song list with another tap or start scrolling through the songs by a swipe down/up on the electronic ring-shaped device 120." The applicant states at paragraph [0237]: "In an example there is a starting point of the arm where the content of the display 115, 135 is still. If the arm moves slightly to the left, the scrolling continues as long as the arm is in that position, and to stop the scrolling the arm returns to the starting position which is detected by the motion sensor 114 of the wearable electronic device 110." The Office finds that the applicant teaches that "continue scrolling" occurs even after the content of the display has stopped scrolling (i.e., is still) and that the continued scrolling is initiated by a second gesture for scrolling (i.e., moving the hand) which continues the scrolling. For the reasons discussed above, the Office has maintained its art rejections while clarifying its discussion and motivations to combine said art in an attempt to move prosecution of the application forward.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: "A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

Claims 21, 24, 26-27, 29-31, and 33-38 are rejected under 35 U.S.C. 103 as being unpatentable over Erivantcev, US Patent Application (2017/0308165), hereinafter "Erivantcev", Pelis et al., US Patent Application (2019/0250733), hereinafter "Pelis", and Speck et al., US Patent Application (2017/0083115), hereinafter "Speck".
Regarding claim 21, Erivantcev teaches a method, performed by a wearable electronic device, for controlling a user interface (UI): The hand-worn devices can be used for interaction with VR/AR applications implemented on mobile platforms for data input and/or control interface [Erivantcev para 0037]; wherein the wearable electronic device is configured to communicate with an electronic ring-shaped device comprising one or more touch sensors: one ring (104b) includes a touch pad (106) mounted on the ring (104b) [Erivantcev para 0043]; the method comprising: controlling the UI based on touch sensor data communicated from the electronic ring-shaped device: adapted to receive touch inputs activated by the thumb of the same hand of the user including one or more click buttons incorporated into the touch pad mount [Erivantcev para 0043]; and further based on motion sensor data from a motion sensor of the wearable electronic device: include multiple-sensor, hand-worn devices for motion capturing and positioning of hand and fingers in virtual reality ("VR") or augmented/mixed reality ("AR") applications (collectively, "VR/AR" applications) and controlling computing devices via gesture recognition.
[Erivantcev para 0037] The hand-worn devices can also be used to implement intuitive gestures that are standard on touch screens on smartphones and mouse-based graphical user interfaces, such as sliding, swiping, zooming, selecting, pointing, and clicking [Erivantcev para 0039]. However, Erivantcev does not explicitly teach (1) wherein controlling the UI comprises scrolling through items of the display, (2) wherein the touch sensor data initiates the scrolling, (3) which is continued based on the motion sensor data. Pelis teaches a known technique of using a single-finger swipe gesture, running a finger across the touch sensor to effect a page scroll, which teaches (1) wherein controlling the UI comprises scrolling through items of the display and (2) wherein the touch sensor data initiates the scrolling. More specifically, Pelis teaches: if the user launches an email application, the user may automatically be presented with a 'reader's' user interface such as a projected or displayed touch pad. The user may use the touch pad to interact with the email program to assist in reading, scrolling, moving to another email, etc. [Pelis para 0097] The touch sensor 220 may be used to take gesture-style input from the user. For example, the user may be able to take a single finger and run it across the touch sensor 220 to affect a page scroll. [Pelis para 0037] Erivantcev discloses a data input device and method of operating the same. In one embodiment, a data input device comprises a plurality of inertial sensor units, one or more touch input devices, a microcontroller configured to collect sensor data from the inertial sensors and the one or more touch input devices and process the sensor data to generate processed sensor data, and a wireless transceiver configured to transmit the processed sensor data to a host computer.
In another embodiment, a method comprises receiving sensor data from a handheld device; calculating hand movement characteristics in three-dimensional space based on the sensor data; calculating the position and orientation of the components of the handheld device; identifying positions and movements of one or more fingers of a user manipulating the handheld device; identifying a gesture from the positions and movements of one or more fingers of a user manipulating the handheld device; identifying a recognized gesture corresponding to the identified gesture; and dispatching an event notifying the gesture to an application. Pelis discloses user interface control of a head-worn computer which also receives and interprets control inputs such as gestures and movements. The head-worn computer works with external user interfaces used in connection with head-worn computing. Pelis discloses a well-known technique in which a menu is displayed and a finger swipe gesture across the touch sensor initiates scrolling through the menu; a user controlling an interface through gesture recognition of a movement improves the user's control of items displayed on the interface. Therefore, prior to the effective date of the invention it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev and Pelis in the art of touch and display data input devices and operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Pelis' teachings improve the user control and response of a touch and display interface by recognizing that the user is interacting with said interface and responding to a user swipe gesture upon the interface to cause scrolling through data presented on the interface.
The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art. One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art. Erivantcev and Pelis do teach (1) wherein controlling the UI comprises scrolling through items of the display and (2) wherein the touch sensor data initiates the scrolling. However, Erivantcev in view of Pelis does not teach (3) which is continued based on the motion sensor data. Speck teaches a user wearing an input device which is designed as a ring and can record data of the relative position of at least one finger of a hand with respect to the input device (i.e., including movement of the finger), and Speck discloses incorporating movements of a hand into an input. Speck, through its ability to incorporate hand movements into input controls, teaches "which is continued based on the motion sensor data." As discussed above, Erivantcev in view of Pelis teaches a ring input device and a touch sensor whereby, for example, when there is a manipulation movement of one or more fingers on a sensor, data presented on a display is scrolled, and this is well known in the art of controlling a display input and interface. Speck teaches that a gradation of commands, whether … or via the motion sensors, … while when swiping with the entire hand, that is recorded with a motion sensor, a faster scrolling or browsing is triggered.
[Speck para 0116] Therefore, prior to the effective date of the invention it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev, Pelis, and Speck in the art of touch and display data input devices and operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Speck improves the user's control of a touch and display interface by allowing the input device to learn hand movements which are recognized through motion sensors tracking hand movements and then converted into simple gestures to control the data input and response of a touch and display interface. The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art. One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art.

Regarding claim 24, Erivantcev, Pelis, and Speck teach everything in claim 21; in addition, Erivantcev teaches wherein the touch sensor data comprises gesture data: System software on the host computer (114) recognizes gestures from the input data and performs actions that are triggered by the gestures in application software. [Erivantcev para 0063] the method detects a recognized gesture or touch pad movement. As described above, in step 315, the method detects all or substantially all gestures of the user.
[Erivantcev para 0085]

Regarding claim 27, Erivantcev, Pelis, and Speck teach everything in claim 21; in addition, Erivantcev teaches wherein the motion sensor of the wearable electronic device is any of an Inertial Measurement Unit (IMU): a data input device (100) includes inertial sensors (e.g., 102a-c) mounted inside … an inertial sensor comprises a micro-electromechanical system (MEMS) inertial measurement unit (IMU) [Erivantcev para 0042]; and a magnetic sensor: An IMU (e.g., 102a-n) … a magnetometer configured to measure the magnitude and direction of the magnetic field at its location in space. [Erivantcev para 0042]

Regarding claim 29, Erivantcev, Pelis, and Speck teach everything in claim 21; in addition, Erivantcev teaches wherein the UI is a UI of a further electronic device other than the wearable electronic device and the electronic ring-shaped device: Application software implements the application-specific function and usually has a graphical interface. For example, application software may implement a virtual reality (or augmented/mixed reality) game. [Erivantcev para 0042]

Regarding claim 30, Erivantcev, Pelis, and Speck teach everything in claim 21; in addition, Erivantcev teaches wherein the further electronic device is any of an Internet-of-Things device, a television, a computer, a mobile phone, an audio player, a headset, a light system, a thermostat, and a drone: the input device of FIG. 4 includes a communication device … the processed data from the microcontroller to a host computing device (e.g., a smartphone, a personal computer, a laptop computer, a game console, a virtual reality headset).
[Erivantcev para 0042]

Regarding claim 31, Erivantcev teaches a wearable electronic device configured to control a graphical user interface (UI): The hand-worn devices can be used for interaction with VR/AR applications implemented on mobile platforms for data input and/or control interface [Erivantcev para 0037]; wherein the wearable electronic device comprises a motion sensor: include multiple-sensor, hand-worn devices for motion capturing and positioning of hand and fingers in virtual reality ("VR") or augmented/mixed reality ("AR") applications (collectively, "VR/AR" applications) and controlling computing devices via gesture recognition [Erivantcev para 0037]; and is further configured to: communicate with an electronic ring-shaped device comprising one or more touch sensors: one ring (104b) includes a touch pad (106) mounted on the ring (104b) [Erivantcev para 0043]; control the UI based on touch sensor data communicated from the electronic ring-shaped device: adapted to receive touch inputs activated by the thumb of the same hand of the user including one or more click buttons incorporated into the touch pad mount [Erivantcev para 0043]; and further based on motion sensor data from a motion sensor: include multiple-sensor, hand-worn devices for motion capturing and positioning of hand and fingers in virtual reality ("VR") or augmented/mixed reality ("AR") applications (collectively, "VR/AR" applications) and controlling computing devices via gesture recognition.
[Erivantcev para 0037] The hand-worn devices can also be used to implement intuitive gestures that are standard on touch screens on smartphones and mouse-based graphical user interfaces, such as sliding, swiping, zooming, selecting, pointing, and clicking [Erivantcev para 0039]. However, Erivantcev does not explicitly teach (1) wherein said controlling the UI comprises scrolling through items of the display, (2) wherein the touch sensor data initiates the scrolling, (3) which is continued based on the motion sensor data. Pelis teaches a known technique of using a single-finger swipe gesture, running a finger across the touch sensor to effect a page scroll, which teaches (1) wherein controlling the UI comprises scrolling through items of the display and (2) wherein the touch sensor data initiates the scrolling. More specifically, Pelis teaches: if the user launches an email application, the user may automatically be presented with a 'reader's' user interface such as a projected or displayed touch pad. The user may use the touch pad to interact with the email program to assist in reading, scrolling, moving to another email, etc. [Pelis para 0097] The touch sensor 220 may be used to take gesture-style input from the user. For example, the user may be able to take a single finger and run it across the touch sensor 220 to affect a page scroll. [Pelis para 0037] Erivantcev discloses a data input device and method of operating the same. In one embodiment, a data input device comprises a plurality of inertial sensor units, one or more touch input devices, a microcontroller configured to collect sensor data from the inertial sensors and the one or more touch input devices and process the sensor data to generate processed sensor data, and a wireless transceiver configured to transmit the processed sensor data to a host computer.
In another embodiment, a method comprises receiving sensor data from a handheld device; calculating hand movement characteristics in three-dimensional space based on the sensor data; calculating the position and orientation of the components of the handheld device; identifying positions and movements of one or more fingers of a user manipulating the handheld device; identifying a gesture from the positions and movements of one or more fingers of a user manipulating the handheld device; identifying a recognized gesture corresponding to the identified gesture; and dispatching an event notifying the gesture to an application. Pelis discloses user interface control of a head-worn computer which also receives and interprets control inputs such as gestures and movements. The head-worn computer works with external user interfaces used in connection with head-worn computing. Pelis discloses a well-known technique in which a menu is displayed and a finger swipe gesture across the touch sensor initiates scrolling through the menu; a user controlling an interface through gesture recognition of a movement improves the user's control of items displayed on the interface. Therefore, prior to the effective date of the invention it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev and Pelis in the art of touch and display data input devices and operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Pelis' teachings improve the user control and response of a touch and display interface by recognizing that the user is interacting with said interface and responding to a user swipe gesture upon the interface to cause scrolling through data presented on the interface.
The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art. One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art. Erivantcev and Pelis do teach (1) wherein controlling the UI comprises scrolling through items of the display and (2) wherein the touch sensor data initiates the scrolling. However, Erivantcev in view of Pelis does not teach (3) which is continued based on the motion sensor data. Speck teaches a user wearing an input device which is designed as a ring and can record data of the relative position of at least one finger of a hand with respect to the input device (i.e., including movement of the finger), and Speck discloses incorporating movements of a hand into an input. Speck, through its ability to incorporate hand movements into input controls, teaches "which is continued based on the motion sensor data." As discussed above, Erivantcev in view of Pelis teaches a ring input device and a touch sensor whereby, for example, when there is a manipulation movement of one or more fingers on a sensor, data presented on a display is scrolled, and this is well known in the art of controlling a display input and interface. Speck teaches that a gradation of commands, whether … or via the motion sensors, … while when swiping with the entire hand, that is recorded with a motion sensor, a faster scrolling or browsing is triggered.
[Speck para 0116] Therefore, prior to the effective date of the invention it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev, Pelis, and Speck in the art of touch and display data input devices and operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Speck improves the user's control of a touch and display interface by allowing the input device to learn hand movements which are recognized through motion sensors tracking hand movements and then converted into simple gestures to control the data input and response of a touch and display interface. The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art. One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art.

Regarding claim 33, Erivantcev, Pelis, and Speck teach everything in claim 31; in addition, Erivantcev teaches comprising any of a watch, a smart band, a mobile phone, a headset, electronic clothing, and electronic eyewear: the input device of FIG. 4 includes a communication device … the processed data from the microcontroller to a host computing device (e.g., a smartphone, a personal computer, a laptop computer, a game console, a virtual reality headset). [Erivantcev para 0042]

Regarding claim 34, Erivantcev, Pelis, and Speck teach everything in claim 31; in addition, Erivantcev teaches wherein the wearable electronic device is configured to be worn on a wrist of an arm.
the processed sensor data comprises one or more of gyroscope vector data, acceleration vector data, quaternion rotation data, spatial coordinate data of a hand or wrist, and touch data from the one or more touch input devices. [Erivantcev para 0016]

Regarding claim 35, Erivantcev teaches a method, performed by an electronic device system comprising a wearable electronic device: The hand-worn devices can be used for interaction with VR/AR applications implemented on mobile platforms for data input and/or control interface [Erivantcev para 0037]; and an electronic ring-shaped device: one ring (104b) includes a touch pad (106) mounted on the ring (104b) [Erivantcev para 0043]; and wherein the wearable electronic device comprises a motion sensor: include multiple-sensor, hand-worn devices for motion capturing and positioning of hand and fingers in virtual reality ("VR") or augmented/mixed reality ("AR") applications (collectively, "VR/AR" applications) and controlling computing devices via gesture recognition [Erivantcev para 0037]; and wherein the electronic ring-shaped device comprises one or more touch sensors: one ring (104b) includes a touch pad (106) mounted on the ring (104b) [Erivantcev para 0043]; the method comprising: communicating, by the electronic ring-shaped device, touch sensor data from the one or more touch sensors to the wearable electronic device: adapted to receive touch inputs activated by the thumb of the same hand of the user including one or more click buttons incorporated into the touch pad mount [Erivantcev para 0043]; and controlling a user interface (UI), by the wearable electronic device, based on the touch sensor data and further based on motion sensor data from the motion sensor: controlling computing devices via gesture recognition.
[Erivantcev para 0037] The hand-worn devices can also be used to implement intuitive gestures that are standard on touch screens on smartphones and mouse-based graphical user interfaces, such as sliding, swiping, zooming, selecting, pointing, and clicking [Erivantcev para 0039]. However, Erivantcev does not explicitly teach (1) wherein controlling the UI comprises scrolling through items of the display, (2) wherein the touch sensor data initiates the scrolling, (3) which is continued based on the motion sensor data. Pelis teaches a known technique of using a single-finger swipe gesture, running a finger across the touch sensor to effect a page scroll, which teaches (1) wherein controlling the UI comprises scrolling through items of the display and (2) wherein the touch sensor data initiates the scrolling. More specifically, Pelis teaches: if the user launches an email application, the user may automatically be presented with a 'reader's' user interface such as a projected or displayed touch pad. The user may use the touch pad to interact with the email program to assist in reading, scrolling, moving to another email, etc. [Pelis para 0097] The touch sensor 220 may be used to take gesture-style input from the user. For example, the user may be able to take a single finger and run it across the touch sensor 220 to affect a page scroll. [Pelis para 0037] Erivantcev discloses a data input device and method of operating the same. In one embodiment, a data input device comprises a plurality of inertial sensor units, one or more touch input devices, a microcontroller configured to collect sensor data from the inertial sensors and the one or more touch input devices and process the sensor data to generate processed sensor data, and a wireless transceiver configured to transmit the processed sensor data to a host computer.
In another embodiment, a method comprises: receiving sensor data from a handheld device; calculating hand movement characteristics in three-dimensional space based on the sensor data; calculating the position and orientation of the components of the handheld device; identifying positions and movements of one or more fingers of a user manipulating the handheld device; identifying a gesture from those positions and movements; identifying a recognized gesture corresponding to the identified gesture; and dispatching an event notifying an application of the gesture.

Pelis discloses user interface control of a head-worn computer, which also receives and interprets control inputs such as gestures and movements. The head-worn computer interoperates with external user interfaces used in connection with head-worn computing. Pelis discloses the well-known technique of displaying a menu and then using a finger swipe gesture across the touch sensor to initiate scrolling through the menu; a user controlling an interface through gesture recognition of a movement improves the user's control of items displayed on the interface. Therefore, prior to the effective date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev and Pelis in the art of touch-and-display data input devices and methods of operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Pelis' teachings improve the user control and responsiveness of a touch-and-display interface by recognizing that the user is interacting with the interface and responding to a user swipe gesture upon the interface by scrolling through the data presented on the interface.
The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art. One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art.

Erivantcev and Pelis do teach (1) wherein controlling the UI comprises scrolling through items of the display and (2) wherein the touch sensor data initiates the scrolling. However, Erivantcev in view of Pelis does not teach (3) wherein the scrolling is continued based on the motion sensor data. Speck teaches a user wearing an input device that is designed as a ring and can record data of the relative position of at least one finger of a hand with respect to the input device (i.e., including movement of the finger), and Speck discloses incorporating movements of a hand into an input. Through its ability to incorporate hand movements into input controls, Speck teaches scrolling that is continued based on the motion sensor data. As discussed above, Erivantcev in view of Pelis teaches a ring input device and a touch sensor by which, for example, when there is a manipulation movement of one or more fingers on a sensor, data presented on a display is scrolled, and this is well known in the art of controlling a display input and interface. Speck teaches that "a gradation of commands, whether … or via the motion sensors, … while when swiping with the entire hand, that is recorded with a motion sensor, a faster scrolling or browsing is triggered." [Speck para 0116] Therefore, prior to the effective date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev, Pelis and Speck in the art of touch-and-display data input devices and methods of operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Speck improves the user's control of a touch-and-display interface by allowing the input device to learn hand movements, which are recognized through motion sensors tracking hand movements and then converted into simple gestures to control the data input and response of a touch-and-display interface. The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art. One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art.

Regarding claim 36, Erivantcev teaches an electronic device system comprising a wearable electronic device ("The hand-worn devices can be used for interaction with VR/AR applications implemented on mobile platforms for data input and/or control interface" [Erivantcev para 0037]) and an electronic ring-shaped device ("one ring (104b) includes a touch pad (106) mounted on the ring (104b)" [Erivantcev para 0043]), wherein the wearable electronic device comprises a motion sensor ("include multiple-sensor, hand-worn devices for motion capturing and positioning of hand and fingers in virtual reality (“VR”) or augmented/mixed reality (“AR”) applications (collectively, “VR/AR” applications) and controlling computing devices via gesture recognition" [Erivantcev para 0037]), and wherein the electronic ring-shaped device comprises one or more touch sensors ("one ring (104b) includes a touch pad (106) mounted on the ring (104b)" [Erivantcev para 0043]), wherein the electronic device system is configured to: communicate, by the electronic ring-shaped device, touch sensor data from the one or more touch sensors to the wearable electronic device ("adapted to receive touch inputs activated by the thumb of the same hand of the user including one or more click buttons incorporated into the touch pad mount" [Erivantcev para 0043]); and control a user interface (UI), by the wearable electronic device, based on the touch sensor data and further based on motion sensor data from the motion sensor ("include multiple-sensor, hand-worn devices for motion capturing and positioning of hand and fingers in virtual reality (“VR”) or augmented/mixed reality (“AR”) applications (collectively, “VR/AR” applications) and controlling computing devices via gesture recognition" [Erivantcev para 0037]).

However, Erivantcev does not explicitly teach (1) wherein said controlling the UI comprises scrolling through items of the display, (2) wherein the touch sensor data initiates the scrolling, or (3) wherein the scrolling is continued based on the motion sensor data. Pelis teaches a known technique of using a single-finger swipe gesture, running a finger across the touch sensor to effect a page scroll, which teaches (1) wherein controlling the UI comprises scrolling through items of the display and (2) wherein the touch sensor data initiates the scrolling. More specifically, Pelis teaches: "if the user launches an email application, the user may automatically be presented with a ‘reader's’ user interface such as a projected or displayed touch pad. The user may use the touch pad to interact with the email program to assist in reading, scrolling, moving to another email, etc." [Pelis para 0097] "The touch sensor 220 may be used to take gesture style input from the user. For example, the user may be able to take a single finger and run it across the touch sensor 220 to affect a page scroll." [Pelis para 0037]

Erivantcev discloses a data input device and a method of operating the same. In one embodiment, the data input device comprises a plurality of inertial sensor units, one or more touch input devices, a microcontroller configured to collect sensor data from the inertial sensors and the one or more touch input devices and to process the sensor data to generate processed sensor data, and a wireless transceiver configured to transmit the processed sensor data to a host computer. In another embodiment, a method comprises: receiving sensor data from a handheld device; calculating hand movement characteristics in three-dimensional space based on the sensor data; calculating the position and orientation of the components of the handheld device; identifying positions and movements of one or more fingers of a user manipulating the handheld device; identifying a gesture from those positions and movements; identifying a recognized gesture corresponding to the identified gesture; and dispatching an event notifying an application of the gesture.

Pelis discloses user interface control of a head-worn computer, which also receives and interprets control inputs such as gestures and movements. The head-worn computer interoperates with external user interfaces used in connection with head-worn computing. Pelis discloses the well-known technique of displaying a menu and then using a finger swipe gesture across the touch sensor to initiate scrolling through the menu; a user controlling an interface through gesture recognition of a movement improves the user's control of items displayed on the interface.
Therefore, prior to the effective date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev and Pelis in the art of touch-and-display data input devices and methods of operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Pelis' teachings improve the user control and responsiveness of a touch-and-display interface by recognizing that the user is interacting with the interface and responding to a user swipe gesture upon the interface by scrolling through the data presented on the interface. The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art. One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art.

Erivantcev and Pelis do teach (1) wherein controlling the UI comprises scrolling through items of the display and (2) wherein the touch sensor data initiates the scrolling. However, Erivantcev in view of Pelis does not teach (3) wherein the scrolling is continued based on the motion sensor data. Speck teaches a user wearing an input device that is designed as a ring and can record data of the relative position of at least one finger of a hand with respect to the input device (i.e., tracking the movement of a hand). Speck discloses incorporating movements of a hand into an input. Through its ability to incorporate hand movements into input controls, Speck teaches continuing an action, such as scrolling, based on the motion sensor data.
Speck uses well-known techniques to interpret and convert tracked hand movement into input gestures that can make the scrolling occurring on a display faster through gesture recognition. As discussed above, Erivantcev in view of Pelis teaches a ring input device and a touch sensor by which, for example, when there is a manipulation movement of one or more fingers on a sensor, data presented on a display is scrolled, and this is well known in the art of controlling a display input and interface. Speck teaches that "a gradation of commands, whether … or via the motion sensors, … while when swiping with the entire hand, that is recorded with a motion sensor, a faster scrolling or browsing is triggered." [Speck para 0116] Speck, through the use of motion captured by a motion sensor, makes scrolling faster, which is to say that a second user gesture can increase the speed of the scrolling already occurring on the display, as faster scrolling can only occur if scrolling was already occurring. Therefore, prior to the effective date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev, Pelis and Speck in the art of touch-and-display data input devices and methods of operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Speck improves the user's control of a touch-and-display interface by allowing the input device to learn hand movements, which are recognized through motion sensors tracking hand movements and then converted into simple gestures to control the data input and response of a touch-and-display interface. The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art.
One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art.

Claims 22, 23, 25, 28, 32 and 39 are rejected under 35 U.S.C. 103 as being unpatentable over Erivantcev, Pelis and Speck, and further in view of Vescovi et al., US Patent Application (20150277559), hereinafter "Vescovi".

Regarding claim 22, Erivantcev, Pelis and Speck teach everything in claim 21. In addition, Speck teaches that controlling is based on a second touch sensor data communicated from the electronic ring-shaped device: "a gradation of commands, whether via the recording unit or via the motion sensors, is possible. For example, when there is a manipulation movement of one or more finger a virtual sensor is slowly scrolled, while when swiping with the entire hand, that is recorded with a motion sensor, a faster scrolling or browsing is triggered." [Speck para 0116] (scrolling faster is also continuing scrolling) "the input device furthermore comprises at least one accelerometer and/or a gyroscope, by means of which spatial changes of position of the input device can be recorded." [Speck para 0109] Erivantcev, Pelis and Speck do not expressly teach, but Vescovi teaches, triggering controlling the UI based on the touch sensor data and further based on the motion sensor data, wherein the triggering is based on a first touch sensor data communicated from the electronic ring-shaped device: "the response of the device to inputs detected by the device depends on criteria based on the contact intensity during the input. For example, for some “light press” inputs, the intensity of a contact exceeding a first intensity threshold during the input triggers a first response." [Vescovi para 0129]

Vescovi discloses known hardware and user controls of an external electronic device with a finger-ring-mounted touchscreen that includes a computer processor, a wireless transceiver, and a rechargeable power source; the ring, worn on a first finger, receives an input from a second finger, selects one of a plurality of touch events associated with the input, and wirelessly transmits a command associated with the touch event to the external electronic device. Therefore, prior to the effective date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev, Pelis, Speck and Vescovi in the art of touch-and-display data input devices and methods of operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Vescovi improves touch detection accuracy by using the ring computing device to automatically detect whether it is being worn on the user's right or left hand and on which finger, and this information may be used to improve the accuracy of the detection. The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art. One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art.

Regarding claim 23, Erivantcev, Pelis and Speck teach everything in claim 21. They do not teach, but Vescovi teaches, further comprising: activating a function of the UI based on further touch sensor data from the electronic ring-shaped device.
"the touch screen, device 700 optionally includes a touchpad (not shown) for activating or deactivating particular functions." [Vescovi para 0055]

Regarding claim 25, Erivantcev, Pelis and Speck teach everything in claim 21. They do not teach, but Vescovi teaches, wherein controlling the UI comprises moving a cursor on a display of the UI and/or moving a content of the display: "the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user." [Vescovi para 0054]

Regarding claim 28, Erivantcev, Pelis and Speck teach everything in claim 21. They do not teach, but Vescovi teaches, wherein the UI is a graphical UI (GUI) of the wearable electronic device: "GUI updater 778 prepares display information and sends it to graphics module 732 for display on a touch-sensitive display." [Vescovi para 0117]

Regarding claim 32, Erivantcev, Pelis and Speck teach everything in claim 31. They do not teach, but Vescovi teaches, being further configured to trigger controlling the UI based on the touch sensor data and further based on the motion sensor data ("Use of the ring computing device … and/or paired computer to disable the computer touch pad to thereby prevent errant gestures made thereto." [Vescovi para 0161]), wherein the triggering is based on a first touch sensor data communicated from the electronic ring-shaped device ("the response of the device to inputs detected by the device depends on criteria based on the contact intensity during the input. For example, for some “light press” inputs, the intensity of a contact exceeding a first intensity threshold during the input triggers a first response." [Vescovi para 0129]), and wherein controlling is based on a second touch sensor data communicated from the electronic ring-shaped device.
"for some “deep press” inputs, the intensity of a contact exceeding a second intensity threshold during the input, greater than the first intensity threshold for a light press, triggers a second response only if a delay time has elapsed between meeting the first intensity threshold and meeting the second intensity threshold." [Vescovi para 0129]

Vescovi discloses known hardware and user controls of an external electronic device with a finger-ring-mounted touchscreen that includes a computer processor, a wireless transceiver, and a rechargeable power source; the ring, worn on a first finger, receives an input from a second finger, selects one of a plurality of touch events associated with the input, and wirelessly transmits a command associated with the touch event to the external electronic device. Therefore, prior to the effective date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Erivantcev, Pelis, Speck and Vescovi in the art of touch-and-display data input devices and methods of operating the same, as one of ordinary skill in the art would have recognized that the results of the combination were predictable because the combined teachings and technologies were well known in the art. Vescovi improves touch detection accuracy by using the ring computing device to automatically detect whether it is being worn on the user's right or left hand and on which finger, and this information may be used to improve the accuracy of the detection. The rationale to support a conclusion that the claim would have been obvious is that a particular known technique was recognized as part of the ordinary capabilities of one skilled in the art. One of ordinary skill in the art would have been capable of applying this known technique to a known device (method, or product) that was ready for improvement, and the results would have been predictable to one of ordinary skill in the art.
Regarding claim 39, Erivantcev, Pelis and Speck teach everything in claim 21. Erivantcev and Pelis do not teach, but Vescovi teaches, a non-transitory computer-readable medium comprising, stored thereupon, a computer program comprising computer-readable code units configured so that, when executed on a processor, the computer-readable code units cause the processor to perform the method of claim 21: "executable instructions for performing these functions are, optionally, included in a non-transitory computer readable storage medium or other computer program product configured for execution by one or more processors." [Vescovi para 0009]

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT J MICHAUD, whose telephone number is (571) 270-3981. The examiner can normally be reached 8:30-5:00. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Patrick Edouard, can be reached at 571-272-7603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ROBERT J MICHAUD/
Examiner, Art Unit 2622
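The limitation in dispute is a small interaction protocol: touch sensor data from the ring initiates a scroll, and motion sensor data from the wearable continues (and, per Speck, can speed up) a scroll that is already in progress. The following is a minimal illustrative sketch of that claimed behavior only; the class, method names, and the unit conventions are hypothetical and are not drawn from the claims or any cited reference:

```python
class ScrollController:
    """Hypothetical sketch of the claimed interaction model:
    touch input initiates scrolling; motion input continues it."""

    def __init__(self):
        self.offset = 0          # current scroll position, in items
        self.scrolling = False   # whether a touch-initiated scroll is active

    def on_touch_swipe(self, delta_items):
        # Touch sensor data initiates the scrolling (limitation 2).
        self.scrolling = True
        self.offset += delta_items

    def on_motion(self, hand_speed):
        # Motion sensor data continues the scrolling only if a scroll
        # was already initiated by touch (limitation 3); a faster hand
        # movement scrolls further, echoing Speck's faster scrolling.
        if not self.scrolling:
            return
        self.offset += int(hand_speed)

    def on_touch_release(self):
        # Ending the touch gesture stops the scroll.
        self.scrolling = False


ui = ScrollController()
ui.on_motion(5)        # ignored: no scroll has been initiated yet
ui.on_touch_swipe(1)   # touch initiates the scroll
ui.on_motion(3)        # motion continues it
print(ui.offset)       # 4
```

The point of the sketch is the gating in `on_motion`: motion data alone never starts a scroll, which is the distinction the applicant argues Erivantcev and Pelis do not supply without Speck.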

Prosecution Timeline

Nov 08, 2023
Application Filed
Jul 13, 2024
Non-Final Rejection — §103
Oct 15, 2024
Response Filed
Nov 01, 2024
Final Rejection — §103
Jan 03, 2025
Response after Non-Final Action
Feb 05, 2025
Response after Non-Final Action
Feb 05, 2025
Notice of Allowance
Feb 07, 2025
Response after Non-Final Action
May 15, 2025
Non-Final Rejection — §103
Jul 31, 2025
Non-Final Rejection — §103
Nov 03, 2025
Response Filed
Feb 19, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589690
LIGHTING DEVICE FOR A MOTOR VEHICLE, COMPRISING AN ILLUMINATED SCREEN
2y 5m to grant Granted Mar 31, 2026
Patent 12585123
ACTIVE OPTICAL ENGINE
2y 5m to grant Granted Mar 24, 2026
Patent 12575760
Closed-loop wearable sensor and method
2y 5m to grant Granted Mar 17, 2026
Patent 12560982
DETECTION DEVICE
2y 5m to grant Granted Feb 24, 2026
Patent 12563901
DISPLAY DEVICE
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
83%
Grant Probability
96%
With Interview (+12.6%)
2y 2m
Median Time to Grant
High
PTA Risk
Based on 593 resolved cases by this examiner. Grant probability derived from career allow rate.
