Prosecution Insights
Last updated: April 19, 2026
Application No. 18/614,984

OPTICAL RING THAT ENABLES THUMB-TO-INDEX GESTURES

Status: Non-Final OA (§103)
Filed: Mar 25, 2024
Examiner: BOCAR, DONNA V
Art Unit: 2621
Tech Center: 2600 — Communications
Assignee: Meta Platforms Technologies, LLC
OA Round: 3 (Non-Final)
Grant Probability: 58% (Moderate)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 2y 7m
Grant Probability with Interview: 77%

Examiner Intelligence

Career Allow Rate: 58% (212 granted / 367 resolved; -4.2% vs TC avg)
Interview Lift: +19.4% higher allowance among resolved cases with an interview
Typical Timeline: 2y 7m average prosecution; 35 applications currently pending
Career History: 402 total applications across all art units
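The headline figures above follow from the raw counts, which makes them easy to sanity-check. The short sketch below (variable names are illustrative, not from the report) recomputes the career allow rate from 212 granted of 367 resolved, and backs out the Tech Center baseline implied by the reported -4.2% gap:

```python
# Sanity-check the examiner's headline stats from the raw counts.
granted, resolved = 212, 367

allow_rate = 100 * granted / resolved   # career allow rate, percent
tc_delta = -4.2                         # reported gap vs Tech Center average
implied_tc_avg = allow_rate - tc_delta  # baseline the -4.2% is measured against

print(f"Career allow rate: {allow_rate:.1f}%")       # 57.8%, displayed rounded as 58%
print(f"Implied TC average: {implied_tc_avg:.1f}%")  # 62.0%
```

Note that the +19.4% interview lift cannot be reproduced from the numbers shown here; it presumably comes from splitting the 367 resolved cases by whether an interview was held, data the report does not itemize.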

Statute-Specific Performance

§101: 1.9% (-38.1% vs TC avg)
§103: 56.8% (+16.8% vs TC avg)
§102: 22.5% (-17.5% vs TC avg)
§112: 15.1% (-24.9% vs TC avg)

Tech Center averages are estimates. Based on career data from 367 resolved cases.
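One thing worth noticing about the per-statute deltas: if each delta is a simple difference between the examiner's rate and the Tech Center average, then every statute line implies the same baseline. The quick check below (an inference from the displayed numbers, not something the report states) suggests the Tech Center average estimate is a single figure of about 40%, not a per-statute one:

```python
# Each statute line reports (examiner rate %, delta vs TC average %).
# Assuming delta = rate - baseline, recover the implied baseline per statute.
stats = {
    "101": (1.9, -38.1),
    "103": (56.8, +16.8),
    "102": (22.5, -17.5),
    "112": (15.1, -24.9),
}

baselines = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(baselines)  # every statute implies the same 40.0% baseline
```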

Office Action (§103)
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1, 5-6, 8, 10-11, 13-16, and 18 have been amended. Claim 2 is cancelled. No claims have been added. Claims 1 and 3-20 are currently under review.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on October 7, 2025 has been entered.

Response to Arguments

Applicant’s arguments with respect to claims 1 and 3-20 have been considered but are moot because the new ground of rejection does not rely on the combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 4-6, and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (Pub. No.: US 2023/0325006 A1) hereinafter referred to as Kim in view of Park (Pub. 
No.: US 2017/0308117 A1) in view of Moroney (Pub. No.: US 2016/0284104) in view of Whitmire et al. (Pub. No.: US 2024/0028129 A1) hereinafter referred to as Whitmire, and in view of Noda et al. (Pub. No.: US 2010/0220054 A1) hereinafter referred to as Noda.

With respect to Claim 1, Kim teaches an apparatus (figs. 2-3, item 201; ¶58) comprising: a ring configured to curve around a first finger of a wearer (fig. 3C), wherein the ring comprises an external surface (fig. 3B, item 301: outer perimeter; ¶76) and an internal surface (fig. 3B, item 303: inner perimeter; ¶76), wherein the internal surface is configured to come into contact with the first finger of the wearer (fig. 3C); one or more electrodes (fig. 3B, item 220; ¶58) coupled to the internal surface of the ring, wherein the one or more electrodes are configured to measure bioimpedance in the first finger of the wearer (¶61; ¶80); and an inertial measurement unit (fig. 3B, item 215; ¶60) coupled to the ring, wherein the inertial measurement unit is configured to measure movement of the ring (¶63); and wherein the ring is configured to: determine a movement of the ring via the inertial measurement unit (¶60; ¶63, “based on sensing information acquired from the inertial sensor 215, the processor 230 may assess the state in which the wearable device 201 is worn, or the state of rotation thereof”); determine a gesture of the wearer based on the inertial sensor (¶93, “upon detecting a gesture of the user who wears the first wearable device 201 on his/her finger made by holding and rotating the first wearable device 201 with two fingers, the electronic device 101 may control the second wearable device 207 to display a user interface related to volume control, based on the gesture”; ¶110; ¶112, “the processor 120 may determine that the wearable device 201 is worn if sensing information acquired from the inertial sensor 215 has a change, and if biometric information is acquired by the biometric sensor 220”) and the 
bioimpedance (¶61, “the biometric sensor 220 may include … a bioimpedance assessment (BIA)”; ¶112, “the processor 120 may determine that the wearable device 201 is worn if sensing information acquired from the inertial sensor 215 has a change, and if biometric information is acquired by the biometric sensor 220, after a touch is sensed by the touch sensor 210”). Kim does not mention a miniature optical sensor mounted on the external surface of the ring and positioned to view along a length of the finger of the wearer. Park teaches an apparatus (fig. 1; ¶45) comprising: a ring designed to curve around a finger of a wearer (fig. 4, item 100; ¶127-128), wherein the ring comprises an external surface (fig. 4) and an internal surface (fig. 4) configured to come into contact with the finger of the wearer (fig. 4); a miniature optical sensor (fig. 4, item 110; ¶128) mounted on the external surface of the ring and positioned to view along a length of the finger of the wearer (figs. 4 and 6; ¶52; ¶148); and detect biometric information on the finger of the wearer (¶78). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the apparatus of Kim, to comprise a miniature optical sensor mounted on the external surface of the ring and positioned to view along a length of the finger of the wearer, as taught by Park so as to provide a wearable device which allows a user to conveniently input data using a portable input unit (¶4). 
Kim and Park combined do not teach wherein the ring is configured to: determine a first contour associated with the first finger based on a first area of the first contour being greater than a threshold amount of a total image area captured by the miniature optical sensor; determine a second contour associated with a second finger based on a second area of the second contour being greater than the threshold amount of the total image area; determine a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour. Moroney teaches a system (fig. 1, item 100; ¶8) configured to: determine a first contour associated with a first part of an object (¶15, “determines a description of the shape of a silhouette of a representation of an object”) based on a first area of the first contour being greater than a threshold amount of a total image area (¶17, “the processor detects a first contour of the silhouette wherein the area of the contour with respect to itself is over a contour threshold”); determine a second contour associated with a second part of the object based on a second area of the second contour being greater than the threshold amount of the total image area (¶19, “the processor detects a second contour of the silhouette wherein the area of the contour with respect to itself is over the threshold”); determine a contact of the first part of the object and the second part of the object based on the first contour and the second contour merging into a single contour (¶19, “The processor may perform a global sweep of the silhouette of the object representation and compare a previous contour to the next points along the silhouette. In some cases, the new contour may be merged with the previously identified contour. For example, the distance between the points of the contour may be taken into account in determining whether the contours should be merged. 
For example, a second contour with an area with respect to itself over the threshold may be determined to be part of a previously identified contour such that one of the end points of the previous contours becomes the end point of the second contour. In some cases, there may be points in between the two contours such that the contour is determined to be a separate new contour”). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined apparatus of Kim and Park, such that a first part of an object corresponds to a first finger and a second part of the object corresponds to a second finger, resulting in an apparatus configured to: determine a first contour associated with the first finger based on a first area of the first contour being greater than a threshold amount of a total image area captured by the miniature optical sensor; determine a second contour associated with a second finger based on a second area of the second contour being greater than the threshold amount of the total image area; determine a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour, as taught by Moroney so as to allow for a fast automated method for object shape identification (¶7). Kim, Park, and Moroney combined do not teach the apparatus configured to determine a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour and further based on a change to the impedance in the first finger; determine a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the impedance in the first finger, and based on the movement of the ring; and transmit the gesture from the ring to an augmented-reality or virtual-reality system. Whitmire teaches an apparatus (fig. 
1A, item 102; ¶53, “wrist-wearable device 102”; fig. 12B, item 1252-1; ¶233, “the accessory device 1252-1 can be a ring that is used in conjunction with a wearable structure to utilize data measurements obtained by sensor 1258-1 to adjust a fit of the wearable structure. In another example, the accessory device 1252-1 and accessory device 1252-2 are distinct wristbands to be worn on each wrist of the user”) comprising: watch/ring configured to curve around a body part of a wearer (fig. 1A, item 102; ¶66; fig. 12B, item 1252-1), wherein the watch/ring comprises an external surface and an internal surface (fig. 1A; fig. 12B); an optical sensor (fig. 1I & 1J, item 164; fig. 12, item 1258-1; ¶72; ¶78; ¶118, “For example, an EMG sensor and/or an optical imaging sensor can detect positions and/or movements of the user's hand”; ¶240); an impedance sensor (fig. 1I & 1J, item 162; fig. 12B, item 1258; ¶72; ¶76; ¶118, “For example, an EMG sensor and/or an optical imaging sensor can detect positions and/or movements of the user's hand”; ¶215; ¶240); and an inertial measurement unit (fig. 
1I & 1J, item 160; ¶72; ¶76; ¶215, “Motor actions can be detected based on the detected neuromuscular signals but can additionally (using a fusion of the various sensor inputs) … be detected using other types of sensors (such as cameras focused on viewing hand movements and/or using data from an inertial measurement unit that can detect characteristic vibration sequences or other data types to correspond to in-air hand gestures)”); wherein the watch is configured to: determine a contact of the first finger and the second finger (¶337, “the second group of sensors includes a camera sensor (e.g., the camera sensors 164) and the camera sensor (e.g., an infrared camera that is positioned on the wrist-wearable device such that its field of view is looking towards a top of the user's hand) detects the pinch-threshold distance change between the two or more of the plurality of contacts of the surface gesture”; ¶423, “and the second in-air hand gesture is a pinch gesture, that includes a contact between a thumb of the user and another finger of the user, and the operation performed in response to the second indication causes an image to be captured by a camera at the wrist-wearable device or the head-wearable device”) and further based on a change to the impedance in the first finger (fig. 
3D; ¶287, “the determination is made by tracking movement of a particular location on the hand 1503 (e.g., a point on the finger, a knuckle, a ring on the user's finger or another wearable electronic device)”; ¶336, “whether two or more contacts of the plurality of contacts are performing a pinch gesture (e.g., the location-agnostic gesture 346 in 3C can be detected based on detecting that the two or more contacts of the plurality of contacts are performing a pinch gesture using the EMG sensor”); determine a movement of the ring via the inertial measurement unit (¶72; ¶76); determine a gesture by the wearer based on a contact of a first finger and a second finger (¶337, “the second group of sensors includes a camera sensor (e.g., the camera sensors 164) and the camera sensor (e.g., an infrared camera that is positioned on the wrist-wearable device such that its field of view is looking towards a top of the user's hand) detects the pinch-threshold distance change between the two or more of the plurality of contacts of the surface gesture”; ¶423, “and the second in-air hand gesture is a pinch gesture, that includes a contact between a thumb of the user and another finger of the user, and the operation performed in response to the second indication causes an image to be captured by a camera at the wrist-wearable device or the head-wearable device”) that is based on the change to the impedance in the first finger (fig. 
3D; ¶287, “the determination is made by tracking movement of a particular location on the hand 1503 (e.g., a point on the finger, a knuckle, a ring on the user's finger or another wearable electronic device)”; ¶336, “whether two or more contacts of the plurality of contacts are performing a pinch gesture (e.g., the location-agnostic gesture 346 in 3C can be detected based on detecting that the two or more contacts of the plurality of contacts are performing a pinch gesture using the EMG sensor”), and based on the movement of the ring (¶72, “[t]he symbolic view of the sensors 150 displayed alongside FIGS. 1B-1E includes graphs of prophetic data corresponding to data collected by EMG sensors and time-of-flight sensors during the performance of the various gestures illustrated in FIGS. 1B-1E… In some embodiments, one or more camera sensors 164 are used to detect user motion and gestures (e.g., in addition to, or alternative to, the sensors described above)”); and transmit the gesture from the ring to an augmented reality or virtual reality system (fig. 1A, item 104; ¶50, “Such artificial-reality can include and/or represent virtual reality (VR), augmented reality (AR), mixed artificial reality (MAR), or some combination and/or variation one of these”; ¶53, “a head-wearable device 104 (e.g., AR glasses)”). 
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the apparatus of Kim, Park, and Moroney, since the apparatus may be a ring or watch such that the apparatus is configured to determine a contact of the first finger and the second finger and further based on a change to the impedance in the first finger; determine a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the impedance in the first finger, and based on the movement of the ring; and transmit the gesture from the ring to an augmented-reality or virtual-reality system, as taught by Whitmire, so as to allow for the wearable devices to be designed such that they are comfortable, functional, practical, and socially acceptable for day-to-day use and allow users to interact with a computer and/or user interface without requiring a fixed location or orientation for the interaction (¶17). Kim, Park, Moroney, and Whitmire combined do not teach the apparatus configured to determine a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour and further based on a change to the bioimpedance in the first finger; determine a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the bioimpedance in the first finger. Noda teaches an apparatus (figs. 1A and 1B) comprising: a ring configured to curve around a first finger of a wearer (fig. 1B), wherein the ring comprises an external surface and an internal surface (fig. 1B), wherein the internal surface is configured to come into contact with the first finger of the wearer (fig. 1B; ¶81); one or more electrodes coupled to the internal surface of the ring (fig. 
1B, items 11a and 11b; ¶81), wherein the one or more electrodes are configured to measure bioimpedance in the first finger of the wearer (¶92-93); determine a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the bioimpedance in the first finger (¶89-90); transmit the gesture from the ring to an external device (fig. 5, item 110: external device; ¶86). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date to modify the apparatus of Kim, Park, Moroney, and Whitmire, such that measured impedance is a bioimpedance of a first finger, resulting in the apparatus configured to determine a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour and further based on a change to the bioimpedance in the first finger; determine a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the bioimpedance in the first finger, as taught by Noda so as to detect a user’s body movement through an alternative method (¶10).

With respect to Claim 4, claim 1 is incorporated, Kim does not mention further comprising a light-emitting diode (LED) associated with the miniature optical sensor. Park teaches an apparatus (fig. 1; ¶45) comprising: a ring designed to curve around a finger of a wearer (fig. 4, item 100; ¶127-128), wherein the ring comprises an external surface (fig. 4) and an internal surface (fig. 4) configured to come into contact with the finger of the wearer (fig. 4); a miniature optical sensor (fig. 4, item 110; ¶128) mounted on the external surface of the ring and positioned to view along a length of the finger of the wearer (figs. 4 and 6; ¶52; ¶148); and detect biometric information on the finger of the wearer (¶78); further comprising a light-emitting diode (LED) (fig. 1, item 105; ¶48) associated with the miniature optical sensor (¶78, “the wearable device 100 senses user's skin lines through a finger recognition unit including an infrared camera, an RGB camera or a ToF camera”). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined apparatus of Kim, Moroney, Whitmire, and Noda, further comprising a light-emitting diode (LED) associated with the miniature optical sensor, as taught by Park so as to provide a wearable device which allows a user to conveniently input data using a portable input unit and provide a type of optical sensor (¶4).

With respect to Claim 5, claim 4 is incorporated, Kim does not mention wherein the miniature optical sensor is configured to capture images of at least one of the first finger or the second finger of the wearer illuminated by the LED. Park teaches an apparatus (fig. 1; ¶45) comprising: a ring designed to curve around a finger of a wearer (fig. 4, item 100; ¶127-128), wherein the ring comprises an external surface (fig. 4) and an internal surface (fig. 4) configured to come into contact with the finger of the wearer (fig. 4); a miniature optical sensor (fig. 4, item 110; ¶128) mounted on the external surface of the ring and positioned to view along a length of the finger of the wearer (figs. 4 and 6; ¶52; ¶148); and detect biometric information on the finger of the wearer (¶78); further comprising a light-emitting diode (LED) (fig. 1, item 105; ¶48) associated with the miniature optical sensor (¶78, “the wearable device 100 senses user's skin lines through a finger recognition unit including an infrared camera, an RGB camera or a ToF camera”); wherein the miniature optical sensor is configured to capture images of at least one of the second finger of the wearer illuminated by the LED (fig. 4, the thumb is the first finger and the forefinger is the second finger; ¶52, “optical sensing unit”; ¶71, “the data processor 115 processes the optical signals received by the optical signal sensing unit 110 and generates information regarding blood vessels of the object. The image processor 130 may process the information regarding blood vessels, generate a visually confirmable pattern, and add the generated pattern to the 3D model generated based on the 3D scan information. That is, the image processor 130 may generate a 3D model formed by mapping the pattern regarding blood vessels onto the 3D model representing only the external appearance of the object”; ¶78, “the wearable device 100 senses user's skin lines through a finger recognition unit including an infrared camera, an RGB camera or a ToF camera …The wearable device 100 may detect the position and movement of a part of a user's body through such a process and output an image to a fixed position at a fixed angle”). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined apparatus of Kim, Moroney, Whitmire, and Noda, wherein the miniature optical sensor is configured to capture images of at least one of the second finger of the wearer illuminated by the LED, as taught by Park so as to provide a wearable device which allows a user to conveniently input data using a portable input unit and provide a type of optical sensor (¶4).

With respect to Claim 6, claim 5 is incorporated, Kim teaches wherein the first finger of the wearer is an index finger (fig. 3C).

With respect to Claim 8, Kim teaches a method (figs. 5, 8, and 11; ¶56) comprising: a ring configured to curve around a first finger of a wearer (fig. 3C), wherein the ring comprises an external surface (fig. 3B, item 301: outer perimeter; ¶76) and an internal surface (fig. 
3B, item 303: inner perimeter; ¶76), wherein the internal surface is configured to come into contact with the first finger of the wearer (fig. 3C); and one or more electrodes (fig. 3B, item 220; ¶58) coupled to the internal surface of the ring, wherein the one or more electrodes serve to measure bioimpedance in the first finger of the wearer (¶61; ¶80); determine a movement of the ring (¶63) via an inertial measurement unit (fig. 3B, item 215; ¶60; ¶63, “based on sensing information acquired from the inertial sensor 215, the processor 230 may assess the state in which the wearable device 201 is worn, or the state of rotation thereof”) coupled to the ring; and determine a gesture of the wearer based on the inertial sensor (¶110; ¶112) and the bioimpedance (¶61, “the biometric sensor 220 may include … a bioimpedance assessment (BIA)”; ¶112, “the processor 120 may determine that the wearable device 201 is worn if sensing information acquired from the inertial sensor 215 has a change, and if biometric information is acquired by the biometric sensor 220, after a touch is sensed by the touch sensor 210”). Kim does not mention a miniature optical sensor coupled to the ring, wherein the miniature optical sensor is configured to couple to the external surface and positioned to view along a length of the first finger of the wearer. Park teaches a ring configured to curve around a finger of a wearer (fig. 4, item 100; ¶127-128), wherein the ring comprises: an external surface (fig. 4) and an internal surface (fig. 4) configured to come into contact with the finger of the wearer (fig. 4); and a miniature optical sensor (fig. 4, item 110; ¶128) coupled to the external surface of the ring and positioned to view along a length of the finger of the wearer (figs. 4 and 6; ¶52; ¶148); and detect biometric information on the finger of the wearer (¶78). 
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the method of Kim, such that an additional component comprising a miniature optical sensor is coupled to the external surface of the ring and positioned to view along a length of the finger of the wearer, as taught by Park so as to provide a wearable device which allows a user to conveniently input data using a portable input unit (¶4). Kim and Park combined do not teach the method comprising: determining a first contour associated with a first finger based on a first area of the first contour being greater than a threshold amount of a total image area captured by a miniature optical sensor coupled to a ring; determining a second contour associated with a second finger based on a second area of the second contour being greater than the threshold amount of the total image area captured by the miniature optical sensor; determining a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour. Moroney teaches a method (fig. 
2; ¶14) comprising: determining a first contour associated with a first part of an object (¶15, “determines a description of the shape of a silhouette of a representation of an object”) based on a first area of the first contour being greater than a threshold amount of a total image area (¶17, “the processor detects a first contour of the silhouette wherein the area of the contour with respect to itself is over a contour threshold”); determining a second contour associated with a second part of the object based on a second area of the second contour being greater than the threshold amount of the total image area (¶19, “the processor detects a second contour of the silhouette wherein the area of the contour with respect to itself is over the threshold”); determining a contact of the first part of the object and the second part of the object based on the first contour and the second contour merging into a single contour (¶19, “The processor may perform a global sweep of the silhouette of the object representation and compare a previous contour to the next points along the silhouette. In some cases, the new contour may be merged with the previously identified contour. For example, the distance between the points of the contour may be taken into account in determining whether the contours should be merged. For example, a second contour with an area with respect to itself over the threshold may be determined to be part of a previously identified contour such that one of the end points of the previous contours becomes the end point of the second contour. In some cases, there may be points in between the two contours such that the contour is determined to be a separate new contour”). 
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined method of Kim and Park, such that a first part of an object corresponds to a first finger and a second part of the object corresponds to a second finger, resulting in the method comprising: determining a first contour associated with a first finger based on a first area of the first contour being greater than a threshold amount of a total image area captured by a miniature optical sensor coupled to a ring; determining a second contour associated with a second finger based on a second area of the second contour being greater than the threshold amount of the total image area captured by the miniature optical sensor; determining a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour, as taught by Moroney so as to allow for a fast automated method for object shape identification (¶7). Kim, Park, and Moroney combined do not teach the method comprising: determining a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour and further based on a change to the impedance in the first finger; determining a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the impedance in the first finger and based on the movement of the ring; and transmitting the gesture from the ring to an augmented-reality or virtual-reality system. Whitmire teaches an apparatus (fig. 1A, item 102; ¶53, “wrist-wearable device 102”; fig. 12B, item 1252-1; ¶233, “the accessory device 1252-1 can be a ring that is used in conjunction with a wearable structure to utilize data measurements obtained by sensor 1258-1 to adjust a fit of the wearable structure. 
In another example, the accessory device 1252-1 and accessory device 1252-2 are distinct wristbands to be worn on each wrist of the user”) and method (figs. 9A to 9C, figs. 10A-10C, and 11A-11B; figs. 16A-16B; ¶26-27) comprising: watch/ring configured to curve around a body part of a wearer (fig. 1A, item 102; ¶66; fig. 12B, item 1252-1), wherein the watch/ring comprises an external surface and an internal surface (fig. 1A; fig. 12B); an optical sensor (fig. 1I & 1J, item 164; fig. 12, item 1258-1; ¶72; ¶78; ¶118, “For example, an EMG sensor and/or an optical imaging sensor can detect positions and/or movements of the user's hand”; ¶240); an impedance sensor (fig. 1I & 1J, item 162; fig. 12B, item 1258; ¶72; ¶76; ¶118, “For example, an EMG sensor and/or an optical imaging sensor can detect positions and/or movements of the user's hand”; ¶215; ¶240); and an inertial measurement unit (fig. 1I & 1J, item 160; ¶72; ¶76; ¶215, “Motor actions can be detected based on the detected neuromuscular signals but can additionally (using a fusion of the various sensor inputs) … be detected using other types of sensors (such as cameras focused on viewing hand movements and/or using data from an inertial measurement unit that can detect characteristic vibration sequences or other data types to correspond to in-air hand gestures)”); wherein the watch is configured to: determine a contact of the first finger and the second finger (¶337, “the second group of sensors includes a camera sensor (e.g., the camera sensors 164) and the camera sensor (e.g., an infrared camera that is positioned on the wrist-wearable device such that its field of view is looking towards a top of the user's hand) detects the pinch-threshold distance change between the two or more of the plurality of contacts of the surface gesture”; ¶423, “and the second in-air hand gesture is a pinch gesture, that includes a contact between a thumb of the user and another finger of the user, and the operation performed in 
response to the second indication causes an image to be captured by a camera at the wrist-wearable device or the head-wearable device”) and further based on a change to the impedance in the first finger (fig. 3D; ¶287, “the determination is made by tracking movement of a particular location on the hand 1503 (e.g., a point on the finger, a knuckle, a ring on the user's finger or another wearable electronic device)”; ¶336, “whether two or more contacts of the plurality of contacts are performing a pinch gesture (e.g., the location-agnostic gesture 346 in 3C can be detected based on detecting that the two or more contacts of the plurality of contacts are performing a pinch gesture using the EMG sensor”);determine a movement of the ring via the inertial measurement unit (¶72; ¶76); determine a gesture by the wearer based on a contact of a first finger and a second finger (¶337, “the second group of sensors includes a camera sensor (e.g., the camera sensors 164) and the camera sensor (e.g., an infrared camera that is positioned on the wrist-wearable device such that its field of view is looking towards a top of the user's hand) detects the pinch-threshold distance change between the two or more of the plurality of contacts of the surface gesture”; ¶423, “and the second in-air hand gesture is a pinch gesture, that includes a contact between a thumb of the user and another finger of the user, and the operation performed in response to the second indication causes an image to be captured by a camera at the wrist-wearable device or the head-wearable device”) that is based on the change to the impedance in the first finger (fig. 
3D; ¶287, “the determination is made by tracking movement of a particular location on the hand 1503 (e.g., a point on the finger, a knuckle, a ring on the user's finger or another wearable electronic device)”; ¶336, “whether two or more contacts of the plurality of contacts are performing a pinch gesture (e.g., the location-agnostic gesture 346 in 3C can be detected based on detecting that the two or more contacts of the plurality of contacts are performing a pinch gesture using the EMG sensor”), and based on the movement of the ring (¶72, “The symbolic view of the sensors 150 displayed alongside FIGS. 1B-1E includes graphs of prophetic data corresponding to data collected by EMG sensors and time-of-flight sensors during the performance of the various gestures illustrated in FIGS. 1B-1E… In some embodiments, one or more camera sensors 164 are used to detect user motion and gestures (e.g., in addition to, or alternative to, the sensors described above)”); and transmit the gesture from the ring to an augmented reality or virtual reality system (fig. 1A, item 104; ¶50, “Such artificial-reality can include and/or represent virtual reality (VR), augmented reality (AR), mixed artificial reality (MAR), or some combination and/or variation one of these”; ¶53, “a head-wearable device 104 (e.g., AR glasses)”). 
Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the method of Kim, Park, and Moroney, since the method is implemented in an apparatus that may be a ring or watch, the method comprising: determining a contact of the first finger and the second finger and further based on a change to the impedance in the first finger, resulting in the method comprising: determining a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour and further based on a change to the impedance in the first finger; determining a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the impedance in the first finger and based on the movement of the ring; and transmitting the gesture from the ring to an augmented-reality or virtual-reality system, as taught by Whitmire so as to allow for the wearable devices to be designed such that they are comfortable, functional, practical, and socially acceptable for day-to-day use and allow users to interact with a computer and/or user interface without requiring a fixed location or orientation for the interaction (¶17). Kim, Park, Moroney, and Whitmire combined do not teach the method comprising: determining a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour and further based on a change to the bioimpedance in the first finger; determining a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the bioimpedance in the first finger. Noda teaches an apparatus (figs. 1A and 1B) comprising: a ring configured to curve around a first finger of a wearer (fig. 1B), wherein the ring comprises an external surface and an internal surface (fig. 
1B), wherein the internal surface is configured to come into contact with the first finger of the wearer (fig. 1B; ¶81); one or more electrodes coupled to the internal surface of the ring (fig. 1B, items 11a and 11b; ¶81), wherein the one or more electrodes are configured to measure bioimpedance in the first finger of the wearer (¶92-93); determine a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the bioimpedance in the first finger (¶89-90); transmit the gesture from the ring to an external device (fig. 5, item 110: external device; ¶86). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date to modify the method of Kim, Park, Moroney, and Whitmire, such that measured impedance is a bioimpedance of a first finger, resulting in the method comprising: determining a contact of the first finger and the second finger based on the first contour and the second contour merging into a single contour and further based on a change to the bioimpedance in the first finger; determining a gesture by the wearer based on the contact of the first finger and the second finger that is based on the change to the bioimpedance in the first finger, as taught by Noda so as to detect a user’s body movement through an alternative method (¶10). Claims 3 and 9-11 are rejected under 35 U.S.C. 103 as being unpatentable over Kim, Park, Moroney, Whitmire, and Noda as applied to claims 1 and 8 above, and further in view of Mai (Pub. No.: US 2025/0021167 A1). With respect to Claim 3, claim 2 is incorporated, Kim teaches further comprising integrating an inertial measurement unit (fig. 3B, item 215; ¶60) into the ring that measures movement of the ring. Kim, Park, Moroney, Whitmire, and Noda combined do not mention the inertial measurement unit measures movement of the ring in at least nine axes. Mai teaches an apparatus (figs. 
1 and 3, item 200) comprising: a ring configured to curve around a first finger of a wearer (fig. 1; ¶62), wherein the ring comprises an external surface and an internal surface, wherein the internal surface is configured to come into contact with the first finger of the wearer (fig. 1); and an inertial measurement unit that measures movement of the ring (¶100, “The IMU (that is, the motion sensor 243) may detect an acceleration signal of a hand of a user when the target finger performs an action such as tapping or pinching, and further detect information such as an angular velocity and a geomagnetic field of the hand of the user”); wherein the inertial measurement unit is configured to measure movement of the ring in at least nine axes (¶94). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined apparatus of Kim, Park, Moroney, Whitmire, and Noda, wherein the inertial measurement unit measures movement of the ring in at least nine axes, as taught by Mai so as to improve accuracy of a gesture input recognition result (¶5). With respect to Claim 9, claim 8 is incorporated, Kim teaches further comprising integrating an inertial measurement unit (fig. 3B, item 215; ¶60) into the ring that measures movement of the ring. Kim, Park, Moroney, Whitmire, and Noda combined do not mention the inertial measurement unit measures movement of the ring in at least nine axes. Mai teaches an apparatus (figs. 1 and 3, item 200) and method (fig. 4; ¶104) comprising: a ring configured to curve around a first finger of a wearer (fig. 1; ¶62), wherein the ring comprises an external surface and an internal surface, wherein the internal surface is configured to come into contact with the first finger of the wearer (fig. 
1); and an inertial measurement unit that measures movement of the ring (¶100, “The IMU (that is, the motion sensor 243) may detect an acceleration signal of a hand of a user when the target finger performs an action such as tapping or pinching, and further detect information such as an angular velocity and a geomagnetic field of the hand of the user”); wherein the inertial measurement unit measures movement of the ring in at least nine axes (¶94). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined method of Kim, Park, Moroney, Whitmire, and Noda, wherein the inertial measurement unit measures movement of the ring in at least nine axes, as taught by Mai so as to improve accuracy of a gesture input recognition result (¶5). With respect to Claim 10, claim 9 is incorporated, Kim, Moroney, Whitmire, Noda, and Mai combined do not mention wherein: the ring includes a light-emitting diode (LED) associated with the miniature optical sensor; and the miniature optical sensor is configured to capture images of at least one of the first finger or the second finger of the wearer illuminated by the LED. Park teaches a ring configured to curve around a finger of a wearer (fig. 4, item 100; ¶127-128), wherein the ring comprises: an external surface (fig. 4) and an internal surface (fig. 4) configured to come into contact with the finger of the wearer (fig. 4); and a miniature optical sensor (fig. 4, item 110; ¶128) coupled to the external surface of the ring and positioned to view along a length of the finger of the wearer (figs. 4 and 6; ¶52; ¶148); and detect biometric information on the finger of the wearer (¶78); wherein: the ring includes a light-emitting diode (LED) (fig. 1, item 105; ¶48) associated with the miniature optical sensor; and the miniature optical sensor is configured to capture images of at least one of the second finger of the wearer illuminated by the LED (fig. 
4, the thumb is the first finger and the forefinger is the second finger; ¶52, “optical sensing unit”; ¶71, “the data processor 115 processes the optical signals received by the optical signal sensing unit 110 and generates information regarding blood vessels of the object. The image processor 130 may process the information regarding blood vessels, generate a visually confirmable pattern, and add the generated pattern to the 3D model generated based on the 3D scan information. That is, the image processor 130 may generate a 3D model formed by mapping the pattern regarding blood vessels onto the 3D model representing only the external appearance of the object”; ¶78, “the wearable device 100 senses user's skin lines through a finger recognition unit including an infrared camera, an RGB camera or a ToF camera …The wearable device 100 may detect the position and movement of a part of a user's body through such a process and output an image to a fixed position at a fixed angle”). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the combined method of Kim, Moroney, Whitmire, Noda, and Mai wherein: the ring includes a light-emitting diode (LED) associated with the miniature optical sensor; and the miniature optical sensor is configured to capture images of at least one of the second finger of the wearer illuminated by the LED, as taught by Park so as to provide a wearable device which allows a user to conveniently input data using a portable input unit (¶4). With respect to Claim 11, claim 10 is incorporated, Kim, Park, Moroney, Noda, and Mai combined do not teach wherein determining the contact of the first finger and the second finger and determining movement of the ring is via a controller. Whitmire teaches an apparatus (fig. 1A, item 102; ¶53, “wrist-wearable device 102”; fig. 
12B, item 1252-1; ¶233, “the accessory device 1252-1 can be a ring that is used in conjunction with a wearable structure to utilize data measurements obtained by sensor 1258-1 to adjust a fit of the wearable structure. In another example, the accessory device 1252-1 and accessory device 1252-2 are distinct wristbands to be worn on each wrist of the user”) and method (figs. 9A to 9C, figs. 10A-10C, and 11A-11B; figs. 16A-16B; ¶26-27) comprising: watch/ring configured to curve around a body part of a wearer (fig. 1A, item 102; ¶66; fig. 12B, item 1252-1), wherein the watch/ring comprises an external surface and an internal surface (fig. 1A; fig. 12B); an optical sensor (fig. 1I & 1J, item 164; fig. 12, item 1258-1; ¶72; ¶78; ¶118, “For example, an EMG sensor and/or an optical imaging sensor can detect positions and/or movements of the user's hand”; ¶240); an impedance sensor (fig. 1I & 1J, item 162; fig. 12B, item 1258; ¶72; ¶76; ¶118, “For example, an EMG sensor and/or an optical imaging sensor can detect positions and/or movements of the user's hand”; ¶215; ¶240); and an inertial measurement unit (fig. 
1I & 1J, item 160; ¶72; ¶76; ¶215, “Motor actions can be detected based on the detected neuromuscular signals but can additionally (using a fusion of the various sensor inputs) … be detected using other types of sensors (such as cameras focused on viewing hand movements and/or using data from an inertial measurement unit that can detect characteristic vibration sequences or other data types to correspond to in-air hand gestures)”); wherein the watch is configured to: determine a contact of the first finger and the second finger (¶337, “the second group of sensors includes a camera sensor (e.g., the camera sensors 164) and the camera sensor (e.g., an infrared camera that is positioned on the wrist-wearable device such that its field of view is looking towards a top of the user's hand) detects the pinch-threshold distance change between the two or more of the plurality of contacts of the surface gesture”; ¶423, “and the second in-air hand gesture is a pinch gesture, that includes a contact between a thumb of the user and another finger of the user, and the operation performed in response to the second indication causes an image to be captured by a camera at the wrist-wearable device or the head-wearable device”) and further based on a change to the impedance in the first finger (fig. 
3D; ¶287, “the determination is made by tracking movement of a particular location on the hand 1503 (e.g., a point on the finger, a knuckle, a ring on the user's finger or another wearable electronic device)”; ¶336, “whether two or more contacts of the plurality of contacts are performing a pinch gesture (e.g., the location-agnostic gesture 346 in 3C can be detected based on detecting that the two or more contacts of the plurality of contacts are performing a pinch gesture using the EMG sensor”); determine a movement of the ring via the inertial measurement unit (¶72; ¶76); determine a gesture by the wearer based on a contact of a first finger and a second finger (¶337, “the second group of sensors includes a camera sensor (e.g., the camera sensors 164) and the camera sensor (e.g., an infrared camera that is positioned on the wrist-wearable device such that its field of view is looking towards a top of the user's hand) detects the pinch-threshold distance change between the two or more of the plurality of contacts of the surface gesture”; ¶423, “and the second in-air hand gesture is a pinch gesture, that includes a contact between a thumb of the user and another finger of the user, and the operation performed in response to the second indication causes an image to be captured by a camera at the wrist-wearable device or the head-wearable device”) that is based on the change to the impedance in the first finger (fig. 
3D; ¶287, “the determination is made by tracking movement of a particular location on the hand 1503 (e.g., a point on the finger, a knuckle, a ring on the user's finger or another wearable electronic device)”; ¶336, “whether two or more contacts of the plurality of contacts are performing a pinch gesture (e.g., the location-agnostic gesture 346 in 3C can be detected based on detecting that the two or more contacts of the plurality of contacts are performing a pinch gesture using the EMG sensor”), and based on the movement of the ring (¶72, “The symbolic view of the sensors 150 displayed alongside FIGS. 1B-1E includes graphs of prophetic data corresponding to data collected by EMG sensors and time-of-flight sensors during the performance of the various gestures illustrated in FIGS. 1B-1E… In some embodiments, one or more camera sensors 164 are used to detect user motion and gestures (e.g., in addition to, or alternative to, the sensors described above)”); and transmit the gesture from the ring to an augmented reality or virtual reality system (fig. 1A, item 104; ¶50, “Such artificial-reality can include and/or represent virtual reality (VR), augmented reality (AR), mixed artificial reality (MAR), or some combination and/or variation one of these”; ¶53, “a head-wearable device 104 (e.g., AR glasses)”); wherein determining the contact of the first finger and the second finger and determining movement of the ring is via a controller (fig. 12A, item 1204 comprises controllers; fig. 
12B, item 1254-1 comprises controllers; ¶216, “The information collected about the portion of the user's body can include neuromuscular signals that can be used by the one or more processors 1204 of the wearable device 1202 to determine a motor action that the user intends to perform with their hand and/or fingers”; ¶219, “the one or more processors 1204 are configured to receive the data detected by the sensors 1216 and determine whether a user gesture performed by the user corresponds to an in-air gesture or a surface gesture”). Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the method of Kim, Park, and Moroney, since the method is implemented in an apparatus that may be a ring or watch resulting in the method comprising: wherein determining the contact of the first finger and the second finger and determining movement of the ring is via a controller, as taught by Whitmire so as to allow for the wearable devices to be designed such that they are comfortable, functional, practical, and socially acceptable for day-to-day use and allow users to interact with a computer and/or user interface without requiring a fixed location or orientation for the interaction (¶17).

Prosecution Timeline

Mar 25, 2024
Application Filed
Feb 21, 2025
Non-Final Rejection — §103
May 13, 2025
Interview Requested
May 22, 2025
Examiner Interview Summary
Jun 10, 2025
Response Filed
Jul 02, 2025
Final Rejection — §103
Sep 30, 2025
Interview Requested
Oct 07, 2025
Request for Continued Examination
Oct 10, 2025
Response after Non-Final Action
Nov 04, 2025
Non-Final Rejection — §103
Jan 21, 2026
Examiner Interview Summary
Jan 21, 2026
Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591297
MULTIMODAL TASK EXECUTION AND TEXT EDITING FOR A WEARABLE SYSTEM
2y 5m to grant Granted Mar 31, 2026
Patent 12536977
BRIGHTNESS CONTROL METHOD AND APPARATUS FOR DISPLAY PANEL
2y 5m to grant Granted Jan 27, 2026
Patent 12475825
DISPLAY SUBSTRATE INCLUDING SHIFT REGISTER AND DISPLAY DEVICE
2y 5m to grant Granted Nov 18, 2025
Patent 12451088
LIQUID CRYSTAL DISPLAY DEVICE AND CONTROL MODULE THEREOF, AND INTEGRATED BOARD
2y 5m to grant Granted Oct 21, 2025
Patent 12451091
TEMPERATURE CONTROL CIRCUIT AND TEMPERATURE CONTROL METHOD OF DRIVER CHIP AND TIMING CONTROL DRIVER BOARD
2y 5m to grant Granted Oct 21, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
58%
Grant Probability
77%
With Interview (+19.4%)
2y 7m
Median Time to Grant
High
PTA Risk
Based on 367 resolved cases by this examiner. Grant probability derived from career allow rate.
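The headline projections above are consistent with simple arithmetic on the examiner's career data (212 granted of 367 resolved, plus the observed interview lift). A minimal sketch of that derivation follows; the function names and whole-percent rounding are illustrative assumptions, not the tool's actual schema:

```python
# Hypothetical reconstruction of the dashboard's headline metrics from
# the career data shown above. Names and rounding are assumptions.

def grant_probability(granted: int, resolved: int) -> int:
    """Career allow rate as a whole percent (212 granted / 367 resolved)."""
    return round(100 * granted / resolved)

def with_interview(base_pct: float, lift_pct: float) -> int:
    """Adjusted grant probability when an examiner interview is held."""
    return round(base_pct + lift_pct)

base = grant_probability(212, 367)     # career allow rate: 58
adjusted = with_interview(base, 19.4)  # +19.4% interview lift: 77
print(f"{base}% baseline, {adjusted}% with interview")
```

Under these assumptions, the 58% baseline and the 77% with-interview figure reported above both fall out of the raw counts.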
