Prosecution Insights
Last updated: April 19, 2026
Application No. 18/587,637

Wearable Device For Adjusting Haptic Responses Based On A Fit Characteristic

Non-Final OA: §102 / §103
Filed
Feb 26, 2024
Examiner
FIBBI, CHRISTOPHER J
Art Unit
2174
Tech Center
2100 — Computer Architecture & Software
Assignee
Meta Platforms Technologies, LLC
OA Round
1 (Non-Final)
Grant Probability: 53% (Moderate)
Predicted OA Rounds: 1-2
Time to Grant: 4y 3m
Grant Probability With Interview: 90%

Examiner Intelligence

Career Allow Rate: 53% (199 granted / 376 resolved; -2.1% vs TC avg)
Interview Lift: strong, +37.6% on resolved cases with interview
Typical Timeline: 4y 3m avg prosecution; 40 currently pending
Career History: 416 total applications across all art units
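The headline allow rate above follows directly from the grant counts. A minimal sanity-check sketch, assuming the "-2.1% vs TC avg" figure is a simple difference in percentage points (the tool's exact methodology is not stated here):

```python
# Numbers taken from the examiner statistics in this report.
granted = 199
resolved = 376

allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")  # 52.9%, shown rounded to 53% above

# Back out the implied Tech Center average from the reported -2.1 point delta.
implied_tc_avg = round(allow_rate, 1) + 2.1
print(f"Implied TC average: {implied_tc_avg:.1f}%")  # 55.0%
```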

Statute-Specific Performance

§101: 9.8% (-30.2% vs TC avg)
§103: 62.9% (+22.9% vs TC avg)
§102: 10.7% (-29.3% vs TC avg)
§112: 10.2% (-29.8% vs TC avg)
Tech Center average is an estimate • Based on career data from 376 resolved cases
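The per-statute deltas are internally consistent with a single Tech Center baseline. A small sketch, again assuming the deltas are differences in percentage points:

```python
# Rates and reported deltas vs the Tech Center average, copied from the table above.
stats = {
    "§101": (9.8, -30.2),
    "§103": (62.9, +22.9),
    "§102": (10.7, -29.3),
    "§112": (10.2, -29.8),
}

for statute, (rate, delta) in stats.items():
    implied_tc_avg = round(rate - delta, 1)
    print(f"{statute}: examiner {rate}%, implied TC average {implied_tc_avg}%")
# Every implied TC average comes out to 40.0%, consistent with the note that the
# Tech Center average is a single estimate rather than per-statute data.
```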

Office Action

§102 §103
DETAILED ACTION

Priority

This action is in response to the original filing dated 26 February 2024, which claims priority to a U.S. provisional application dated 07 April 2023. Claims 1-19 are pending and have been considered below.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claims 11, 12 and 17 are objected to because of the following informalities: claims 11, 12 and 17 refer to elements that were already established in previous claim dependencies as “a element” instead of “the element” (e.g. “a user” [claim 11] and “an emulated feature” [claims 12 and 17]). Examiner suggests amending these recitations to “the” element. Appropriate correction is required.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-10 and 12-19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Sedal et al. (US 2021/0081048 A1).

As for independent claim 1, Sedal discloses the non-transitory computer-readable storage medium comprising: when executed by an artificial-reality system that includes a wearable device, cause the artificial-reality system to perform operations including: [(e.g. see Sedal paragraph 0014) ”The wearable devices discussed above, in some instances, are worn on the user's body (e.g., a hand, an arm, a wrist, or an ankle) and can be used to stimulate areas of the body.
Moreover, the wearable device can be in communication with a remote device (e.g., a virtual reality device and/or an augmented reality device, among others), and the wearable device can stimulate the body based on an instruction from the remote device. As an example, the remote device may display media content to a user (e.g., via a head-mounted display), and the remote device may also instruct the wearable device to create haptic stimulations that correspond to the media content displayed to the user and/or other information collected by the wearable device”]. after a user has donned the wearable device on a body part of the user: [(e.g. see Sedal paragraph 0385) ”the information received from the sensors may indicate that a user has donned (or doffed) the wearable device”]. obtaining, based on data from a sensor of the wearable device, one or more fit characteristics indicating how the wearable device fits on the body part of the user [(e.g. see Sedal paragraphs 0039, 0065, 0066, 0145) ”The sensors 124 include one or more hardware devices that detect spatial and motion information about the wearable device 120. Spatial and motion information can include information about the position, orientation, velocity, rotation, and acceleration of the wearable device 120 or any subdivisions of the wearable device 120, such as fingers, fingertips, knuckles, the palm, or the wrist when the wearable device 120 is a glove. The sensors 124 may be IMUs, as discussed above with reference to the sensors 114 … systems for determining whether a wearable device is in proper contact with a user's skin or clothing, in order to provide a consistent fit and haptic feedback … worn on the user's wrist (or various other body parts) and is used to send and receive signals identifying whether one or more sensors are in direct contact with the user. 
In some embodiments, the wearable device adjusts a fit of itself, or a separate wearable structure, to provide a custom fit for the user (i.e., the fit is dynamically changed based on the present circumstances … provide optimal grounding forces and fit to a particular user (e.g., body size will change from user to user)”]. and in accordance with a determination that the user is interacting with an object within an artificial reality presented via the artificial-reality system using the wearable device, providing a fit-adjusted haptic response based (i) on the one or more fit characteristics and (ii) an emulated feature associated with the object [(e.g. see Sedal paragraphs 0066, 0156) ”The wearable device in some instances is worn on the user's wrist (or various other body parts) and is used to send and receive signals identifying whether one or more sensors are in direct contact with the user. In some embodiments, the wearable device adjusts a fit of itself, or a separate wearable structure, to provide a custom fit for the user (i.e., the fit is dynamically changed based on the present circumstances). Moreover, the wearable device can be in communication with a host system (e.g., a virtual reality device and/or an augmented reality device, among others), and the wearable device can adjust a fit of itself, or the separate wearable structure, based on instructions from the host system. 
As an example, the host system may present media to a user (e.g., may instruct a head-mounted display to display images of the user holding a cup), and the host system may also instruct the wearable device to adjust a fit of the wearable device so that haptic feedback generated by the wearable device (or, a particular structure of the wearable device) is properly applied to the user (e.g., adjust the fit so that an actuator (or some other component) of the wearable device is placed in proper contact with the user's skin) … The artificial-reality engine 134 may also provide feedback to the user that the action was performed. The provided feedback may be visual via the electronic display 112 in the head-mounted display 110 (e.g., displaying the simulated hand as it picks up and lifts the virtual coffee mug) and/or haptic feedback via the haptic assembly 122 in the wearable device 120. For example, the haptic feedback may prevent (or, at a minimum, hinder/resist movement of) one or more of the user's fingers from curling past a certain point to simulate the sensation of touching a solid coffee mug. To do this, the wearable device 120 changes (either directly or indirectly) a pressurized state of one or more of the haptic assemblies 122. Each of the haptic assemblies 122 includes a mechanism that, at a minimum, provides resistance when the respective haptic assembly 122 is transitioned from a first pressurized state (e.g., atmospheric pressure or deflated) to a second pressurized state (e.g., inflated to a threshold pressure)”].

As for dependent claim 2, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein the instructions, when executed by the artificial-reality system, further cause the artificial-reality system to perform operations including: after a second user has donned the wearable device on a body part of the second user: [(e.g.
see Sedal paragraph 0473) ”the wearable device is transferred from a first user to a second user”]. obtaining, based on data from the sensor of the wearable device, one or more second fit characteristics of the wearable device on the body part of the second user [(e.g. see Sedal paragraphs 0039, 0066, 0367) ”fit to a particular user (e.g., body size will change from user to user) … the sensor 2310 may determine (it may be determined from information collected by the sensor 2310) that a first user has a first sized finger 2308, while a second user has a second sized finger 2308 that is smaller than the first user's finger … one or more sensors are in direct contact with the user. In some embodiments, the wearable device adjusts a fit of itself, or a separate wearable structure, to provide a custom fit for the user (i.e., the fit is dynamically changed based on the present circumstances). Moreover, the wearable device can be in communication with a host system (e.g., a virtual reality device and/or an augmented reality device, among others), and the wearable device can adjust a fit of itself, or the separate wearable structure, based on instructions from the host system. As an example, the host system may present media to a user (e.g., may instruct a head-mounted display to display images of the user holding a cup), and the host system may also instruct the wearable device to adjust a fit of the wearable device so that haptic feedback generated by the wearable device (or, a particular structure of the wearable device) is properly applied to the user (e.g., adjust the fit so that an actuator (or some other component) of the wearable device is placed in proper contact with the user's skin)”]. 
and in accordance with a determination that the second user is interacting with the object within an artificial reality presented via the artificial-reality system, provide an additional fit-adjusted haptic response based on the one or more second fit characteristics, wherein the additional fit-adjusted haptic response is distinct from the fit-adjusted haptic response [(e.g. see Sedal paragraph 0367) ”the desired pressures for the first and second bladders 2204-A, 2204-B can be set (e.g., by the controller 2214) based on the size of the user's finger determined by the sensor 2310. For example, the sensor 2310 may determine (it may be determined from information collected by the sensor 2310) that a first user has a first sized finger 2308, while a second user has a second sized finger 2308 that is smaller than the first user's finger. In such an example, the desired pressures for the first and second bladders 2204-A, 2204-B, as applied to the first user, can be set lower relative to the desired pressures for the first and second bladders 2204-A, 2204-B, as applied to the second user. In this way an appropriate force is applied to the first user (and the second user) to secure the haptic-feedback mechanism 2300 to the first user's finger”].

As for dependent claim 3, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein the fit-adjusted haptic response is only provided while the user is interacting with the object [(e.g. see Sedal paragraph 0160) ”The controller 214 is configured to control operation of the pressure-changing device 210, and in turn operation of the wearable devices 120. For example, the controller 214 sends one or more signals to the pressure-changing device 210 to activate the pressure-changing device 210 (e.g., turn it on and off). The one or more signals may specify a desired pressure (e.g., pounds-per-square inch) to be output by the pressure-changing device 210.
Generation of the one or more signals, and in turn the pressure output by the pressure-changing device 210, may be based on information collected by the sensors 114 and/or the sensors 124 (FIG. 1). For example, the one or more signals may cause the pressure-changing device 210 to increase the pressure (e.g., fluid pressure) inside a first haptic assembly 122 at a first time, based on the information collected by the sensors 114 and/or the sensors 124 (e.g., the user makes contact with the artificial coffee mug)”].

As for dependent claim 4, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein: the instructions for obtaining the one or more fit characteristics include instructions for obtaining one or more zone-specific fit characteristics at each of a plurality of fit-sensing zones of the wearable device [(e.g. see Sedal paragraphs 0047, 0190) ”For example, as shown in FIG. 5, the wearable device includes a garment 202 that is a glove and five haptic assembly regions 522-A to 522-E, each corresponding to a digit (e.g., finger or thumb) of the glove. In this example, each haptic assembly region 522 corresponds to a finger region or thumb region on the garment 202 and may include one or more haptic assemblies 122 or 123, as described above … the haptic device also includes a sensor configured to measure a size the user's finger. In such embodiments, the desired pressures for the first and second bladders are set based on the size of the user's finger measured by the sensor. In some embodiments, the sensor is configured to measure a grounding force applied to the user and said measurements are used to adaptively adjust the desire pressures for the first and second bladders to obtain a desired comfortable grounding force”].
the instructions providing the fit-adjusted haptic response include instructions for providing a respective zone-specific fit-adjusted haptic response at each of selected fit-sensing zones of the plurality of fit-sensing zones of the wearable device [(e.g. see Sedal paragraph 0196) ”In conjunction with displaying the visual data, one or more bladders of the wearable device are inflated or deflated to the pressure (as noted above). As an example, the wearable device may include one or more haptic assemblies 122 or 123 coupled to a garment 202. Each haptic assembly 122 or 123 includes: includes (i) a bladder 206, and (ii) a support structure 204 or 404 attached to a portion of the bladder, where the bladder is pneumatically coupled to the pressure-changing device 210 that is configured to control a pressurized state of the bladder”]. wherein: the selected fit-sensing zones correspond to areas of the wearable device determined to be in simulated contact with the object when the fit-adjusted haptic response is provided [(e.g. see Sedal paragraph 0160) ”the one or more signals may cause the pressure-changing device 210 to increase the pressure (e.g., fluid pressure) inside a first haptic assembly 122 at a first time, based on the information collected by the sensors 114 and/or the sensors 124 (e.g., the user makes contact with the artificial coffee mug). Then, the controller may send one or more additional signals to the pressure-changing device 210 that cause the pressure-changing device 210 to further increase the pressure inside the first haptic assembly 122 at a second time after the first time, based on additional information collected by the sensors 114 and/or sensors 124 (e.g., the user grasps and lifts the artificial coffee mug). Further, the one or more signals may cause the pressure-changing device 210 to inflate one or more bladders 206 in a first wearable device 120-A”]. 
As for dependent claim 5, Sedal discloses the medium as described in claim 4 and Sedal further discloses: wherein each respective zone-specific fit-adjusted haptic response is based on (i) one or more zone specific fit characteristics [(e.g. see Sedal paragraph 0047) ”the haptic device also includes a sensor configured to measure a size the user's finger. In such embodiments, the desired pressures for the first and second bladders are set based on the size of the user's finger measured by the sensor. In some embodiments, the sensor is configured to measure a grounding force applied to the user and said measurements are used to adaptively adjust the desire pressures for the first and second bladders to obtain a desired comfortable grounding force”].

As for dependent claim 6, Sedal discloses the medium as described in claim 4 and Sedal further discloses: wherein the instructions for providing the fit-adjusted haptic response include instructions for each respective zone-specific fit-adjusted haptic response, include: activating two or more haptic feedback generating components within the respective zone of the plurality of fit-sensing zones in accordance with the respective zone-specific fit-adjusted haptic response [(e.g. see Sedal paragraph 0365) ”The first bladder 2204-A is configured to (i) inflate in response to receiving a fluid from the fluid source and (ii) tighten around the distal phalange of the user's finger 2308 when inflated to a desired pressure. Similarly, the second bladder 2204-B is configured to (i) inflate in response to receiving the fluid from the source and (ii) tighten around the joint connecting the distal phalange and the intermediate phalange of the user's finger when inflated to a desired pressure. In doing so, the first bladder 2204-A and the second bladder 2204-B secure the housing 2202 to the user's finger 2308 (i.e., the housing 2202, and the haptic-feedback mechanism 2300 as a whole, are grounded to the user's body).
In some embodiments, the first bladder 2204-A and the second bladder 2204-B are inflated to the same pressure (i.e., the desired pressures are the same). In other embodiments, the first bladder 2204-A and the second bladder 2204-B are inflated to a different pressure (i.e., the desired pressures differ)”].

As for dependent claim 7, Sedal discloses the medium as described in claim 6 and Sedal further discloses: wherein the two or more haptic feedback generating components within the respective zone of the plurality of fit-sensing zones are different from each other, allowing for nuanced zone-specific fit-adjusted haptic responses [(e.g. see Sedal paragraph 0190) ”For example, as shown in FIG. 5, the wearable device includes a garment 202 that is a glove and five haptic assembly regions 522-A to 522-E, each corresponding to a digit (e.g., finger or thumb) of the glove. In this example, each haptic assembly region 522 corresponds to a finger region or thumb region on the garment 202 and may include one or more haptic assemblies 122 or 123, as described above. In particular, a haptic assembly 122 or 123 may be positioned on a palm region of the user's hand, or on palmar portions of the user's fingers or thumb (e.g., any of haptic assembly regions 522-A to 522-E). Thus, each of these regions of the user's body can experience one or more haptic stimulations. In some embodiments, one or more types of haptic assemblies 122 or 123 may be included in the wearable device 120. For example, haptic assembly region 522-A, which corresponds to a thumb region, may include one or more haptic assemblies 122. In contrast, haptic assembly region 522-B, which corresponds to a forefinger region, may include one or more haptic assemblies 123 and haptic assembly region 522-D, which corresponds to a ring finger region, may include a combination of haptic assemblies 122 and 123.
Note that various other combinations of haptic assemblies 122/123 can be used in the haptic assembly regions 522”].

As for dependent claim 8, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein the fit-adjusted haptic response is provided via a haptic-feedback generator integrated into the wearable device [(e.g. see Sedal paragraph 0218) ”the haptic feedback systems may be incorporated with the artificial-reality systems (e.g., systems 700, 800, and 900 may include the wearable device 120 shown in FIG. 1). Haptic feedback systems may provide various types of cutaneous feedback, including vibration, force, traction, shear, texture, and/or temperature. Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance. Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms”].

As for dependent claim 9, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein obtaining one or more fit characteristics indicating how the wearable device fits on the body part of the user is obtained by recording data from a sensor different from a component that provided the fit-adjusted haptic response [(e.g. see Sedal paragraphs 0065, 0066) ”determining whether a wearable device is in proper contact with a user's skin or clothing, in order to provide a consistent fit and haptic feedback. Embodiments herein are directed toward a sensor system that employs transmit and receive electrodes to determine whether contact (and, in some cases, the quality of the contact) is made between the wearable device and the user … The wearable device in some instances is worn on the user's wrist (or various other body parts) and is used to send and receive signals identifying whether one or more sensors are in direct contact with the user.
In some embodiments, the wearable device adjusts a fit of itself, or a separate wearable structure, to provide a custom fit for the user (i.e., the fit is dynamically changed based on the present circumstances). Moreover, the wearable device can be in communication with a host system (e.g., a virtual reality device and/or an augmented reality device, among others), and the wearable device can adjust a fit of itself, or the separate wearable structure, based on instructions from the host system. As an example, the host system may present media to a user (e.g., may instruct a head-mounted display to display images of the user holding a cup), and the host system may also instruct the wearable device to adjust a fit of the wearable device so that haptic feedback generated by the wearable device (or, a particular structure of the wearable device) is properly applied to the user (e.g., adjust the fit so that an actuator (or some other component) of the wearable device is placed in proper contact with the user's skin)”]. As for dependent claim 10, Sedal discloses the medium as described in claim 9 and Sedal further discloses: wherein the sensor is an inertial measurement unit sensor, wherein data from the inertial measurement sensor can be used to determine performance of the fit-adjusted haptic response [(e.g. see Sedal paragraphs 0141, 0145) ”the sensors 114 may include one or more inertial measurement units (IMUs) … The sensors 124 include one or more hardware devices that detect spatial and motion information about the wearable device 120. Spatial and motion information can include information about the position, orientation, velocity, rotation, and acceleration of the wearable device 120 or any subdivisions of the wearable device 120, such as fingers, fingertips, knuckles, the palm, or the wrist when the wearable device 120 is a glove. The sensors 124 may be IMUs, as discussed above with reference to the sensors 114”]. 
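The fit-adjustment logic the examiner maps across claims 1-10 (obtain fit characteristics from a sensor after the device is donned, then scale the haptic response by both the fit and the emulated feature of the object being touched) can be sketched as follows. This is an illustrative reading only; every name and formula here (FitCharacteristics, fit_adjusted_response, the compensation factor) is hypothetical and appears in neither the application nor Sedal.

```python
from dataclasses import dataclass

@dataclass
class FitCharacteristics:
    """Hypothetical fit data read from the wearable's sensors."""
    contact_quality: float  # 0.0 (loose) .. 1.0 (snug), e.g. from contact electrodes
    body_part_size: float   # normalized size estimate, e.g. finger circumference

def fit_adjusted_response(fit: FitCharacteristics,
                          emulated_feature_stiffness: float,
                          interacting: bool) -> float:
    """Return a target actuator intensity; zero when no interaction is detected."""
    if not interacting:
        # Claim 3's limitation: the response is only provided during interaction.
        return 0.0
    # Assumed heuristic: a looser fit needs more actuator output to deliver the
    # same perceived force, and a stiffer emulated feature (a solid mug vs. a
    # soft ball) warrants a stronger response.
    compensation = 1.0 + (1.0 - fit.contact_quality)
    return emulated_feature_stiffness * compensation

fit = FitCharacteristics(contact_quality=0.5, body_part_size=0.8)
print(fit_adjusted_response(fit, emulated_feature_stiffness=2.0, interacting=True))   # 3.0
print(fit_adjusted_response(fit, emulated_feature_stiffness=2.0, interacting=False))  # 0.0
```

A second user with different fit characteristics would yield a distinct response from the same call, which is the behavior the rejection reads onto dependent claim 2.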
As for dependent claim 12, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein the instructions that, when executed by the artificial-reality system, further cause the artificial-reality system to perform operations includes: after providing the fit-adjusted haptic response based on the one or more fit characteristics and an emulated features associated with the object: obtaining an additional one or more fit characteristics indicating how the wearable device fits on the body part of the user; in accordance with a determination that the user is interacting with the object within the artificial reality using the wearable device, providing another fit-adjusted haptic response based on the additional one or more fit characteristics and the emulated feature associated with the object [(e.g. see Sedal paragraph 0160) ”The one or more signals may specify a desired pressure (e.g., pounds-per-square inch) to be output by the pressure-changing device 210. Generation of the one or more signals, and in turn the pressure output by the pressure-changing device 210, may be based on information collected by the sensors 114 and/or the sensors 124 (FIG. 1). For example, the one or more signals may cause the pressure-changing device 210 to increase the pressure (e.g., fluid pressure) inside a first haptic assembly 122 at a first time, based on the information collected by the sensors 114 and/or the sensors 124 (e.g., the user makes contact with the artificial coffee mug). Then, the controller may send one or more additional signals to the pressure-changing device 210 that cause the pressure-changing device 210 to further increase the pressure inside the first haptic assembly 122 at a second time after the first time, based on additional information collected by the sensors 114 and/or sensors 124 (e.g., the user grasps and lifts the artificial coffee mug)”]. 
As for dependent claim 13, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein: the wearable device is a wearable-glove device [(e.g. see Sedal paragraph 0190 and Fig. 5) ”the wearable device includes a garment 202 that is a glove and five haptic assembly regions 522-A to 522-E, each corresponding to a digit (e.g., finger or thumb) of the glove”]. the one or more fit characteristics indicating how the wearable device fits on the body part of the user is obtained via an inertial measurement unit (IMU) located on different parts of the wearable-glove device [(e.g. see Sedal paragraph 0145) ”The sensors 124 include one or more hardware devices that detect spatial and motion information about the wearable device 120. Spatial and motion information can include information about the position, orientation, velocity, rotation, and acceleration of the wearable device 120 or any subdivisions of the wearable device 120, such as fingers, fingertips, knuckles, the palm, or the wrist when the wearable device 120 is a glove. The sensors 124 may be IMUs, as discussed above with reference to the sensors 114”]. the fit-adjusted haptic response is provided by a haptic feedback generator, wherein the haptic feedback generator is configured to alter its feedback or change its shape [(e.g. see Sedal paragraph 0157) ”The haptic assemblies 122 are configured to conform to a shape of the portion of the wearer's body when in the first pressurized state … once in the second pressurized state, the haptic assemblies 122 may take different shapes”].

As for dependent claim 14, Sedal discloses the medium as described in claim 13 and Sedal further discloses: wherein the wearable-glove device includes a bladder that is configured to expand and contract and causes the haptic feedback generator to move closer or away from the body part of the user [(e.g.
see Sedal paragraphs 0039, 0066) ”One example of active grounding involves bladders, which can be inflated or deflated to attach or detach a wearable device to a user's body Active grounding devices can be computer controlled, meaning that said devices can be controlled to provide optimal grounding forces and fit to a particular user … The host system may also instruct the wearable device to adjust a fit of the wearable device so that haptic feedback generated by the wearable device (or, a particular structure of the wearable device) is properly applied to the user (e.g., adjust the fit so that an actuator (or some other component) of the wearable device is placed in proper contact with the user's skin)”].

As for dependent claim 15, Sedal discloses the medium as described in claim 13 and Sedal further discloses: wherein the wearable-glove device includes a bifurcated finger-tip sensor configured to detect forces acting on a tip of a finger [(e.g. see Sedal paragraph 0145) ”The sensors 124 include one or more hardware devices that detect spatial and motion information about the wearable device 120. Spatial and motion information can include information about the position, orientation, velocity, rotation, and acceleration of the wearable device 120 or any subdivisions of the wearable device 120, such as fingers, fingertips, knuckles, the palm, or the wrist when the wearable device 120 is a glove. The sensors 124 may be IMUs, as discussed above with reference to the sensors 114”].

As for dependent claim 16, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein the fit-adjusted haptic response is provided via an inflatable bubble array or a vibrational motor [(e.g.
see Sedal paragraphs 0168, 0218) ”Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms … Various haptic assembly 122 configurations may be used, and each of the haptic assemblies 122 is configured to create one or more haptic stimulations when the bladder 206 is pressurized. Additionally, the various bladders 206 may be designed to create haptic stimulations by way of positive pressure and/or negative pressure. “Haptic stimulations” (e.g., tactile feedback and/or haptic feedback) include but are not limited to a touch stimulation, a swipe stimulation, a pull stimulation, a push stimulation, a rotation stimulation, a heat stimulation, a pulsating stimulation, a vibration stimulation, and/or a pain stimulation”].

As for dependent claim 17, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein the instructions that, when executed by the artificial-reality system further cause the artificial-reality system to perform operations including: after providing the fit-adjusted haptic response based on the one or more fit characteristics and an emulated feature associated with the object: in accordance with a determination that the user is interacting with another object within the artificial reality using the wearable device, providing another fit-adjusted haptic response based on the one or more fit characteristics and an emulated feature associated with the other object [(e.g. see Sedal paragraphs 0005, 0264) ”artificial-reality scenes that involve grasping (or other similar interactions) with virtual objects … In another example, the haptic-feedback mechanism 1022 may simulate the sensation a user's finger (or fingers) touching and otherwise interacting with a solid object, such as a glass of water.
Specifically, the haptic-feedback mechanism 1022 is capable of creating forces on finger phalanges, as one example, in directions that are very similar to the forces induced by physical objects during natural hand-object interaction (i.e., simulate the forces that would actually be felt by a user when he or she touches, lifts, and empties a full glass of water in the real world). To do this, the wearable device 1020 and/or the computer system 1030 changes (either directly or indirectly) a pressurized state inside one or more channels 1104 of the haptic-feedback mechanism 1022. In particular, one or more first channels are pressurized during a first stage of the interaction (e.g., grasping the glass of water) to render contact normal forces proportional to a grasping force, while one or more second channels are pressurized during a second stage of the interaction (e.g., lifting the glass of water) to render shear forces proportional to the weight and inertia of the glass. Finally, one or more third channels are pressurized during a third stage of the interaction (e.g., pouring the water from the glass) to render shear forces proportional to the weight of the glass being emptied. Importantly, with the last step, the shear forces are changed dynamically based on the rate at which the glass is being emptied”].

Examiner notes that the gloves can provide a haptic response for a virtual glass of water in addition to a virtual coffee mug.

As for dependent claim 18, Sedal discloses the medium as described in claim 1 and Sedal further discloses: wherein the artificial-reality system includes a head-worn wearable device configured to display the object within the artificial reality [(e.g. see Sedal paragraphs 0014, 0066 and Figs.
8-9) “the remote device may display media content to a user (e.g., via a head-mounted display), and the remote device may also instruct the wearable device to create haptic stimulations that correspond to the media content displayed to the user and/or other information collected by the wearable device … the wearable device can be in communication with a host system (e.g., a virtual reality device and/or an augmented reality device)”].

As for independent claim 19, Sedal discloses a device. Claim 19 recites substantially the same limitations as claim 1. Therefore, it is rejected with the same rationale as claim 1.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Sedal et al. (US 2021/0081048 A1) in view of Komaki (US 2017/0150897 A1).
As for dependent claim 11, Sedal teaches the medium as described in claim 1, but does not specifically teach wherein the instructions that, when executed by the artificial-reality system, further cause the artificial-reality system to perform operations including: after a user has donned the wearable device on a body part of the user: obtaining one or more fit characteristics indicating how the wearable device fits on the body part of the user and in accordance with a determination that the one or more fit characteristics indicate that the wearable device is properly affixed to the body part of the user, forgoing adjusting the fit-adjusted haptic response based on the one or more fit characteristics.

However, in the same field of invention or solving similar problems, Komaki teaches: wherein the instructions that, when executed by the artificial-reality system, further cause the artificial-reality system to perform operations including: after a user has donned the wearable device on a body part of the user: obtaining one or more fit characteristics indicating how the wearable device fits on the body part of the user and in accordance with a determination that the one or more fit characteristics indicate that the wearable device is properly affixed to the body part of the user, forgoing adjusting the fit-adjusted haptic response based on the one or more fit characteristics [(e.g. see Komaki paragraphs 0128, 0141 and Fig. 17) “When a user puts the goggles on such that the face firmly contacts the four electro-oculographic detection electrodes (EOG electrodes 151a, 151b, 152a, and 152b) on the frame 110, the user stably maintains the contact (ST10) … If the calibration is not performed (No in ST12), whether or not calibration data of the current user are registered in the memory bank (11b of FIG. 19) is checked (ST40).
The check is performed through an AR display of registered user names, for example (and the AR display is performed in, for example, film liquid crystal displays 12L/12R in FIGS. 18 to 21). If there is the name of the current user (which may be a nickname or a user ID code) in the AR display (Yes in ST40), the user refers to the name and selects the item by, for example, closing of the eyes for a few seconds. The calibration data related to the selected item are then used (ST42). If there is not the user name in the AR display (No in ST40), the calibration is skipped and the goggles are used in preliminarily set default condition (ST44)”].

Therefore, considering the teachings of Sedal and Komaki, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to add wherein the instructions that, when executed by the artificial-reality system, further cause the artificial-reality system to perform operations including: after a user has donned the wearable device on a body part of the user: obtaining one or more fit characteristics indicating how the wearable device fits on the body part of the user and in accordance with a determination that the one or more fit characteristics indicate that the wearable device is properly affixed to the body part of the user, forgoing adjusting the fit-adjusted haptic response based on the one or more fit characteristics, as taught by Komaki, to the teachings of Sedal because registering calibration data allows different users to quickly switch using the device without having to go through the calibration process every time (e.g. see Komaki paragraphs 0140, 0141).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. U.S. Patent 10,133,358 B1 issued to Chen et al. on 20 November 2018. The subject matter disclosed therein is pertinent to that of claims 1-19 (e.g. measuring the fit of a device using an IMU). U.S.
PGPub 2021/0096649 A1 published to Mok on 01 April 2021. The subject matter disclosed therein is pertinent to that of claims 1-19 (e.g. VR haptic feedback glove for interaction with displayed objects).

Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER J FIBBI whose telephone number is (571) 270-3358. The examiner can normally be reached Monday - Thursday (8am-6pm). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, William Bashore, can be reached at (571) 272-4088. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHRISTOPHER J FIBBI/
Primary Examiner, Art Unit 2174

Prosecution Timeline

Feb 26, 2024
Application Filed
Dec 19, 2025
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585866
AUTOMATED ENTRY OF EXTRACTED DATA AND VERIFICATION OF ACCURACY OF ENTERED DATA THROUGH A GRAPHICAL USER INTERFACE
2y 5m to grant Granted Mar 24, 2026
Patent 12561152
METHODS AND SYSTEMS FOR ADAPTIVE CONFIGURATION
2y 5m to grant Granted Feb 24, 2026
Patent 12535930
INTEROPERABILITY FOR TRANSLATING AND TRAVERSING 3D EXPERIENCES IN AN ACCESSIBILITY ENVIRONMENT
2y 5m to grant Granted Jan 27, 2026
Patent 12535941
USER INTERFACE FOR MANAGING INPUT TECHNIQUES
2y 5m to grant Granted Jan 27, 2026
Patent 12519999
Location Based Playback System Control
2y 5m to grant Granted Jan 06, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

1-2
Expected OA Rounds
53%
Grant Probability
90%
With Interview (+37.6%)
4y 3m
Median Time to Grant
Low
PTA Risk
Based on 376 resolved cases by this examiner. Grant probability derived from career allow rate.
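The projection arithmetic above is simple enough to sketch: the 53% baseline is the examiner's career allow rate (199 granted / 376 resolved ≈ 52.9%), and the with-interview figure adds the observed interview lift to that baseline. Below is a minimal Python sketch assuming an additive-lift model capped at 100%; the function name and the model itself are illustrative assumptions, not the tool's documented methodology.

```python
def project_grant_probability(granted: int, resolved: int,
                              interview_lift_pct: float = 0.0) -> float:
    """Projected grant probability (%) under an assumed additive-lift model.

    Baseline is the examiner's career allow rate; conducting an interview
    adds a fixed percentage-point lift, capped at 100%.
    """
    base = 100.0 * granted / resolved
    return min(base + interview_lift_pct, 100.0)

# Figures from this examiner's record: 199 granted / 376 resolved,
# +37.6-point interview lift.
base = project_grant_probability(199, 376)          # ~52.9, displayed as 53%
lifted = project_grant_probability(199, 376, 37.6)  # ~90.5, displayed as 90%
print(f"base={base:.1f}% with_interview={lifted:.1f}%")
```

Display-rounding aside (52.9% shown as 53%, 90.5% shown as 90%), this reproduces the headline figures; the cap matters only for examiners whose allow rate plus lift would exceed 100%.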
