Prosecution Insights
Last updated: April 19, 2026
Application No. 18/790,936

Barometric Sensing of Arm Position In a Pointing Controller System

Final Rejection §103

Filed: Jul 31, 2024
Examiner: ALMEIDA, CORY A
Art Unit: 2628
Tech Center: 2600 — Communications
Assignee: Arkh Litho Holdings LLC
OA Round: 2 (Final)

Grant Probability: 67% (Favorable)
Expected OA Rounds: 3-4
Median Time to Grant: 2y 10m
Grant Probability With Interview: 89%

Examiner Intelligence

Career Allow Rate: 67% (528 granted / 790 resolved; +4.8% vs TC avg) — above average
Interview Lift: +22.5% among resolved cases with interview — a strong lift
Avg Prosecution: 2y 10m typical timeline (22 applications currently pending)
Total Applications: 812 across all art units (career history)

Statute-Specific Performance

§101: 1.7% (-38.3% vs TC avg)
§103: 56.9% (+16.9% vs TC avg)
§102: 30.1% (-9.9% vs TC avg)
§112: 7.1% (-32.9% vs TC avg)

Tech Center averages are estimates. Based on career data from 790 resolved cases.
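
The page does not state what the per-statute rates measure, but the "vs TC avg" deltas follow simple arithmetic. A minimal sketch, assuming delta = examiner rate minus Tech Center average (the dictionary names and that assumption are ours, not from the page), which also recovers the implied TC baselines:

```python
# Figures copied from the table above; only the subtraction is assumed.
examiner_rate = {"101": 1.7, "103": 56.9, "102": 30.1, "112": 7.1}    # percent
delta_vs_tc   = {"101": -38.3, "103": 16.9, "102": -9.9, "112": -32.9}

for statute, rate in examiner_rate.items():
    tc_avg = rate - delta_vs_tc[statute]  # assumed: delta = examiner - TC avg
    print(f"§{statute}: examiner {rate:.1f}%, implied TC avg {tc_avg:.1f}%")
```

Under that assumption every statute's implied baseline works out to 40.0%, which suggests the page compares against a single estimated Tech Center figure rather than per-statute averages.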

Office Action

§103

DETAILED ACTION

Status of the Claims

The response filed 10/23/25 is entered. Claims 2-11 and 20-21 are amended. Claims 1 and 18-19 are canceled. Claims 22-23 are new. Claims 2-17 and 20-23 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed on 10/23/25 have been fully considered, but they are directed to newly amended claims and are therefore believed to be answered by, and thus moot in view of, the new grounds of rejection presented below.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2-10, 12-14, and 20-23 are rejected under 35 U.S.C. 103 as being unpatentable over Bradner, US-20190290999, in view of Pahlavan, US-20030142065, in further view of Huang, US-20130324254.

In regards to claim 2, Bradner discloses a method for controlling interactions with virtual objects in an augmented reality environment using a pointing controller (Par. 0003-0004 controller for virtual reality, i.e. augmented reality) having a wearable form factor suitable for wearing by one or more fingers of a user (Fig. 4, 100 controller worn with fingers through the opening) and the pointing controller having a touch-sensitive interface (Fig. 4, 116 track pad) accessible when the pointing controller is worn by the one or more fingers (Fig. 4, 116 track pad), the method comprising: displaying the virtual object on an augmented reality display device (Fig. 30, 3006 virtual object); obtaining sensor data from an inertial measurement unit of the pointing controller (Par. 0093 and 0199 IMU and other sensors determining motion of the controller; Fig. 30, 3006 virtual object; Par. 0207 "the controller 100 and/or the connected display device may analyze the sensor data to determine that the user intends to grasp the virtual object"); controlling a pointing direction of a pointing vector through a three-dimensional virtual space based on the sensor data from the inertial measurement unit such that the pointing vector originates at a simulated position of the pointing controller in the three-dimensional virtual space and the pointing direction changes responsive to a pointing direction of the one or more fingers wearing the pointing controller (Par. 0093 and 0199 IMU and other sensors determining motion of the controller, which would be from a starting position in 3d space and resultant simulated 3d position and in a direction based on the movement and pointing direction of the hand and associated fingers wearing the controller); detecting a pointing-based interaction of the pointing vector with the virtual object in the three-dimensional virtual space (Par. 0093 and 0199 IMU and other sensors determining motion of the controller; Fig. 30, 3006 virtual object; Par. 0207 "the controller 100 and/or the connected display device may analyze the sensor data to determine that the user intends to grasp the virtual object"); placing the virtual object in a grabbed state based at least in part on detecting the pointing-based interaction (Par. 0093 and 0199 IMU and other sensors determining motion of the controller; Fig. 30, 3006 virtual object; Par. 0207 "the controller 100 and/or the connected display device may analyze the sensor data to determine that the user intends to grasp the virtual object"); detecting a touch-based gesture applied to the touch-sensitive interface (Fig. 4, 116 track pad) of the pointing controller while the virtual object is in the grabbed state (Fig. 4, 116 track pad receives input while virtual object is grabbed).

Bradner does not disclose expressly a pointing controller having a ring form factor suitable for wearing on an index finger of a user and the pointing controller having a touch-sensitive interface on an exterior convex surface that is accessible to a thumb on the same hand where the pointing controller is worn on the index finger; pointing direction of the index finger wearing the pointing controller having the ring form factor; detecting a touch-based swipe gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor; responsive to detecting the touch-based gesture, displaying via the augmented reality display device, simulated movement of the virtual object through the three-dimensional virtual space along the pointing vector.

Pahlavan discloses a pointing controller having a ring form factor suitable for wearing on an index finger of a user and the pointing controller (Par. 0022 "ring-like pointing device using inertial sensors to detect motion along and about different axes of the three dimensional space") having a touch-sensitive interface on an exterior convex surface (Par. 0035 "said ring is equipped with a number of buttons or touch sensors on its periphery") that is accessible to a thumb on the same hand (Par. 0026 thumb provides input) where the pointing controller is worn on the index finger (Par. 0026 "the ring is worn on the second segment of the index finger"); pointing direction of the index finger wearing the pointing controller having the ring form factor (Par. 0042 "there can be as many as 3 accelerometers and 3 rate gyros, or any other combination of them that can provide the inertial information needed for controlling position and motion of a three-dimensional object in a three-dimensional space."); detecting a touch-based gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor (Par. 0035 "said ring is equipped with a number of buttons or touch sensors on its periphery", which detect input, i.e., a gesture).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art that the wearable device of Bradner can be placed in a ring form factor as Pahlavan discloses. The motivation for doing so would have been to provide a small form factor wearable device.

Bradner and Pahlavan do not disclose expressly detecting a touch-based swipe gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor; responsive to detecting the touch-based gesture, displaying via the augmented reality display device, simulated movement of the virtual object through the three-dimensional virtual space along the pointing vector.

Huang discloses a three-dimensional multi-positional controller (Fig. 2A; Par. 0005) comprising: detecting a touch-based swipe gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear); responsive to detecting the touch-based gesture, displaying via the augmented reality display device, simulated movement of the virtual object through the three-dimensional virtual space along the pointing vector (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 20, Bradner discloses a non-transitory computer-readable storage medium storing instructions (Par. 0202-0203 memory and processors for a controller) for controlling interactions with a virtual object in an augmented reality environment using a pointing controller (Par. 0003-0004 controller for virtual reality, i.e. augmented reality) having a wearable form factor suitable for wearing by one or more fingers of a user (Fig. 4, 100 controller worn with fingers through the opening) and the pointing controller having a touch-sensitive interface (Fig. 4, 116 track pad) accessible when the pointing controller is worn by the one or more fingers (Fig. 4, 116 track pad), the instructions when executed by one or more processors causing the one or more processors to perform steps including: displaying the virtual object on an augmented reality display device (Fig. 30, 3006 virtual object); obtaining sensor data from an inertial measurement unit of the pointing controller (Par. 0093 and 0199 IMU and other sensors determining motion of the controller; Fig. 30, 3006 virtual object; Par. 0207 "the controller 100 and/or the connected display device may analyze the sensor data to determine that the user intends to grasp the virtual object"); controlling a pointing direction of a pointing vector through a three-dimensional virtual space based on the sensor data from the inertial measurement unit such that the pointing vector originates at a simulated position of the pointing controller in the three-dimensional virtual space and the pointing direction changes responsive to a pointing direction of the one or more fingers wearing the pointing controller (Par. 0093 and 0199 IMU and other sensors determining motion of the controller, which would be from a starting position in 3d space and resultant simulated 3d position and in a direction based on the movement and pointing direction of the hand and associated fingers wearing the controller); detecting a pointing-based interaction of the pointing vector with the virtual object in the three-dimensional virtual space (Fig. 30, 3006 virtual object; Par. 0207 "the controller 100 and/or the connected display device may analyze the sensor data to determine that the user intends to grasp the virtual object"); placing the virtual object in a grabbed state based at least in part on detecting the pointing-based interaction (Fig. 30, 3006 virtual object; Par. 0207 "the controller 100 and/or the connected display device may analyze the sensor data to determine that the user intends to grasp the virtual object"); detecting a touch-based gesture applied to the touch-sensitive interface (Fig. 4, 116 track pad) of the pointing controller while the virtual object is in the grabbed state (Fig. 4, 116 track pad receives input while virtual object is grabbed).

Bradner does not disclose expressly a pointing controller having a ring form factor suitable for wearing on an index finger of a user and the pointing controller having a touch-sensitive interface on an exterior convex surface that is accessible to a thumb on the same hand where the pointing controller is worn on the index finger; pointing direction of the index finger wearing the pointing controller having the ring form factor; detecting a touch-based swipe gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor; responsive to detecting the touch-based gesture, displaying via the augmented reality display device, simulated movement of the virtual object through the three-dimensional virtual space along the pointing vector.

Pahlavan discloses a pointing controller having a ring form factor suitable for wearing on an index finger of a user and the pointing controller (Par. 0022 "ring-like pointing device using inertial sensors to detect motion along and about different axes of the three dimensional space") having a touch-sensitive interface on an exterior convex surface (Par. 0035 "said ring is equipped with a number of buttons or touch sensors on its periphery") that is accessible to a thumb on the same hand (Par. 0026 thumb provides input) where the pointing controller is worn on the index finger (Par. 0026 "the ring is worn on the second segment of the index finger"); pointing direction of the index finger wearing the pointing controller having the ring form factor (Par. 0042 "there can be as many as 3 accelerometers and 3 rate gyros, or any other combination of them that can provide the inertial information needed for controlling position and motion of a three-dimensional object in a three-dimensional space."); detecting a touch-based gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor (Par. 0035 "said ring is equipped with a number of buttons or touch sensors on its periphery", which detect input, i.e., a gesture).

Bradner and Pahlavan do not disclose expressly detecting a touch-based swipe gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor; responsive to detecting the touch-based gesture, displaying via the augmented reality display device, simulated movement of the virtual object through the three-dimensional virtual space along the pointing vector.

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art that the wearable device of Bradner can be placed in a ring form factor as Pahlavan discloses. The motivation for doing so would have been to provide a small form factor wearable device.

Huang discloses a three-dimensional multi-positional controller (Fig. 2A; Par. 0005) comprising: detecting a touch-based swipe gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear); responsive to detecting the touch-based gesture, displaying via the augmented reality display device, simulated movement of the virtual object through the three-dimensional virtual space along the pointing vector (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 21, Bradner discloses a system for controlling interactions with a virtual object in an augmented reality environment (Par. 0003-0004 controller for virtual reality, i.e. augmented reality), the system comprising: a pointing controller having a wearable form factor suitable for wearing by one or more fingers of a user (Fig. 4, 100 controller worn with fingers through the opening) and the pointing controller having a touch-sensitive interface (Fig. 4, 116 track pad) accessible when the pointing controller is worn by the one or more fingers (Fig. 4, 116 track pad); a non-transitory computer-readable storage medium storing instructions for executing by one or more processors (Par. 0202-0203 memory and processors for a controller), the instructions when executed causing the one or more processors to perform steps comprising: displaying the virtual object on an augmented reality display device (Fig. 30, 3006 virtual object); obtaining sensor data from an inertial measurement unit of the pointing controller (Par. 0093 and 0199 IMU and other sensors determining motion of the controller; Fig. 30, 3006 virtual object; Par. 0207 "the controller 100 and/or the connected display device may analyze the sensor data to determine that the user intends to grasp the virtual object"); controlling a pointing direction of a pointing vector through a three-dimensional virtual space based on the sensor data from the inertial measurement unit such that the pointing vector originates at a simulated position of the pointing controller in the three-dimensional virtual space and the pointing direction changes responsive to a pointing direction of the one or more fingers wearing the pointing controller (Par. 0093 and 0199 IMU and other sensors determining motion of the controller, which would be from a starting position in 3d space and resultant simulated 3d position and in a direction based on the movement and pointing direction of the hand and associated fingers wearing the controller); detecting a pointing-based interaction of the pointing vector with the virtual object in the three-dimensional virtual space (Fig. 30, 3006 virtual object; Par. 0207 "the controller 100 and/or the connected display device may analyze the sensor data to determine that the user intends to grasp the virtual object"); placing the virtual object in a grabbed state based at least in part on detecting the pointing-based interaction (Fig. 30, 3006 virtual object; Par. 0207 "the controller 100 and/or the connected display device may analyze the sensor data to determine that the user intends to grasp the virtual object"); detecting a touch-based gesture applied to the touch-sensitive interface (Fig. 4, 116 track pad) of the pointing controller while the virtual object is in the grabbed state (Fig. 4, 116 track pad receives input while virtual object is grabbed).

Bradner does not disclose expressly a pointing controller having a ring form factor suitable for wearing on an index finger of a user and the pointing controller having a touch-sensitive interface on an exterior convex surface that is accessible to a thumb on the same hand where the pointing controller is worn on the index finger; pointing direction of the index finger wearing the pointing controller having the ring form factor; detecting a touch-based swipe gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor; responsive to detecting the touch-based gesture, displaying via the augmented reality display device, simulated movement of the virtual object through the three-dimensional virtual space along the pointing vector.

Pahlavan discloses a pointing controller having a ring form factor suitable for wearing on an index finger of a user and the pointing controller (Par. 0022 "ring-like pointing device using inertial sensors to detect motion along and about different axes of the three dimensional space") having a touch-sensitive interface on an exterior convex surface (Par. 0035 "said ring is equipped with a number of buttons or touch sensors on its periphery") that is accessible to a thumb on the same hand (Par. 0026 thumb provides input) where the pointing controller is worn on the index finger (Par. 0026 "the ring is worn on the second segment of the index finger"); pointing direction of the index finger wearing the pointing controller having the ring form factor (Par. 0042 "there can be as many as 3 accelerometers and 3 rate gyros, or any other combination of them that can provide the inertial information needed for controlling position and motion of a three-dimensional object in a three-dimensional space."); detecting a touch-based gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor (Par. 0035 "said ring is equipped with a number of buttons or touch sensors on its periphery", which detect input, i.e., a gesture).

Bradner and Pahlavan do not disclose expressly detecting a touch-based swipe gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor; responsive to detecting the touch-based gesture, displaying via the augmented reality display device, simulated movement of the virtual object through the three-dimensional virtual space along the pointing vector.

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art that the wearable device of Bradner can be placed in a ring form factor as Pahlavan discloses. The motivation for doing so would have been to provide a small form factor wearable device.

Huang discloses a three-dimensional multi-positional controller (Fig. 2A; Par. 0005) comprising: detecting a touch-based swipe gesture applied by the thumb to the touch-sensitive interface on the exterior convex surface of the pointing controller having the ring form factor (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear); responsive to detecting the touch-based gesture, displaying via the augmented reality display device, simulated movement of the virtual object through the three-dimensional virtual space along the pointing vector (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 3, Bradner, Pahlavan, and Huang, as combined above, disclose the invention of claim 2. Huang further discloses detecting the touch-based gesture and displaying the simulated movement of the virtual object comprises: displaying the simulated movement of the virtual object in a forward direction away from the simulated position of the pointing controller in response to the swipe gesture being associated with the forward direction (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear) or displaying the simulated movement of the virtual object in a backward direction towards the simulated position of the pointing controller in response to the swipe gesture being associated with the backward direction (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 4, Bradner, Pahlavan, and Huang, as combined above, disclose the invention of claim 2. Huang further discloses detecting the touch-based gesture and displaying the simulated movement of the virtual object comprises: detecting a detected swipe distance of the touch-based swipe gesture (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear); and displaying the simulated movement of the virtual object such that a change in distance of the virtual object corresponds to a linear function of the detected swipe distance (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 5, Bradner, Pahlavan, and Huang, as combined above, disclose the invention of claim 2. Huang further discloses detecting the touch-based swipe gesture and displaying the simulated movement of the virtual object comprises: detecting the touch-based gesture as a swipe gesture having a detected swipe distance of the touch-based swipe gesture (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear); and displaying the simulated movement of the virtual object such that a change in distance of the virtual object corresponds to a non-linear function of the detected swipe distance (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 6, Bradner, Pahlavan, and Huang, as combined above, disclose the invention of claim 2. Huang further discloses detecting the touch-based swipe gesture and displaying the simulated movement of the virtual object comprises: detecting a detected swipe velocity of the touch-based swipe gesture (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear); and displaying the simulated movement of the virtual object such that a motion characteristic of the virtual object corresponds to a linear function of the detected swipe velocity (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 7, Bradner, Pahlavan, and Huang, as combined above, disclose the invention of claim 2. Huang further discloses detecting the touch-based swipe gesture and displaying the simulated movement of the virtual object comprises: detecting a detected swipe velocity of the touch-based swipe gesture (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear); and displaying the simulated movement of the virtual object such that a motion characteristic of the virtual object corresponds to a non-linear function of the detected swipe velocity (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 8, Bradner, Pahlavan, and Huang, as combined above, disclose the simulated movement continues after completing the swipe gesture with a decaying object velocity that decays over time until the virtual object comes to rest (Bradner Par. 0065 "the hardware and/or software may calculate one or more of a velocity of the object (e.g., speed and direction), a position at which the virtual object is to be released, a trajectory of the virtual object from the position of release to the landing or other cessation point of the virtual object, a landing location of the virtual object, and/or the like."). Huang further discloses detecting the touch-based swipe gesture and displaying the simulated movement of the virtual object comprises: detecting a detected swipe velocity of the touch-based swipe gesture (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear); and displaying the simulated movement of the virtual object such that an initial object velocity of the virtual object corresponds to a function of the detected swipe velocity and in which the simulated movement continues after completing the swipe gesture with a decaying object velocity that decays over time until the virtual object comes to rest (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 9, Bradner, Pahlavan, and Huang, as combined above, disclose displaying the simulated movement of the virtual object comprises: displaying the simulated movement of the virtual object based at least in part on a physics model of the virtual object defining simulated physical characteristics of the virtual object (Bradner Par. 0065 "the hardware and/or software may calculate one or more of a velocity of the object (e.g., speed and direction), a position at which the virtual object is to be released, a trajectory of the virtual object from the position of release to the landing or other cessation point of the virtual object, a landing location of the virtual object, and/or the like.", i.e. physics modeling).

In regards to claim 10, Bradner, Pahlavan, and Huang, as combined above, disclose the invention of claim 2. Huang further discloses detecting the touch-based swipe gesture and displaying the simulated movement of the virtual object comprises: detecting one or more detected motion parameters of the touch-based swipe gesture (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear); and displaying the simulated movement of the virtual object based at least on the one or more detected motion parameters of the touch-based swipe gesture and a physics model of the virtual object defining a simulated weight of the virtual object (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 12, Bradner, Pahlavan, and Huang, as combined above, disclose placing the virtual object in the grabbed state further comprises: placing the virtual object in a selected state responsive to detecting the pointing-based interaction (Bradner Par. 0211 identifying a force-grip state such as a force-pinch-state); and while in the selected state, detecting a pinching gesture via a pressure-sensitive inter-digit interface of the pointing controller to place the virtual object in the grabbed state (Bradner Par. 0211 "if a force sensor located on a top side of the handheld controller beneath where a thumb of the user sits returns a force value greater than a particular threshold (e.g., 1% of a maximum force reading of the sensor, 25% of a maximum force reading, etc.) and if a trigger button selectable by a pointer or other finger of the user is pressed, then the described techniques may identify the force-pinch state. That is, in response to these sensor readings, the force-pinch calculator 3122 may determine that the user is attempting to hold a virtual object via a 'pinch'"; pressure-sensitive inter-digit interface being between thumb and pointer or other finger).

In regards to claim 13, Bradner, Pahlavan, and Huang, as combined above, disclose detecting release of the pinching gesture via the pressure-sensitive inter-digit interface after the simulated movement of the virtual object to a relocated position (Bradner Par. 0211 and 0213-0214 initiating release of a virtual object; pressure-sensitive inter-digit interface being between thumb and pointer or other finger); and placing the virtual object in a free state at the relocated position during which the virtual object ceases to respond to controls of the pointing controller (Bradner Par. 0211 and 0213-0214 initiating release of a virtual object).

In regards to claim 14, Bradner, Pahlavan, and Huang, as combined above, disclose while the virtual object is in the grabbed state, tracking movement of the pointing controller based on the sensor data from the inertial measurement unit (Bradner Par. 0093 and 0199 IMU and other sensors determining motion of the controller, which would be from a starting position in 3d space and resultant simulated 3d position and in a direction based on the movement and pointing direction of the hand and associated fingers wearing the controller); and displaying, by the display device, simulated movement of the virtual object through the three-dimensional virtual space corresponding to the movement of the pointing controller (Bradner Par. 0093 and 0199 IMU and other sensors determining motion of the controller, which would be from a starting position in 3d space and resultant simulated 3d position and in a direction based on the movement and pointing direction of the hand and associated fingers wearing the controller).

In regards to claim 22, Bradner, Pahlavan, and Huang, as combined above, disclose the invention of claim 20. Huang further discloses detecting the touch-based gesture and displaying the simulated movement of the virtual object comprises: displaying the simulated movement of the virtual object in a forward direction away from the simulated position of the pointing controller in response to the swipe gesture being associated with the forward direction (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear) or displaying the simulated movement of the virtual object in a backward direction towards the simulated position of the pointing controller in response to the swipe gesture being associated with the backward direction (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

In regards to claim 23, Bradner, Pahlavan, and Huang, as combined above, disclose the invention of claim 21. Huang further discloses detecting the touch-based gesture and displaying the simulated movement of the virtual object comprises: displaying the simulated movement of the virtual object in a forward direction away from the simulated position of the pointing controller in response to the swipe gesture being associated with the forward direction (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear) or displaying the simulated movement of the virtual object in a backward direction towards the simulated position of the pointing controller in response to the swipe gesture being associated with the backward direction (Par. 0068 a swipe gesture on a touch surface of a controller provides a direction, magnitude of displacement, acceleration, etc., of a displayed virtual object, wherein the function can be linear or non-linear). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to utilize the touch input swipe gestures of Huang as gestures on the touch pad of Bradner and Pahlavan. The motivation for doing so would have been to provide additional inputs to control virtual objects and "translate the attributes of the touch input data to appropriate actions at the virtual game application." (Huang Par. 0068).

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Bradner, US-20190290999, Pahlavan, US-20030142065, and Huang, US-20130324254, as combined above in regards to claim 2, in further view of Buhlmann, US-20170228921.

In regards to claim 11, Bradner, Pahlavan, and Huang do not disclose expressly detecting the pointing-based interaction comprises: generating a pointing cone having a central axis aligned with the pointing vector, an origin proximate to the simulated position of the pointing controller, and a radius that increases with distance from the origin of the pointing cone; and detecting the pointing-based interaction responsive to the pointing cone overlapping with coordinates occupied by the virtual object. Buhlmann discloses generating a pointing cone having a central axis aligned with the pointing vector, an origin proximate to the simulated position of the pointing controller, and a radius that increases with distance from the origin of the pointing cone; and detecting the pointing-based interaction responsive to the pointing cone overlapping with coordinates occupied by the virtual object (Fig. 6A-6E; Par. 0039 a pointing cone and detecting intersection of pointing vector and a virtual object). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art that the base pointing device of Bradner, Pahlavan, and Huang can be improved in the same way as Buhlmann and utilize the improvement of a pointing cone. The motivation for doing so would have been a further way to manipulate objects (Buhlmann Par. 0039).

Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Bradner, US-20190290999, Pahlavan, US-20030142065, and Huang, US-20130324254, as combined above in regards to claim 2, in further view of Smith, US-8570273.

In regards to claim 15, Bradner, Pahlavan, and Huang do not disclose expressly tracking the movement of the pointing controller is based at least in part on a stored arm model. Smith discloses the movement of the pointing controller is based at least in part on a stored arm model (Smith col. 17, line 55-col. 18, line 30 detecting motions of the arm and other body parts, i.e. arm model).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art that the base pointing device of Bradner, Pahlavan, and Huang can be improved in the same way as Smith and utilize the improvement of an arm model. The motivation for doing so would have been to provide accurate arm and hand tracking (Smith col. 17, line 55-col. 18, line 30 detecting motions of the arm and other body parts, i.e. arm model).

Allowable Subject Matter

Claims 16 and 17 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The following is a statement of reasons for the indication of allowable subject matter: See parent case 18/040405.

In regards to claim 16, the prior art of record fails to disclose, teach, or fairly suggest to one of ordinary skill in the art, in conjunction with all the other claimed limitations: a method for controlling interactions with virtual objects in an augmented reality environment using a pointing controller having a wearable form factor suitable for wearing by one or more fingers of a user and the pointing controller having a touch-sensitive interface accessible when the pointing controller is worn by the one or more fingers, and specifically including "wherein tracking the movement comprises: detecting whether a user of the pointing controller is sitting or standing; and adjusting parameters of the arm model depending on whether the user of the pointing controller is sitting or standing."

In regards to claim 17, the prior art of record fails to disclose, teach, or fairly suggest to one of ordinary skill in the art, in conjunction with all the other claimed limitations: a method for controlling interactions with virtual objects in an augmented reality environment using a pointing controller having a wearable form factor suitable for wearing by one or more fingers of a user and the pointing controller having a touch-sensitive interface accessible when the pointing controller is worn by the one or more fingers, and specifically including "wherein tracking the movement comprises: detecting a fatigue level associated with the user of the pointing controller; and adjusting parameters of the arm model depending on the detected fatigue level."

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CORY A ALMEIDA whose telephone number is (571) 270-3143. The examiner can normally be reached M-Th 9AM-7:30PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Nitin (Kumar) Patel, can be reached at (571) 272-7677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CORY A ALMEIDA/
Primary Examiner, Art Unit 2628
11/4/25
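
Two of the interaction mechanics recited in the rejection are concrete enough to sketch: the claim 11 pointing cone (apex at the controller's simulated position, central axis along the pointing vector, radius growing with distance, with an interaction detected when the cone overlaps the object's coordinates) and the claim 8 decaying "throw" velocity derived from the detected swipe velocity. The code below is an editorial illustration only, not taken from the application or the cited references; the function names, 10-degree half-angle, decay constant, and time step are all assumptions.

```python
import numpy as np

def point_in_cone(origin, direction, half_angle_deg, point):
    """True if `point` lies inside a cone with apex `origin`, central axis
    along `direction`, and a radius that grows with distance from the apex
    (equivalently, a fixed half-angle)."""
    axis = np.asarray(direction, dtype=float)
    axis /= np.linalg.norm(axis)
    to_point = np.asarray(point, dtype=float) - np.asarray(origin, dtype=float)
    dist = np.linalg.norm(to_point)
    if dist == 0.0:
        return True  # treat the apex itself as inside the cone
    cos_angle = np.dot(to_point / dist, axis)
    return cos_angle >= np.cos(np.radians(half_angle_deg))

def throw_positions(start, direction, swipe_velocity, decay=3.0, dt=1/60, max_steps=300):
    """Yield positions of a released object whose initial speed is a function
    of the detected swipe velocity and whose velocity decays over time until
    the object effectively comes to rest."""
    pos = np.asarray(start, dtype=float)
    vel = float(swipe_velocity) * np.asarray(direction, dtype=float)
    for _ in range(max_steps):
        if np.linalg.norm(vel) < 1e-3:   # object has come to rest
            return
        pos = pos + vel * dt
        vel = vel * np.exp(-decay * dt)  # decaying object velocity
        yield pos

# An object 2 m ahead and slightly off-axis falls inside a 10-degree cone,
# so a pointing-based interaction would be detected.
print(point_in_cone([0, 0, 0], [0, 0, 1], 10.0, [0.2, 0.0, 2.0]))  # True

# A forward swipe mapped to 2 m/s sends the grabbed object coasting along
# the pointing vector until the decaying velocity reaches rest.
path = list(throw_positions([0, 0, 0], [0, 0, 1], swipe_velocity=2.0))
print(len(path), np.round(path[-1], 3))
```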

Prosecution Timeline

Jul 31, 2024 — Application Filed
May 22, 2025 — Non-Final Rejection (§103)
Oct 23, 2025 — Response Filed
Nov 04, 2025 — Final Rejection (§103, current)
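
The final action's conclusion sets a shortened statutory period of THREE MONTHS from the Nov 04, 2025 mailing date, with an absolute SIX-MONTH statutory cap. A minimal sketch of that date arithmetic, using the third-party python-dateutil package and plain calendar-month addition (it does not model weekend/holiday rollover under 37 CFR 1.7):

```python
from datetime import date
from dateutil.relativedelta import relativedelta  # pip install python-dateutil

mailed = date(2025, 11, 4)                        # final action mailing date

ssp_ends = mailed + relativedelta(months=3)       # shortened statutory period
statutory_max = mailed + relativedelta(months=6)  # no reply accepted after this

print(f"Shortened statutory period ends: {ssp_ends}")            # 2026-02-04
print(f"Statutory maximum (with extensions): {statutory_max}")   # 2026-05-04
```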

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601922 — WAVEGUIDES WITH ENHANCED MODAL DENSITIES — granted Apr 14, 2026 (2y 5m to grant)
Patent 12591406 — SYSTEM AND METHOD FOR GENERATING INTERACTIVE MEDIA — granted Mar 31, 2026 (2y 5m to grant)
Patent 12586521 — DISPLAY PANEL, DISPLAY DEVICE, AND METHOD FOR DRIVING DISPLAY PANEL — granted Mar 24, 2026 (2y 5m to grant)
Patent 12586522 — Correction Method Of Display Apparatus And Correction System Of The Display Apparatus — granted Mar 24, 2026 (2y 5m to grant)
Patent 12586492 — ELECTRONIC DEVICE AND METHOD PROVIDING 3-DIMENSION IMAGE — granted Mar 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 67%
With Interview: 89% (+22.5%)
Median Time to Grant: 2y 10m
PTA Risk: Moderate

Based on 790 resolved cases by this examiner. Grant probability derived from career allow rate.
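
The headline figures are consistent with the examiner's career numbers. A minimal sketch, assuming the dashboard simply rounds the career allow rate and adds the interview lift as percentage points (the variable names and the additive adjustment are assumptions, not a documented methodology):

```python
granted, resolved = 528, 790               # from the examiner card above
allow_rate_pct = 100 * granted / resolved  # 66.8... -> displayed as 67%
interview_lift_pts = 22.5                  # lift among interviewed cases

grant_probability = round(allow_rate_pct)                              # 67
with_interview = min(round(allow_rate_pct + interview_lift_pts), 100)  # 89

print(f"Grant probability: {grant_probability}%")
print(f"With interview:    {with_interview}%")
```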
