DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
2. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
3. Claims 1-6 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Elias et al. (US 2019/0101981 A1, hereinafter referred to as “Elias”).
Regarding claim 1, Elias discloses a data collection system (¶0005 discloses VR glove capable of measuring the movement of individual finger and thumb bones) comprising:
an augmented reality (AR) headset configured to be worn by a user (¶0003 and ¶0022 discloses VR headset and the VR gloves can be attached to a user); and
a wearable data collection device (VR glove) comprising:
a hand element configured to receive a hand of the user (Fig. 2 and abstract discloses the VR glove can include a plurality of inertial measurement units (IMUs) to track the movement of one or more finger and/or hand sections);
a plurality of finger elements extending from the hand element (Fig. 2 and abstract discloses the VR glove can include a plurality of inertial measurement units (IMUs) to track the movement of one or more finger and/or hand sections);
a plurality of joints, wherein each joint of the plurality of joints couples a finger element of the plurality of finger elements to the hand element (Fig. 2, ¶0029 and ¶0034 discloses the VR glove 230 can be capable of fine-level motion capture. That is, the VR glove 230 can be capable of discerning between the movement of an entire finger (e.g., the user waving his or her index finger) and the movement of a finger joint (e.g., the user bending the index finger));
a controller mount on the wearable data collection device (VR glove) configured to secure a controller (Fig. 2 and ¶0057 discloses the VR glove 330 can be a knitted or woven glove where one or more (e.g., all) electronics components can be integrated into the fabric of the glove) associated with the AR headset (Fig. 3 and ¶0053 discloses hand controller 332 can include one or more components including, but not limited to, a memory 325, a connector 327, and a transceiver 338… The connector 327 can be used to connect the VR glove 330 to one or more components (e.g., VR headset) for wired communications), wherein the AR headset is configured to track a position and orientation of the controller (Fig. 4A and ¶0069 discloses controller located on a VR headset can …receive information associated with the motions of steps 414-426 and can update the simulated environment; Figs. 2-3 and ¶0065 discloses controller (e.g., controller 323 illustrated in FIG. 3) can receive one or more signals, which can be indicative of the local frame 492, from the respective IMU (step 418 of process 400); and abstract discloses the IMUs can include one or more motion sensors, such as a gyroscope and an accelerometer, for measuring the orientation, position, and velocity of objects (e.g., finger bones) that the IMU can be attached to);
a plurality of sensors mounted on the wearable data collection device (VR glove) configured to capture sensor data (Fig. 2 and ¶0027 discloses the IMUs 202 can be configured to measure the acceleration and the rotational rate of the user's bone in order to capture the motion of the user's hand and/or fingers) during a recording session (¶0061 discloses examples of the disclosure can include determining (e.g., including recording) the range of motion for the user's hand); and
a processing circuit operatively coupled to the plurality of sensors configured to collect (¶0032 discloses one or more (e.g., each) IMUs 302 can be coupled to a microcontroller unit (MCU) 304. The IMU 302 can measure inertial motion (e.g., acceleration and rotational rate) of the corresponding finger or thumb bone and can communicate the information to the MCU 304. The MCU 304 can process the information and/or communicate the information to a controller 323) and transmit the sensor data to the AR headset (Fig. 3 and ¶0053 discloses hand controller 332 can include one or more components including, but not limited to, a memory 325, a connector 327, and a transceiver 338… The connector 327 can be used to connect the VR glove 330 to one or more components (e.g., VR headset)).
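For illustration only, and not drawn from Elias, the collect-and-transmit flow mapped above can be sketched in Python; all identifiers (ImuSample, read_imu, collect_and_transmit, the send callback) are hypothetical stand-ins:

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class ImuSample:
        bone_id: int        # finger or thumb bone the IMU is attached to
        accel: tuple        # (ax, ay, az) acceleration
        gyro: tuple         # (gx, gy, gz) rotational rate
        timestamp: float

    def read_imu(bone_id):
        # Stand-in for a real per-bone IMU driver read.
        return ImuSample(bone_id, (0.0, 0.0, 9.8), (0.0, 0.0, 0.0), time.time())

    def collect_and_transmit(num_bones, send):
        # Poll each per-bone IMU once, then hand the packet to a transport
        # (wired connector or wireless transceiver) bound for the headset.
        packet = [asdict(read_imu(b)) for b in range(num_bones)]
        send(json.dumps(packet).encode())

    collect_and_transmit(16, send=lambda payload: print(len(payload), "bytes sent"))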
Regarding claim 2, Elias discloses the data collection system of claim 1, wherein the AR headset includes one or more cameras configured to track the controller to determine the position and orientation of the wearable data collection device (VR glove) (¶0078 discloses the camera (e.g., included in a VR headset) could take images of the user's fingers and can check whether the orientation of the fingers in the images differ from the system's simulation. Additionally or alternatively, the camera can be used to track the position(s) of the user's hand(s); abstract and ¶0062 discloses taking measurements of the orientation and the position of objects that the IMU can be attached to and can include using one or more cameras (not shown); and Fig. 2 and ¶0057 discloses the VR glove 330 can be a knitted or woven glove where one or more (e.g., all) electronics components including controller 223 can be integrated into the fabric of the glove).
Regarding claim 3, Elias discloses the data collection system of claim 1, wherein the controller mount secures the controller to the wearable data collection device (VR glove) such that the controller moves in coordination with the wearable data collection device (VR glove) (Fig. 2 and ¶0057 discloses the VR glove 330 can be a knitted or woven glove where one or more (e.g., all) electronics components can be integrated into the fabric of the glove).
Regarding claim 4, Elias discloses the data collection system of claim 1, wherein the AR headset further comprises a camera mounted on the AR headset configured to capture visual data of an environment in a field of view of the user (¶0019 discloses the one or more cameras can be used to capture the user's real environment in AR technology).
Regarding claim 5, Elias discloses the data collection system of claim 1, wherein the plurality of sensors comprises: at least one pressure sensor positioned on each of the plurality of finger elements (Figs. 2-3 and ¶0042 discloses the VR glove can also include one or more force sensors 306. The force sensors 306 can be located at the fingertips of the VR glove 330); at least one position sensor at each of the plurality of joints configured to capture angle data (¶0029 and ¶0034 discloses the controller 323 can process the signals from the respective bus 322 individually to track the motion of a specific finger bone and/or can process two or more signals collectively to track the motion of the finger joint(s)); and at least one camera mounted on the wearable data collection device (VR glove) (¶0026 discloses VR glove 230 can include a plurality of electronic components; and ¶0069 discloses one or more other components (e.g., cameras, optical sensors, the reset electrodes 214 illustrated in FIG. 2) can provide the same and/or additional information related to one or more of the user's hand movement, location, and position).
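For illustration only (not drawn from Elias), the claimed sensor suite can be pictured as a single per-frame record; the GloveFrame type and its field names are hypothetical:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class GloveFrame:
        timestamp: float
        fingertip_pressure: List[float]   # at least one pressure sensor per finger element
        joint_angles: List[float]         # at least one angle per joint coupling finger to hand
        camera_jpeg: bytes = b""          # frame from the glove-mounted camera

    frame = GloveFrame(
        timestamp=0.0,
        fingertip_pressure=[0.0] * 5,     # five finger elements
        joint_angles=[0.0] * 15,          # e.g., three tracked joints per finger
    )
    print(len(frame.fingertip_pressure), "pressure readings,",
          len(frame.joint_angles), "joint angles")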
Regarding claim 6, Elias discloses the data collection system of claim 1, further comprising a connection interface configured to transmit the sensor data from the processing circuit to the AR headset, wherein the connection interface comprises at least one of a wired connection (Fig. 3 and ¶0053 discloses hand controller 332 can include one or more components including, but not limited to, a memory 325, a connector 327, and a transceiver 338… The connector 327 can be used to connect the VR glove 330 to one or more components (e.g., VR headset) for wired communications) and a wireless connection (Fig. 2 and ¶0027 discloses the transceiver 238 can be configured to communicate with an external device (e.g., the VR headset and/or the host device)).
Claim Rejections - 35 USC § 103
4. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. Claims 7, 15, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Elias in view of Jarvis et al. (US 2024/0408757 A1, hereinafter referred to as “Jarvis8757”) and further in view of Ranjbar et al. (US 12,365,093 B1, hereinafter referred to as “Ranjbar”).
Regarding claim 7, Elias discloses the data collection system of claim 1 but does not disclose wherein the data collection system is configured to: record head position and orientation data from the AR headset along with the sensor data from the wearable data collection device; and use the sensor data and the head position and orientation data to train a neural network that controls a robotic counterpart device having a joint and sensor configuration that matches the wearable data collection device.
However, in the same field of endeavor, Jarvis8757 discloses wherein the data collection system is configured to: record head position and orientation data from the AR headset (¶0048 and ¶0073 discloses connect to the MR devices 340 to get the position/pose information of the human data collector) along with the sensor data from the wearable data collection device (¶0064 discloses the human data collector 412 may wear an intelligent glove having various sensors embedded in the glove… The collected data may then be provided to the wearable computation subsystem for recording); and use the sensor data and the head position and orientation data to train a neural network (¶0014 discloses the data collection device may comprise a human-machine operation interface worn by the human data collector and used to perform the various human-operated robot tasks related to testing and/or training the machine learning model or other software; and ¶0017 discloses the computation device includes processing capabilities that allow it to execute the machine learning model and the other software so that the machine learning model and other software can be trained and/or tested) that controls a robotic counterpart device (¶0009 discloses a robot teaching and testing system that performs human-operated robot tasks according to instructions generated from generative AI models)...
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Elias in view of Jarvis8757 so that inconsistencies or noise in human hand movement can be eliminated.
Elias as modified does not disclose a robotic counterpart device having a joint and sensor configuration that matches the wearable data collection device.
However, in the same field of endeavor, Ranjbar discloses robotic counterpart device having a joint and sensor configuration that matches the wearable data collection device (col. 6, lines 11-24 discloses the sensor configuration of the sensor glove 104 may generally match a sensor configuration of a target robot hand... For example, if a robot hand of interest has tactile sensors, the sensor glove 104 may include tactile sensors).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Elias in view of Ranjbar so that the synthetic source data can contain the same scope of sensor data for the subject hand that true robot data would have for the target robot hand (col. 6, lines 13-16).
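For illustration only (not drawn from Jarvis8757 or Ranjbar), a minimal sketch of pairing headset head-pose records with glove sensor frames by timestamp to form combined training inputs; pair_streams and the sample data are hypothetical:

    import bisect

    def pair_streams(glove_frames, head_poses):
        # glove_frames: sorted list of (timestamp, sensor_features);
        # head_poses: sorted list of (timestamp, pose_features).
        # Each glove frame is paired with the head pose at or just after its
        # timestamp (clamped to the last pose), and features are concatenated.
        times = [t for t, _ in head_poses]
        samples = []
        for t, sensors in glove_frames:
            i = min(bisect.bisect_left(times, t), len(times) - 1)
            samples.append((t, sensors + head_poses[i][1]))
        return samples

    glove = [(0.00, [0.1, 0.2]), (0.02, [0.1, 0.3])]
    head = [(0.00, [1.0, 0.0, 0.0]), (0.02, [0.9, 0.1, 0.0])]
    print(pair_streams(glove, head))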
Regarding claim 15, Elias discloses a … method comprising:
receiving sensor data captured… by a plurality of sensors on a wearable data collection device (VR glove) (Fig. 2 and ¶0027 discloses the IMUs 202 can be configured to measure the acceleration and the rotational rate of the user's bone in order to capture the motion of the user's hand and/or fingers), wherein:
the wearable data collection device (VR glove) comprises a hand element configured to receive a hand of a user and a plurality of finger elements extending from the hand element (Fig. 2 and abstract discloses the VR glove can include a plurality of inertial measurement units (IMUs) to track the movement of one or more finger and/or hand sections); and
a controller associated with an augmented reality (AR) headset (Fig. 3 and ¶0053 discloses hand controller 332 can include one or more components including, but not limited to, a memory 325, a connector 327, and a transceiver 338… The connector 327 can be used to connect the VR glove 330 to one or more components (e.g., VR headset) for wired communications) is secured to the wearable data collection device (VR glove) (Fig. 2 and ¶0057 discloses the VR glove 330 can be a knitted or woven glove where one or more (e.g., all) electronics components can be integrated into the fabric of the glove);
receiving position and orientation data of the wearable data collection device (VR glove) captured by the AR headset (Fig. 4A and ¶0069 discloses controller located on a VR headset can …receive information associated with the motions of steps 414-426 and can update the simulated environment; Figs. 2-3 and ¶0065 discloses controller (e.g., controller 323 illustrated in FIG. 3) can receive one or more signals, which can be indicative of the local frame 492, from the respective IMU (step 418 of process 400); and abstract discloses the IMUs can include one or more motion sensors, such as a gyroscope and an accelerometer, for measuring the orientation, position, and velocity of objects (e.g., finger bones) that the IMU can be attached to)…
Elias does not disclose a method of training a robotic control model, …data captured during a recording session; processing the sensor data and the position and orientation data to generate training data for a neural network; and training the neural network using the training data to generate a trained neural network model, wherein the trained neural network model is configured to control a robotic counterpart device having a joint and sensor configuration that matches the wearable data collection device.
However, in the same field of endeavor, Jarvis8757 discloses a method of training a robotic control model (title discloses Human-In-Loop Robot Training And Testing System With Generative Artificial Intelligence (AI))…data captured during a recording session (¶0050 discloses the collected data may be provided to a wearable computation subsystem for recording); processing the sensor data (¶0064 discloses the human data collector 412 may wear an intelligent glove having various sensors embedded in the glove… The collected data may then be provided to the wearable computation subsystem for recording) and the position and orientation data (¶0048 and ¶0073 discloses connect to the MR devices 340 to get the position/pose information of the human data collector) to generate training data for a neural network (¶0014 discloses the data collection device may comprise a human-machine operation interface worn by the human data collector and used to perform the various human-operated robot tasks related to testing and/or training the machine learning model or other software; and ¶0017 discloses the computation device includes processing capabilities that allow it to execute the machine learning model and the other software so that the machine learning model and other software can be trained and/or tested); and training the neural network using the training data to generate a trained neural network model (¶0014 discloses the data collection device may comprise a human-machine operation interface worn by the human data collector and used to perform the various human-operated robot tasks related to testing and/or training the machine learning model or other software; and ¶0017 discloses the computation device includes processing capabilities that allow it to execute the machine learning model and the other software so that the machine learning model and other software can be trained and/or tested), wherein the trained neural network model is configured to control a robotic counterpart device (¶0009 discloses a robot teaching and testing system that performs human-operated robot tasks according to instructions generated from generative AI models)...
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Elias in view of Jarvis8757 so that inconsistencies or noise in human hand movement can be eliminated.
Elias as modified does not disclose a robotic counterpart device having a joint and sensor configuration that matches the wearable data collection device.
However, in the same field of endeavor, Ranjbar discloses robotic counterpart device having a joint and sensor configuration that matches the wearable data collection device (col. 6, lines 11-24 discloses the sensor configuration of the sensor glove 104 may generally match a sensor configuration of a target robot hand... For example, if a robot hand of interest has tactile sensors, the sensor glove 104 may include tactile sensors).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Elias in view of Ranjbar so that the synthetic source data can contain the same scope of sensor data for the subject hand that true robot data would have for the target robot hand (col. 6, lines 13-16).
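For illustration only, a minimal numpy sketch of the training step recited in claim 15: a small network is fit to map recorded glove-plus-head-pose features to joint targets for a counterpart device whose joint count matches the glove. The data, dimensions, and network are hypothetical and not drawn from any cited reference:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(256, 23))    # e.g., 15 joint angles + 5 pressures + 3 head-pose axes
    Y = rng.normal(size=(256, 15))    # one target per matching robot joint

    W1 = rng.normal(scale=0.1, size=(23, 64)); b1 = np.zeros(64)
    W2 = rng.normal(scale=0.1, size=(64, 15)); b2 = np.zeros(15)

    for _ in range(200):              # plain gradient descent on mean squared error
        H = np.tanh(X @ W1 + b1)
        P = H @ W2 + b2
        G = 2.0 * (P - Y) / len(X)
        gW2, gb2 = H.T @ G, G.sum(axis=0)
        GH = (G @ W2.T) * (1.0 - H ** 2)
        gW1, gb1 = X.T @ GH, GH.sum(axis=0)
        for param, grad in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
            param -= 0.05 * grad      # in-place update of each parameter array
    print("final training MSE:", float(((P - Y) ** 2).mean()))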
Regarding claim 17, Elias discloses the method of claim 15, wherein the sensor data comprises: pressure data from at least one pressure sensor positioned on each finger element of the plurality of finger elements (Figs. 2-3 and ¶0042 discloses the VR glove can also include one or more force sensors 306. The force sensors 306 can be located at the fingertips of the VR glove 330); angle data from at least one position sensor at each of a plurality of joints that couple the plurality of finger elements to the hand element (¶0029 and ¶0034 discloses the controller 323 can process the signals from the respective bus 322 individually to track the motion of a specific finger bone and/or can process two or more signals collectively to track the motion of the finger joint(s)); and visual data from at least one camera mounted on the wearable data collection device (VR glove) (¶0026 discloses VR glove 230 can include a plurality of electronic components; and ¶0069 discloses one or more other components (e.g., cameras, optical sensors, the reset electrodes 214 illustrated in FIG. 2) can provide the same and/or additional information related to one or more of the user's hand movement, location, and position).
6. Claims 8-10 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Elias in view of Jarvis et al. (US 2023/0072317 A1, hereinafter referred to as “Jarvis2317”).
Regarding claim 8, Elias discloses a method of collecting training data using a data collection system (¶0005 discloses VR glove capable of measuring the movement of individual finger and thumb bones) comprising an augmented reality (AR) headset and a wearable data collection device (VR glove) (¶0003 and ¶0022 discloses VR headset and the VR gloves can be attached to a user), the method comprising:
…tracking, by the AR headset, a position and orientation of a controller (Fig. 4A and ¶0069 discloses controller located on a VR headset can …receive information associated with the motions of steps 414-426 and can update the simulated environment; Figs. 2-3 and ¶0065 discloses controller (e.g., controller 323 illustrated in FIG. 3) can receive one or more signals, which can be indicative of the local frame 492, from the respective IMU (step 418 of process 400); and abstract discloses the IMUs can include one or more motion sensors, such as a gyroscope and an accelerometer, for measuring the orientation, position, and velocity of objects (e.g., finger bones) that the IMU can be attached to) secured to the wearable data collection device (VR glove) (Fig. 2 and ¶0057 discloses the VR glove 330 can be a knitted or woven glove where one or more (e.g., all) electronics components can be integrated into the fabric of the glove);
capturing sensor data via a plurality of sensors mounted on the wearable data collection device (VR glove) (Fig. 2 and ¶0027 discloses the IMUs 202 can be configured to measure the acceleration and the rotational rate of the user's bone in order to capture the motion of the user's hand and/or fingers) during the recording session (¶0061 discloses examples of the disclosure can include determining (e.g., including recording) the range of motion for the user's hand);
transmitting the sensor data from the wearable data collection device (VR glove) to the AR headset (Fig. 3 and ¶0053 discloses hand controller 332 can include one or more components including, but not limited to, a memory 325, a connector 327, and a transceiver 338… The connector 327 can be used to connect the VR glove 330 to one or more components (e.g., VR headset)) via a processing circuit (¶0032 discloses one or more (e.g., each) IMUs 302 can be coupled to a microcontroller unit (MCU) 304. The IMU 302 can measure inertial motion (e.g., acceleration and rotational rate) of the corresponding finger or thumb bone and can communicate the information to the MCU 304. The MCU 304 can process the information and/or communicate the information to a controller 323).
Elias does not disclose initiating a recording session in response to receiving a user input and terminating the recording session in response to receiving a second user input.
However, in the same field of endeavor, Jarvis2317 discloses initiating a recording session in response to receiving a user input and terminating the recording session in response to receiving a second user input (Fig. 1B and ¶0035 discloses the data collector 105 may use the voice user interface 122 to provide audio commands such as ‘begin recording’ at the start of a data collection process or at the start of an instructed action or ‘stop recording’ at the end of the data collection process or the end of an instructed action).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Elias in view of Jarvis2317 so that bounded, intentional demonstrations are used as input data for machine learning.
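For illustration only, a minimal sketch of a recording session bounded by a first and a second user input; the RecordingSession class is hypothetical, while the command strings mirror the ‘begin recording’/‘stop recording’ example quoted from Jarvis2317 above:

    class RecordingSession:
        def __init__(self):
            self.active = False
            self.frames = []

        def handle_input(self, command):
            if command == "begin recording":      # first user input: initiate
                self.active, self.frames = True, []
            elif command == "stop recording":     # second user input: terminate
                self.active = False

        def on_sensor_frame(self, frame):
            if self.active:                       # capture only inside the session
                self.frames.append(frame)

    session = RecordingSession()
    session.handle_input("begin recording")
    session.on_sensor_frame({"t": 0.00})
    session.handle_input("stop recording")
    session.on_sensor_frame({"t": 0.02})          # ignored: session terminated
    print(len(session.frames), "frame(s) recorded")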
Regarding claim 9, Elias discloses the method of claim 8, further comprising capturing visual data of an environment in a field of view of a user using a camera on the AR headset (¶0019 discloses the one or more cameras can be used to capture the user's real environment in AR technology).
Elias does not disclose …during a recording session.
However, in the same field of endeavor, Jarvis2317 discloses …during the recording session (Fig. 1B and ¶0035 discloses the data collector 105 may use the voice user interface 122 to provide audio commands such as ‘begin recording’ at the start of a data collection process or at the start of an instructed action or ‘stop recording’ at the end of the data collection process or the end of an instructed action).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Elias in view of Jarvis2317 so that bounded, intentional demonstrations are used as input data for machine learning.
Regarding claim 10, Elias discloses the method of claim 8, wherein the plurality of sensors comprises: at least one pressure sensor positioned on each of a plurality of finger elements of the wearable data collection device (VR glove) (Figs. 2-3 and ¶0042 discloses the VR glove can also include one or more force sensors 306. The force sensors 306 can be located at the fingertips of the VR glove 330); at least one position sensor at each of a plurality of joints that couple the plurality of finger elements to a hand element of the wearable data collection device (VR glove) (¶0029 and ¶0034 discloses the controller 323 can process the signals from the respective bus 322 individually to track the motion of a specific finger bone and/or can process two or more signals collectively to track the motion of the finger joint(s)); and at least one camera mounted on the wearable data collection device (VR glove) (¶0026 discloses VR glove 230 can include a plurality of electronic components; and ¶0069 discloses one or more other components (e.g., cameras, optical sensors, the reset electrodes 214 illustrated in FIG. 2) can provide the same and/or additional information related to one or more of the user's hand movement, location, and position).
Regarding claim 12, Elias discloses the method of claim 8, wherein transmitting the sensor data to the AR headset comprises transmitting the sensor data via at least one of a wired connection (Fig. 3 and ¶0053 discloses hand controller 332 can include one or more components including, but not limited to, a memory 325, a connector 327, and a transceiver 338… The connector 327 can be used to connect the VR glove 330 to one or more components (e.g., VR headset) for wired communications) and a wireless connection (Fig. 2 and ¶0027 discloses the transceiver 238 can be configured to communicate with an external device (e.g., the VR headset and/or the host device)).
7. Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Elias in view of Jarvis2317 and further in view of Harris et al. (US 2015/0253847 A1, hereinafter referred to as “Harris”).
Regarding claim 13, Elias as modified does not disclose the method of claim 8, wherein the controller is mechanically mounted to the wearable data collection device such that the controller moves in coordination with the wearable data collection device.
However, in the same field of endeavor, Harris discloses wherein the controller is mechanically mounted to the wearable data collection device such that the controller moves in coordination with the wearable data collection device (Fig. 1 and ¶0030 discloses the control module 100 is attachable to the glove 110, but the control module 100 is not enclosed by the glove. For example, the control module 100 may comprise a clasp that is configured to attach to a slit in the fabric of the glove 110).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Elias in view of Harris so that the control module 100 can individually comprise waterproof or water-resistant construction (¶0030).
8. Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Elias in view of Jarvis2317, further in view of Jarvis8757, and still further in view of Ranjbar.
Regarding claim 14, Elias as modified does not disclose the method of claim 8, further comprising training, using the sensor data and the position and orientation of the controller, a neural network that controls a robotic counterpart device having a joint and sensor configuration that matches the wearable data collection device.
However, in the same field of endeavor, Jarvis8757 discloses training, using the sensor data (¶0064 discloses the human data collector 412 may wear an intelligent glove having various sensors embedded in the glove… The collected data may then be provided to the wearable computation subsystem for recording) and the position and orientation of the controller (¶0048 and ¶0073 discloses connect to the MR devices 340 to get the position/pose information of the human data collector), a neural network (¶0014 discloses the data collection device may comprise a human-machine operation interface worn by the human data collector and used to perform the various human-operated robot tasks related to testing and/or training the machine learning model or other software; and ¶0017 discloses the computation device includes processing capabilities that allow it to execute the machine learning model and the other software so that the machine learning model and other software can be trained and/or tested) that controls a robotic counterpart device (¶0009 discloses a robot teaching and testing system that performs human-operated robot tasks according to instructions generated from generative AI models)...
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Elias in view of Jarvis8757 so that inconsistencies or noise in human hand movement can be eliminated.
Elias as modified does not disclose …a robotic counterpart device having a joint and sensor configuration that matches the wearable data collection device.
However, in the same field of endeavor, Ranjbar discloses robotic counterpart device having a joint and sensor configuration that matches the wearable data collection device (col. 6, lines 11-24 discloses the sensor configuration of the sensor glove 104 may generally match a sensor configuration of a target robot hand... For example, if a robot hand of interest has tactile sensors, the sensor glove 104 may include tactile sensors).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Elias in view of Ranjbar so that the synthetic source data can contain the same scope of sensor data for the subject hand that true robot data would have for the target robot hand (col. 6, lines 13-16).
9. Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Elias in view of Jarvis8757, further in view of Ranjbar, and still further in view of Buckley et al. (US 5,673,367 A, hereinafter referred to as “Buckley”).
Regarding claim 16, Elias as modified does not disclose the method of claim 15, wherein controlling the robotic counterpart device with the trained neural network model comprises: receiving real-time sensor data from multiple sensors on the robotic counterpart device; processing the real-time sensor data using the trained neural network model to determine control signals; and transmitting the control signals to the robotic counterpart device to control movement of the robotic counterpart device.
However, in the same field of endeavor, Buckley discloses wherein controlling the robotic counterpart device with the trained neural network model comprises: receiving real-time sensor data from multiple sensors on the robotic counterpart device (col. 6, lines 14-16 discloses the prosthetic hand 100 is made with environmental feedback sensors 102 on each finger joint 104); processing the real-time sensor data using the trained neural network model to determine control signals (col. 4, lines 12-19 discloses neural network receives input from environmental feedback sensors that indicate the state of the robot; this information is used in formulating the robot's response to environmental factors); and transmitting the control signals to the robotic counterpart device to control movement of the robotic counterpart device (abstract and col. 6, lines 50-53 discloses the neural network controlled robotic hand is able to grasp a wide variety of different objects by generalizing from the training sets).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Elias in view of Buckley because, with a back-propagation or other self-organizing neural network used for this control method, the robot will learn from the corrective process just described and adapt itself to the new conditions, gaining a sort of coordination that can be used when the same or similar conditions are encountered again (col. 5, lines 15-20).
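For illustration only (not drawn from Buckley), a minimal sketch of the claimed control loop: real-time sensor data is read from the robotic counterpart device, passed through a trained model, and the resulting control signals are transmitted back. The interface functions and the linear stand-in model are hypothetical:

    import numpy as np

    def control_loop(read_sensors, model, send_commands, steps=3):
        for _ in range(steps):
            obs = read_sensors()      # real-time sensor data from the robot
            cmd = model(obs)          # trained model maps sensors -> control signals
            send_commands(cmd)        # drive the counterpart device's joints

    W = np.random.default_rng(1).normal(size=(15, 15))
    control_loop(
        read_sensors=lambda: np.ones(15),
        model=lambda obs: obs @ W,    # placeholder for the trained network
        send_commands=lambda cmd: print("command norm:",
                                        round(float(np.linalg.norm(cmd)), 3)),
    )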
10. Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Elias in view of Jarvis8757, further in view of Ranjbar, further in view of Jarvis2317, and still further in view of Gildert et al. (US 2016/0243701 A1, hereinafter referred to as “Gildert”).
Regarding claim 18, Elias as modified does not disclose the method of claim 15, further comprising: receiving head position and orientation data captured by the AR headset during the recording session; and incorporating the head position and orientation data into the training data for the neural network.
However, in the same field of endeavor, Jarvis2317 discloses …during the recording session (Fig. 1B and ¶0035 discloses the data collector 105 may use the voice user interface 122 to provide audio commands such as ‘begin recording’ at the start of a data collection process or at the start of an instructed action or ‘stop recording’ at the end of the data collection process or the end of an instructed action).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Elias in view of Jarvis2317 so that bounded, intentional demonstrations are used as input data for machine learning.
Elias as modified does not disclose receiving head position and orientation data captured by the AR headset…; and incorporating the head position and orientation data into the training data for the neural network.
However, in the same field of endeavor, Gildert discloses receiving head position and orientation data captured by the AR headset (¶0129 discloses virtual reality headset may include the IMU 114, which may be configured to be mounted on the operator's head and sense position and orientation of the operator's head)…; and incorporating the head position and orientation data into the training data for the neural network (¶0337 discloses head orientation stream frame record 2900 includes an IMU x-axis field 2901, an IMU y-axis field 2902 and an IMU z-axis field 2903… send an operator interface head orientation stream frame record representing the head orientation information… to the analyzer 992; ¶0109 discloses the autonomous control information may be generated by an analyzer and may represent an artificial intelligence model, or a learning model, such as, for example, a deep learning model).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Elias in view of Gildert so that including head pose lets the model infer task phase transitions.
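For illustration only (not drawn from Gildert), a minimal sketch of incorporating per-frame head orientation into existing training inputs; add_head_orientation and the sample values are hypothetical:

    def add_head_orientation(training_inputs, head_orientations):
        # training_inputs[i]: per-frame feature list from the glove;
        # head_orientations[i]: (x, y, z) head-orientation reading for the
        # same frame (cf. the IMU x/y/z-axis fields quoted above).
        return [features + list(xyz)
                for features, xyz in zip(training_inputs, head_orientations)]

    inputs = [[0.1, 0.2], [0.3, 0.4]]
    heads = [(0.0, 1.5, -0.2), (0.1, 1.4, -0.1)]
    print(add_head_orientation(inputs, heads))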
11. Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Elias in view of Jarvis8757, further in view of Ranjbar, further in view of Jarvis2317, and still further in view of Gupta et al. (US 2024/0103612 A1, hereinafter referred to as “Gupta”).
Regarding claim 19, Elias as modified does not disclose the method of claim 15, further comprising: receiving visual data of an environment captured by a camera on the AR headset during the recording session; and incorporating the visual data into the training data for the neural network.
However, in the same field of endeavor, Jarvis2317 discloses …during the recording session (Fig. 1B and ¶0035 discloses the data collector 105 may use the voice user interface 122 to provide audio commands such as ‘begin recording’ at the start of a data collection process or at the start of an instructed action or ‘stop recording’ at the end of the data collection process or the end of an instructed action).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Elias in view of Jarvis2317 so that bounded, intentional demonstrations are used as input data for machine learning.
Elias as modified still does not disclose receiving visual data of an environment captured by a camera on the AR headset…; and incorporating the visual data into the training data for the neural network.
However, in the same field of endeavor, Gupta discloses receiving visual data of an environment captured by a camera on the AR headset (¶0024 discloses detecting movements of a wearable head gear configured to present virtual content to a user, and generating sensor data and visual data using an inertial sensor and a camera, respectively, wherein the visual data captures a field of view of the user relative to one or more frame of reference)…; and incorporating the visual data into the training data for the neural network (¶0024 discloses mapping the visual data to a virtual world using an image associated with the visual data to localize the user in the virtual world; providing the visual data to a first Machine Learning (ML) model).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Elias in view of Gupta for the purpose of determining intent using visual data of the egocentric view.
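For illustration only (not drawn from Gupta), a minimal sketch of folding egocentric visual data into the training inputs as coarse pooled image features; image_features and the frame dimensions are hypothetical:

    import numpy as np

    def image_features(frame, grid=4):
        # Average-pool an H x W grayscale frame into a grid x grid block of
        # coarse intensity features.
        h, w = frame.shape
        cropped = frame[: h - h % grid, : w - w % grid]
        blocks = cropped.reshape(grid, cropped.shape[0] // grid,
                                 grid, cropped.shape[1] // grid)
        return blocks.mean(axis=(1, 3)).ravel()

    frame = np.random.default_rng(2).random((480, 640))   # one egocentric frame
    sensor_features = np.zeros(23)                        # glove + head-pose features
    training_input = np.concatenate([sensor_features, image_features(frame)])
    print(training_input.shape)                           # (23 + 16,) = (39,)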
Allowable Subject Matter
12. Claims 11 and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PRIYANK J SHAH whose telephone number is (571) 270-3732. The examiner can normally be reached Monday through Friday, 10:00 AM - 6:00 PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, LunYi Lao, can be reached at 571-272-7671. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PRIYANK J SHAH/Primary Examiner, Art Unit 2621