DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-15 are currently pending and examined herein.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 16 April 2025 was considered by the examiner.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-15 are rejected under 35 U.S.C. 103 as being unpatentable over Lam et al., US PG-Pub 2023/0112539, hereinafter Lam, in view of Winold et al., US PG-Pub 2022/0035443, hereinafter Winold.
Regarding Claim 1, Lam teaches a method for an electronic device (artificial reality system 100), the method comprising:
preprocessing ultrawideband (UWB) data (ultra-wideband impulse radio (UWB-IR) communication 140) and inertial measurement unit (IMU) sensor data (sensor 1240; [0121], “sensor 1240 includes an IMU, the IMU may generate calibration data based on measurement signals from sensor 1240”) for a pre-specified gesture period ([0058]-[0060], “transmitter 114 may apply and/or incorporate a time stamp into a UWB-IR signal transmitted to receivers 118.”);
generating gesture prediction data, based on the preprocessed UWB data and IMU data, by using a pretrained classifier ([0052], “processor 120(1) or 120(2) may execute and/or implement one or more software models and/or trained inferential models or classifiers”; [0074]); and
detecting a gesture, based on the gesture prediction data ([0074], “wearable 102 may detect a spike pattern indicative of the specific gesture via the machine-learning classifier and then determine that the user made the specific gesture based at least in part on the spike pattern”),
wherein the preprocessing comprises filtering the UWB data by using at least one first filter ([0155], “the output of one or more of the sensing components can be optionally processed using hardware signal processing circuitry (e.g., to perform amplification, filtering, and/or rectification).”), and
wherein the UWB data comprises at least one of ranging result data ([0028]), angle of arrival (AoA) data ([0060], “processor 120(2) may calculate and/or compute an angle of arrival for the UWB-IR signal relative to receivers 118 based at least in part on the first and second times of arrival and the time stamp”), or channel impulse response (CIR) data ([0032], “The head-mounted display may identify and/or detect the times of arrival for the impulse signals as received by the ultra-wideband antennas”) acquired through UWB ranging between the electronic device and at least one peripheral device ([0060], “processor 120(2) may calculate and/or compute an angle of arrival for the UWB-IR signal relative to receivers 118 based at least in part on the first and second times of arrival and the time stamp”; head-mounted display 104 and wearable 102).
However, Lam does not explicitly teach wherein the preprocessing comprises filtering the IMU sensor data by using a second filter.
Winold teaches wherein the preprocessing comprises filtering the IMU sensor data by using a second filter (Winold: [0068], “Position and orientation may also be corrected by a gravity equation derived from a fusion of the IMU's accelerometer and gyroscope by means of a Kalman filter sensor fusion”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the second filter taught by Winold into the method taught by Lam in order to accurately model the position and orientation of the user input (Winold: [0068]), thereby providing a more accurate artificial reality device.
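For illustration only, a minimal Python sketch of the preprocessing-and-classification flow recited in claim 1 could resemble the following; the specific filter choices, window sizes, and the scikit-learn-style classifier interface are assumptions and do not appear in Lam or Winold.

```python
import numpy as np
from scipy.signal import medfilt

def detect_gesture(uwb_window, imu_window, classifier):
    """Hypothetical pipeline: filter UWB data (first filter) and IMU data
    (second filter) gathered over a pre-specified gesture period, then
    classify the fused samples with a pretrained classifier."""
    # First filter (UWB data): median filtering, as one possible choice.
    uwb_filtered = medfilt(np.asarray(uwb_window, dtype=float), kernel_size=5)
    # Second filter (IMU data): a simple moving-average smoother.
    kernel = np.ones(5) / 5.0
    imu_filtered = np.convolve(np.asarray(imu_window, dtype=float),
                               kernel, mode="same")
    # Gesture prediction data from the pretrained classifier.
    features = np.concatenate([uwb_filtered, imu_filtered]).reshape(1, -1)
    return classifier.predict(features)[0]  # detected gesture label
```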
Regarding Claim 2, Lam, as modified by Winold, teaches the method of claim 1, wherein the pretrained classifier is a support vector machine (SVM)-based classifier (Winold: [0089], “ML may comprise . . . support vector machines”) or a convolutional neural network (CNN)-based classifier (Winold: [0089], “ML may comprise . . . convolutional neural networks”).
Regarding Claim 3, Lam, as modified by Winold, teaches the method of claim 2, wherein in case that the pretrained classifier is the SVM-based classifier, the generating of the gesture prediction data comprises:
acquiring PCA feature data, based on the preprocessed UWB data and the preprocessed IMU data, by using a principal component analysis (PCA) algorithm (Winold: [0089], “ML may comprise . . . principal components analysis”); and
generating the gesture prediction data, based on the PCA feature data, by using the SVM-based classifier (Winold: [0087]-[0089], specifically, [0088]).
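For illustration only, the PCA-then-SVM path of claim 3 could be sketched with scikit-learn as below; the feature dimensions, component count, and training data are placeholders rather than values from either reference.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 32))    # placeholder preprocessed UWB+IMU features
y_train = rng.integers(0, 3, size=100)  # placeholder gesture labels

# Acquire PCA feature data, then classify with the SVM-based classifier.
model = make_pipeline(PCA(n_components=8), SVC(kernel="rbf"))
model.fit(X_train, y_train)                      # pretraining
gesture_prediction = model.predict(X_train[:1])  # gesture prediction data
print(gesture_prediction)
```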
Regarding Claim 4, Lam, as modified by Winold, teaches the method of claim 1, wherein the UWB ranging is UWB two-way ranging (TWR) (Lam: [0053]),
wherein the electronic device is configured to serve as a controller and an initiator for the UWB TWR (Lam: [0053]-[0055], noting how the head-mounted display can have a transceiver), and
wherein the at least one peripheral device is configured to serve as a controlee and a responder for the UWB TWR (Lam: [0053]-[0055], noting how the wearable can have a transceiver).
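For illustration only, the timestamp arithmetic behind single-sided UWB two-way ranging, with the electronic device as initiator and the peripheral device as responder, can be sketched as follows; the timestamps are invented example values.

```python
# Hypothetical single-sided two-way ranging (SS-TWR) arithmetic: the
# initiator timestamps its poll transmission (t1) and the response
# arrival (t4); the responder reports its receive (t2) and reply (t3)
# timestamps. All times are in seconds.
C = 299_792_458.0  # speed of light, m/s

def twr_distance(t1, t2, t3, t4):
    round_trip = t4 - t1      # initiator-side round trip
    reply_delay = t3 - t2     # responder turnaround time
    tof = (round_trip - reply_delay) / 2.0
    return tof * C            # ranging result in meters

print(twr_distance(0.0, 1.0e-8, 2.0e-8, 2.334e-8))  # ~2.0 m
```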
Regarding Claim 5, Lam, as modified by Winold, teaches the method of claim 4, wherein the at least one peripheral device is configured to transmit, to the electronic device, a data information element comprising at least one of the UWB data acquired through the UWB TWR (Lam: [0058]-[0060]) and the IMU sensor data acquired by an IMU sensor of the corresponding peripheral device (Lam: [0121], [0132]).
Regarding Claim 6, Lam, as modified by Winold, teaches the method of claim 5, wherein the data information element is included in a ranging response message for the UWB TWR and transmitted through the ranging response message (Lam: Fig. 5, and corresponding descriptions, [0076]-[0079]).
Regarding Claim 7, Lam, as modified by Winold, teaches the method of claim 5, wherein the data information element comprises a length field specifying a length of a data field (Lam: Fig. 5, and corresponding descriptions, [0076]-[0079]), and a data field comprising at least one of the UWB data and the IMU sensor data (Lam: Fig. 5, and corresponding descriptions, [0076]-[0079], [0132]).
Regarding Claim 8, Lam, as modified by Winold, teaches the method of claim 5, wherein the data information element comprises a length field specifying the number of data table fields and data table fields of the number specified by the length field (Lam: Fig. 5, and corresponding descriptions, [0080]-[0082]; Winold: [0058], noting the signal is represented by a matrix),
wherein each data table field comprises a data type ID field specifying a data type included in a data field (Winold: [0058]-[0079], which describe the various matrices and the specific data type IDs stored in each field), a length field specifying a length of the data field (Winold: [0058]-[0079]), and a data field comprising data having the data type specified by the data type ID field (Winold: [0058]-[0079]), and
wherein the data type is one of a first type specifying the UWB data or a second type specifying the IMU sensor data (Winold: [0058]-[0079]).
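For illustration only, a type-length-value layout consistent with the data information element of claims 7-8 could be sketched as below; the byte encoding and type ID values are entirely hypothetical, as neither reference specifies a byte-level layout.

```python
import struct

# Hypothetical TLV-style data information element: a length field giving
# the number of data table fields, each table carrying a data type ID
# (1 = UWB data, 2 = IMU sensor data), a length, and the data payload.
TYPE_UWB, TYPE_IMU = 0x01, 0x02

def parse_data_ie(buf: bytes):
    tables, offset = [], 1
    count = buf[0]  # length field: number of data table fields
    for _ in range(count):
        type_id, length = struct.unpack_from("<BB", buf, offset)
        offset += 2
        data = buf[offset:offset + length]  # data field of the specified type
        offset += length
        tables.append((type_id, data))
    return tables

ie = bytes([2, TYPE_UWB, 3, 10, 20, 30, TYPE_IMU, 2, 40, 50])
print(parse_data_ie(ie))  # [(1, b'\n\x14\x1e'), (2, b'(2')]
```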
Regarding Claim 9, Lam, as modified by Winold, teaches the method of claim 1, wherein the at least one first filter comprises a valid gesture identification filter using a standard deviation of data to distinguish between a valid gesture and a random gesture (Winold: [0096]), and an outlier removal filter for removing outliers by using a median of surrounding values (Winold: [0096]), and
wherein the outlier removal filter is applied after the valid gesture identification filter (Winold: [0096]).
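For illustration only, the two first-filter stages of claim 9 could be sketched as below, applied in the claimed order (valid-gesture identification first, then outlier removal using a median of surrounding values); the thresholds and window sizes are invented.

```python
import numpy as np

def is_valid_gesture(window, std_threshold=0.5):
    # Valid-gesture identification: a deliberate gesture is assumed to
    # produce more variation than idle noise, so compare the window's
    # standard deviation against a tuned threshold (hypothetical value).
    return np.std(window) > std_threshold

def remove_outliers(window, half_width=2, k=3.0):
    # Outlier removal: substitute any sample that deviates strongly from
    # the median of its surrounding values.
    window = np.asarray(window, dtype=float)
    out = window.copy()
    for i in range(len(window)):
        lo, hi = max(0, i - half_width), min(len(window), i + half_width + 1)
        med = np.median(window[lo:hi])
        if abs(window[i] - med) > k * (np.std(window[lo:hi]) + 1e-9):
            out[i] = med
    return out
```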
Regarding Claim 10, Lam, as modified by Winold, teaches the method of claim 1, wherein the second filter is a moving average filter (Winold: [0096]), and
wherein the moving average filter causes each data point to be substituted by a median of as many data points as a number determined by a pre-specified window length (Winold: [0096]).
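For illustration only, a filter matching the claim 10 wording, in which each data point is substituted by the median over a pre-specified window length, could be sketched as:

```python
import numpy as np

def windowed_median_filter(samples, window_length=5):
    # Replace each data point with the median of the window_length
    # surrounding points (window_length is the pre-specified parameter).
    half = window_length // 2
    out = np.empty_like(samples, dtype=float)
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out[i] = np.median(samples[lo:hi])
    return out

print(windowed_median_filter(np.array([1., 1., 9., 1., 1.])))  # spike at index 2 suppressed
```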
Regarding Claim 11, Lam, as modified by Winold, teaches the method of claim 1, further comprising, in case of an initial fine-tuning period:
comparing an error between the prediction data and actual data to calculate a loss function (Lam: [0121], [0135]; Winold: [0058]-[0064]); and
backpropagating the loss function to adjust parameters of the pretrained classifier (Lam: [0121], [0135]; Winold: [0058]-[0064]).
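For illustration only, the claim 11 fine-tuning step, computing a loss between prediction data and actual data and backpropagating it to adjust the pretrained classifier's parameters, could be sketched in PyTorch; the network shape, optimizer, and data below are placeholders.

```python
import torch
import torch.nn as nn

# Placeholder pretrained classifier and fine-tuning data.
classifier = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 3))
optimizer = torch.optim.SGD(classifier.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

features = torch.randn(8, 32)        # preprocessed UWB+IMU features
actual = torch.randint(0, 3, (8,))   # actual gesture labels

optimizer.zero_grad()
prediction = classifier(features)    # gesture prediction data (logits)
loss = loss_fn(prediction, actual)   # error between prediction and actual data
loss.backward()                      # backpropagate the loss function
optimizer.step()                     # adjust the classifier's parameters
```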
Regarding Claim 12, Lam teaches an electronic device (artificial reality system 100) comprising:
a transceiver ([0053], “transmitter 114 and/or receivers 118 may each be included in and/or represent part of a transceiver that facilitates and/or supports UWB-IR communications”); and
a controller (processor 120(1) or 120(2)) connected to the transceiver (Fig. 1, and corresponding descriptions),
wherein the controller is configured to:
preprocess ultrawideband (UWB) data (ultra-wideband impulse radio (UWB-IR) communication 140) and inertial measurement unit (IMU) sensor data (sensor 1240; [0121], “sensor 1240 includes an IMU, the IMU may generate calibration data based on measurement signals from sensor 1240”) for a pre-specified gesture period ([0058]-[0060], “transmitter 114 may apply and/or incorporate a time stamp into a UWB-IR signal transmitted to receivers 118.”);
generate gesture prediction data, based on the preprocessed UWB data and IMU data, by using a pretrained classifier ([0052], “processor 120(1) or 120(2) may execute and/or implement one or more software models and/or trained inferential models or classifiers”; [0074]); and
detect a gesture, based on the gesture prediction data ([0074], “wearable 102 may detect a spike pattern indicative of the specific gesture via the machine-learning classifier and then determine that the user made the specific gesture based at least in part on the spike pattern”),
wherein the controller is further configured to, for the preprocessing, filter the UWB data by using at least one first filter ([0155], “the output of one or more of the sensing components can be optionally processed using hardware signal processing circuitry (e.g., to perform amplification, filtering, and/or rectification).”), and
wherein the UWB data comprises at least one of ranging result data ([0028]), angle of arrival (AoA) data ([0060], “processor 120(2) may calculate and/or compute an angle of arrival for the UWB-IR signal relative to receivers 118 based at least in part on the first and second times of arrival and the time stamp”), or channel impulse response (CIR) data ([0032], “The head-mounted display may identify and/or detect the times of arrival for the impulse signals as received by the ultra-wideband antennas”) acquired through UWB ranging between the electronic device and at least one peripheral device ([0060], “processor 120(2) may calculate and/or compute an angle of arrival for the UWB-IR signal relative to receivers 118 based at least in part on the first and second times of arrival and the time stamp”; head-mounted display 104 and wearable 102).
However, Lam does not explicitly teach wherein the controller is further configured to, for the preprocessing, filter the IMU sensor data by using a second filter.
Winold teaches wherein the controller is further configured to, for the preprocessing, filter the IMU sensor data by using a second filter (Winold: [0068], “Position and orientation may also be corrected by a gravity equation derived from a fusion of the IMU's accelerometer and gyroscope by means of a Kalman filter sensor fusion”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the second filter taught by Winold into the device taught by Lam in order to accurately model the position and orientation of the user input (Winold: [0068]), thereby providing a more accurate artificial reality device.
Regarding Claim 13, Lam, as modified by Winold, teaches the electronic device of claim 12, wherein the pretrained classifier is a support vector machine (SVM)-based classifier (Winold: [0089], “ML may comprise . . . support vector machines”) or a convolutional neural network (CNN)-based classifier (Winold: [0089], “ML may comprise . . . convolutional neural networks”).
Regarding Claim 14, Lam, as modified by Winold, teaches the electronic device of claim 13, wherein in case that the pretrained classifier is the SVM-based classifier, the controller is configured to:
acquire PCA feature data, based on the preprocessed UWB data and the preprocessed IMU data, by using a principal component analysis (PCA) algorithm (Winold: [0089], “ML may comprise . . . principal components analysis”); and
generate the gesture prediction data, based on the PCA feature data, by using the SVM-based classifier (Winold: [0087]-[0089], specifically, [0088]).
Regarding Claim 15, Lam, as modified by Winold, teaches the electronic device of claim 12, wherein the UWB ranging is UWB two-way ranging (TWR) (Lam: [0053]),
wherein the electronic device is configured to serve as a controller and an initiator for the UWB TWR (Lam: [0053]-[0055], noting how the head-mounted display can have a transceiver), and
wherein the at least one peripheral device is configured to serve as a controlee and a responder for the UWB TWR (Lam: [0053]-[0055], noting how the wearable can have a transceiver).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEPHEN T REED whose telephone number is (571)272-7234. The examiner can normally be reached M-F: 0800-1800.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ke Xiao can be reached at 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Stephen T. Reed/Primary Examiner, Art Unit 2627