Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
DETAILED ACTION
Claim Status
Claims 1-20 are pending following the applicant's response filed on 11/15/2023.
Claim Rejections - 35 USC § 112
2. The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Regarding claims 1-20, the term “confidence” is a relative term which renders the claims indefinite. The term “confidence” is not defined by the claims, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. An artisan performing measuring and testing would not know at what point a “confidence” within the scope of the claims had been achieved, because nothing within the disclosure establishes when a sufficient “confidence” occurs.
Note: In view of the PTO's policy of compact prosecution, the Examiner notes that, due to the indefiniteness issues described above, all consideration of the merits of the claims in view of the prior art is made as best understood.
Claims 1 and 11 (and dependent claims 2-10 and 12-20) recite determining the body portion being static or dynamic (a single alternative condition), yet the claims then recite limitations for both conditions (e.g., "in response to the body portion being static" and "in response to the body portion being dynamic"). When only the static condition occurs, it is unclear how the dynamic-branch limitations are performed; when only the dynamic condition occurs, it is unclear how the static-branch limitations are performed. For examination purposes, the Examiner has interpreted the "or" as "and"; correction or clarification is required.
Note: In view of the PTO's policy of compact prosecution, the Examiner notes that, due to the indefiniteness issues described above, all consideration of the merits of the claims in view of the prior art is made as best understood.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Claim 1, Step 1: the claim recites a machine (or process) (Yes).
Step 2A Prong One: does the claim recite an abstract idea? The claim is directed to a tracking system with a calibration method, comprising: a camera configured to obtain camera data, wherein the camera data comprises a body image of a body portion of a user; an inertial measurement unit (IMU) sensor configured to obtain sensor data, wherein the IMU sensor is adapted to be mounted on the body portion; and a processor configured to perform the recited steps. This appears to be an abstract idea in the form of a mental process (MPEP 2106.04(a)) and/or data gathering equivalent to a mathematical concept or mathematical manipulation (MPEP 2106.04(a)(2); the concept need not be expressed in mathematical symbols, because "[w]ords used in a claim operating on data to solve a problem can serve the same purpose as a formula") (Mathematical Concepts and/or Mental Processes). Step 2A Prong One: Yes.
Step 2A Prong Two: is the claim directed to the abstract idea? In other words, does the claim recite additional elements that integrate the judicial exception into a practical application? The additional elements of: determine the body portion being static or dynamic based on the camera data or the sensor data; in response to the body portion being static, determine a pose confidence of a current pose of the body portion based on the camera data, and calibrate an accumulative error of the sensor data based on the pose confidence and the camera data; and, in response to the body portion being dynamic, determine a first moving track of a body moving track of the body portion based on the camera data, determine a second moving track of the body moving track of the body portion based on the sensor data, and calibrate the accumulative error of the sensor data based on the first moving track and the second moving track, are recited at a high level of generality and merely amount to a particular field of use (see MPEP 2106.05(h)) and/or insignificant post-solution activity (see MPEP 2106.05(g)). This does not integrate the judicial exception into a practical application. Step 2A Prong Two: No.
Step 2B: does the claim recite additional elements that amount to significantly more than the judicial exception? The additional element of tracking the body portion based on the sensor data appears to be a field of use (see MPEP 2106.05(h) and MPEP 2106.05(f)) and/or merely amounts to insignificant extra-solution output of results (see MPEP 2106.05(g)), and therefore fails to integrate the abstract idea into a practical application or amount to significantly more. Step 2B: No. Claim 1 is not eligible.
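Note (illustrative only, not part of the record or of the claims as filed): the static/dynamic calibration logic addressed above can be sketched as follows; all function names, values, and the threshold below are hypothetical.

```python
# Hypothetical sketch of the claimed branch logic: when the tracked body
# portion is static, calibrate IMU drift against a camera pose estimate
# (gated by a pose confidence); when it is dynamic, calibrate by comparing
# a camera-derived moving track with an IMU-derived moving track.

def calibrate(is_static, pose_confidence, camera_pose, imu_pose,
              camera_track=None, imu_track=None, threshold=0.8):
    """Return a corrective offset for the IMU data (illustrative only)."""
    if is_static:
        if pose_confidence > threshold:
            # Trust the camera pose: drift is the camera/IMU disagreement.
            return camera_pose - imu_pose
        return 0.0  # confidence too low: defer calibration
    # Dynamic: average the per-sample disagreement between the two tracks.
    diffs = [c - i for c, i in zip(camera_track, imu_track)]
    return sum(diffs) / len(diffs)
```

For example, in the static branch a camera pose of 1.5 against an IMU pose of 1.25, with a pose confidence of 0.9, would yield a corrective offset of 0.25.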
Claim 11, Step 1: the claim recites a process (Yes).
Step 2A Prong One: does the claim recite an abstract idea? The claim is directed to a calibration method for a tracking system, comprising: obtaining camera data from a camera, wherein the camera data comprises a body image of a body portion of a user; and obtaining sensor data from an inertial measurement unit (IMU) sensor, wherein the IMU sensor is adapted to be mounted on the body portion. This appears to be an abstract idea in the form of a mental process (MPEP 2106.04(a)) and/or data gathering equivalent to a mathematical concept or mathematical manipulation (MPEP 2106.04(a)(2); the concept need not be expressed in mathematical symbols, because "[w]ords used in a claim operating on data to solve a problem can serve the same purpose as a formula") (Mathematical Concepts and/or Mental Processes). Step 2A Prong One: Yes.
Step 2A Prong Two: is the claim directed to the abstract idea? In other words, does the claim recite additional elements that integrate the judicial exception into a practical application? The additional elements of: determining the body portion being static or dynamic based on the camera data or the sensor data; in response to the body portion being static, determining a pose confidence of a current pose of the body portion based on the camera data, and calibrating an accumulative error of the sensor data based on the pose confidence and the camera data; and, in response to the body portion being dynamic, determining a first moving track of a body moving track of the body portion based on the camera data, determining a second moving track of the body moving track of the body portion based on the sensor data, and calibrating the accumulative error of the sensor data based on the first moving track and the second moving track, are recited at a high level of generality and merely amount to a particular field of use (see MPEP 2106.05(h)) and/or insignificant post-solution activity (see MPEP 2106.05(g)). This does not integrate the judicial exception into a practical application. Step 2A Prong Two: No.
Step 2B: does the claim recite additional elements that amount to significantly more than the judicial exception? The additional element of tracking the body portion based on the sensor data appears to be a field of use (see MPEP 2106.05(h) and MPEP 2106.05(f)) and/or merely amounts to insignificant extra-solution output of results (see MPEP 2106.05(g)), and therefore fails to integrate the abstract idea into a practical application or amount to significantly more. Step 2B: No. Claim 11 is not eligible.
Claim 2 recites: obtain the first moving track based on an outside-in tracking algorithm; and obtain the second moving track based on an inside-out tracking algorithm. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 2 is not eligible.
Claim 3 recites: align the first moving track with the second moving track; calculate a drift calibration matrix between an outside-in coordinate system of the first moving track and an inside-out coordinate system of the second moving track; and determine a calibrated data by applying the drift calibration matrix to the sensor data. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 3 is not eligible.
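Note (illustrative only): the track alignment and drift calibration recited in claims 3 and 13 could, under the simplifying assumption of a translation-only drift, be sketched as follows. The function names and data are hypothetical, and the claimed "drift calibration matrix" would generally involve a rotation component as well.

```python
# Hypothetical sketch: estimate a translation-only "drift" between an
# outside-in (camera) track and an inside-out (IMU) track of 2-D points,
# then apply that correction to the IMU samples. A full implementation
# would solve for a rigid transform (rotation + translation); this sketch
# keeps only the average offset between corresponding track points.

def estimate_drift(camera_track, imu_track):
    """Average (dx, dy) offset from IMU points to camera points."""
    n = len(camera_track)
    dx = sum(c[0] - i[0] for c, i in zip(camera_track, imu_track)) / n
    dy = sum(c[1] - i[1] for c, i in zip(camera_track, imu_track)) / n
    return dx, dy

def apply_drift(imu_track, drift):
    """Shift every IMU track point by the estimated drift."""
    dx, dy = drift
    return [(x + dx, y + dy) for x, y in imu_track]
```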
Claim 4 recites: generate the first moving track according to the body image based on a simultaneous localization and mapping algorithm. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 4 is not eligible.
Claim 5 recites: wherein the IMU sensor is configured to detect a linear acceleration or an angular velocity of the body portion, and the processor is configured to generate the second moving track according to the linear acceleration or the angular velocity. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 5 is not eligible.
Claim 6 recites: in response to the body portion being static and the pose confidence being greater than a confidence threshold, determine a camera coordinate value of the current pose; and calibrate the accumulative error of the sensor data based on a transformation relationship from the camera coordinate value to an IMU coordinate value of the IMU sensor. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 6 is not eligible.
Claim 7 recites: in response to the body portion being static and the pose confidence not being greater than a confidence threshold, obtain an additional frame of additional camera data from the camera to fuse with a current frame of the camera data; and determine the pose confidence of the current pose of the body portion based on the camera data and the additional data. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 7 is not eligible.
Claim 8 recites: in response to the body portion being static and the pose confidence being greater than a confidence threshold, obtain an additional sensor data from an additional IMU sensor, wherein the additional IMU sensor is adapted to be mounted on an additional body portion, determine a first distance between the body portion and the additional body portion based on the camera data, determine a second distance between the body portion and the additional body portion based on the sensor data and the additional sensor data, and calibrate the accumulative error of the sensor data based on the first distance and the second distance. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 8 is not eligible.
Claim 9 recites: in response to the body portion being static and the pose confidence not being greater than a confidence threshold, obtain an additional camera data from an additional camera to fuse with the camera data; and determine the pose confidence of the current pose of the body portion based on the camera data and the additional data. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 9 is not eligible.
Claim 10 recites: wherein the body portion is a first joint of the user and the additional body portion is a second joint of the user. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 10 is not eligible.
Claim 12 recites: obtaining the first moving track based on an outside-in tracking algorithm; and obtaining the second moving track based on an inside-out tracking algorithm. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 12 is not eligible.
Claim 13 recites: aligning the first moving track with the second moving track; calculating a drift calibration matrix between an outside-in coordinate system of the first moving track and an inside-out coordinate system of the second moving track; and determining a calibrated data by applying the drift calibration matrix to the sensor data. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 13 is not eligible.
Claim 14 recites: generating the first moving track according to the body image based on a simultaneous localization and mapping algorithm. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 14 is not eligible.
Claim 15 recites: wherein the IMU sensor is configured to detect a linear acceleration or an angular velocity of the body portion, and the calibration method further comprises generating the second moving track according to the linear acceleration or the angular velocity. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 15 is not eligible.
Claim 16 recites: in response to the body portion being static and the pose confidence being greater than a confidence threshold, determining a camera coordinate value of the current pose; and calibrating the accumulative error of the sensor data based on a transformation relationship from the camera coordinate value to an IMU coordinate value of the IMU sensor. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 16 is not eligible.
Claim 17 recites: in response to the body portion being static and the pose confidence not being greater than a confidence threshold, obtaining an additional frame of additional camera data from the camera to fuse with a current frame of the camera data; and determining the pose confidence of the current pose of the body portion based on the camera data and the additional data. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 17 is not eligible.
Claim 18 recites: in response to the body portion being static and the pose confidence being greater than a confidence threshold, obtaining an additional sensor data from an additional IMU sensor, wherein the additional IMU sensor is adapted to be mounted on an additional body portion, determining a first distance between the body portion and the additional body portion based on the camera data, determining a second distance between the body portion and the additional body portion based on the sensor data and the additional sensor data, and calibrating the accumulative error of the sensor data based on the first distance and the second distance. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 18 is not eligible.
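Note (illustrative only): the two-distance comparison recited in claims 8 and 18 amounts to comparing a camera-derived distance between two body portions with the same distance derived from two IMU sensors. A hypothetical sketch of one such correction (a simple scale factor, an assumption not drawn from the claims) is:

```python
# Hypothetical sketch of the two-distance check recited in claims 8 and 18:
# the discrepancy between the camera-measured distance and the IMU-derived
# distance is used as a multiplicative correction for the IMU data.

def scale_correction(camera_distance, imu_distance):
    """Return a factor that rescales IMU distances to agree with the
    camera-measured distance (illustrative only)."""
    if imu_distance == 0:
        raise ValueError("IMU distance must be non-zero")
    return camera_distance / imu_distance
```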
Claim 19 recites: in response to the body portion being static and the pose confidence not being greater than a confidence threshold, obtaining an additional camera data from an additional camera to fuse with the camera data; and determining the pose confidence of the current pose of the body portion based on the camera data and the additional data. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 19 is not eligible.
Claim 20 recites: wherein the body portion is a first joint of the user and the additional body portion is a second joint of the user. This appears to recite further data characterization and mathematical concepts that are part of the abstract idea; claim 20 is not eligible.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by WANG (CN 113853508 A, published 2021-12-28; CPC B60W 60/0015).
Regarding claim 1:
WANG discloses a tracking system with a calibration method, comprising:
a camera (page 10, camera), configured to obtain camera data, wherein the camera data (page 10, sensor data) comprises a body image of a body portion of a user (page 10, sensors can be used to detect objects and their respective characteristics (position, shape, direction, velocity, etc.));
an inertial measurement unit (IMU) sensor, configured to obtain sensor data, wherein the IMU sensor is adapted to be mounted on the body portion; and a processor, configured to (page 2, inertial navigation data):
determine the body portion being static or dynamic based on the camera data or the sensor data (page 14, static and dynamic object);
in response to the body portion being static:
determine a pose confidence of a current pose of the body portion based on the camera data (page 16, capture static and dynamic digital video image); and
calibrate an accumulative error of the sensor data based on the pose confidence and the camera data (page 16, capture static and dynamic digital video image);
in response to the body portion being dynamic (page 16, capture static and dynamic digital video image),
determine a first moving track of a body moving track of the body portion based on the camera data, determine a second moving track of the body moving track of the body portion based on the sensor data (page 21, the inertial sensor and the speed data of the vehicle, measuring the moving distance and orientation,), and calibrate the accumulative error of the sensor data based on the first moving track and the second moving track (page 30, estimate the speed scale factor error); and track the body portion based on the sensor data (page 29, two adjacent track points can be obtained by multiplying the speed and the sampling interval, the adjacent track point related parameter calculation is changed to calculate the interval 3; 5 or 8 of the related parameter of the track point).
Regarding claim 11:
WANG discloses a calibration method for a tracking system (page 12, tracking object), comprising:
obtaining camera data from a camera, wherein the camera data comprises a body image of a body portion of a user (page 10, sensors can be used to detect objects and their respective characteristics (position, shape, direction, velocity, etc.));
obtaining sensor data from an inertial measurement unit (IMU) sensor, wherein the IMU sensor is adapted to be mounted on the body portion (page 2, inertial navigation data):
determining the body portion being static or dynamic based on the camera data or the sensor data (page 14, static and dynamic object);
in response to the body portion being static,
determining a pose confidence of a current pose of the body portion based on the camera data (page 16, capture static and dynamic digital video image); and calibrating an accumulative error of the sensor data based on the pose confidence and the camera data (page 4, temperature error);
in response to the body portion being dynamic (page 29, two adjacent track points can be obtained by multiplying the speed and the sampling interval, the adjacent track point related parameter calculation is changed to calculate the interval 3, 5 or 8 of the related parameter of the track point),
determining a first moving track of a body moving track of the body portion based on the camera data (page 21, the inertial sensor and the speed data of the vehicle, measuring the moving distance and orientation),
determining a second moving track of the body moving track of the body portion based on the sensor data (page 21, the inertial sensor and the speed data of the vehicle, measuring the moving distance and orientation), and
calibrating the accumulative error of the sensor data based on the first moving track and the second moving track (page 30, estimate the speed scale factor error), and tracking the body portion based on the sensor data (page 29, two adjacent track points can be obtained by multiplying the speed and the sampling interval, the adjacent track point related parameter calculation is changed to calculate the interval 3; 5 or 8 of the related parameter of the track point).
Regarding claim 2, WANG further discloses obtaining the first moving track based on an outside-in tracking algorithm (page 19, including a gyroscope and an accelerometer); and obtaining the second moving track based on an inside-out tracking algorithm (page 19, speed and attitude).
Regarding claim 3, WANG further discloses aligning the first moving track with the second moving track; calculating a drift calibration matrix between an outside-in coordinate system of the first moving track and an inside-out coordinate system of the second moving track; and determining a calibrated data by applying the drift calibration matrix to the sensor data (page 18, offset of the IMU, zero offset drift will continue).
Regarding claim 4, WANG further discloses generating the first moving track according to the body image based on a simultaneous localization and mapping algorithm (page 29, two adjacent track points can be obtained by multiplying the speed and the sampling interval).
Regarding claim 5, WANG further discloses that the IMU sensor is configured to detect a linear acceleration (page 32, acceleration) or an angular velocity of the body portion, and the processor is configured to generate the second moving track according to the linear acceleration or the angular velocity (page 32, offset of acceleration).
Regarding claim 6, WANG further discloses, in response to the body portion being static and the pose confidence being greater than a confidence threshold, determining a camera coordinate value of the current pose (pages 36-37, threshold value); and calibrating the accumulative error of the sensor data based on a transformation relationship from the camera coordinate value to an IMU coordinate value of the IMU sensor (pages 36-37, offset error).
Regarding claim 7, WANG further discloses, in response to the body portion being static and the pose confidence not being greater than a confidence threshold (pages 36-37, threshold value), obtaining an additional frame of additional camera data from the camera to fuse with a current frame of the camera data (page 3, collect the original data); and determining the pose confidence of the current pose of the body portion based on the camera data and the additional data (page 3, the average value as the first zero deviation).
Regarding claim 8, WANG further discloses, in response to the body portion being static and the pose confidence being greater than a confidence threshold, obtaining an additional sensor data from an additional IMU sensor, wherein the additional IMU sensor is adapted to be mounted on an additional body portion (page 3, the average value as the first zero deviation), determining a first distance between the body portion and the additional body portion based on the camera data (page 6, distance of the vehicle, and accumulating in the initial position), determining a second distance between the body portion and the additional body portion based on the sensor data and the additional sensor data, and calibrating the accumulative error of the sensor data based on the first distance and the second distance (page 6, a constant zero offset error, proportional factor error).
Regarding claim 9, WANG further discloses, in response to the body portion being static and the pose confidence not being greater than a confidence threshold, obtaining an additional camera data from an additional camera to fuse with the camera data; and determining the pose confidence of the current pose of the body portion based on the camera data and the additional data (page 6, inertial sensor is the inertial navigation data or the data after being compensated by the inertial navigation data).
Regarding claim 10, WANG further discloses that the body portion is a first joint of the user and the additional body portion is a second joint of the user (page 19, axes overlapped).
Regarding claim 12, WANG further discloses obtaining the first moving track based on an outside-in tracking algorithm (page 19, including a gyroscope and an accelerometer); and obtaining the second moving track based on an inside-out tracking algorithm (page 19, speed and attitude).
Regarding claim 13, WANG further discloses aligning the first moving track with the second moving track; calculating a drift calibration matrix between an outside-in coordinate system of the first moving track and an inside-out coordinate system of the second moving track; and determining a calibrated data by applying the drift calibration matrix to the sensor data (page 18, offset of the IMU, zero offset drift will continue).
Regarding claim 14, WANG further discloses generating the first moving track according to the body image based on a simultaneous localization and mapping algorithm (page 29, two adjacent track points can be obtained by multiplying the speed and the sampling interval).
Regarding claim 15, WANG further discloses that the IMU sensor is configured to detect a linear acceleration (page 32, acceleration) or an angular velocity of the body portion, and the calibration method further comprises generating the second moving track according to the linear acceleration or the angular velocity (page 32, offset of acceleration).
Regarding claim 16, WANG further discloses, in response to the body portion being static and the pose confidence being greater than a confidence threshold, determining a camera coordinate value of the current pose (pages 36-37, threshold value); and calibrating the accumulative error of the sensor data based on a transformation relationship from the camera coordinate value to an IMU coordinate value of the IMU sensor (pages 36-37, offset error).
Regarding claim 17, WANG further discloses, in response to the body portion being static and the pose confidence not being greater than a confidence threshold (pages 36-37, threshold value), obtaining an additional frame of additional camera data from the camera to fuse with a current frame of the camera data (page 3, collect the original data); and determining the pose confidence of the current pose of the body portion based on the camera data and the additional data (page 3, the average value as the first zero deviation).
Regarding claim 18, WANG further discloses, in response to the body portion being static and the pose confidence being greater than a confidence threshold, obtaining an additional sensor data from an additional IMU sensor, wherein the additional IMU sensor is adapted to be mounted on an additional body portion (page 3, the average value as the first zero deviation), determining a first distance between the body portion and the additional body portion based on the camera data (page 6, distance of the vehicle, and accumulating in the initial position), determining a second distance between the body portion and the additional body portion based on the sensor data and the additional sensor data, and calibrating the accumulative error of the sensor data based on the first distance and the second distance (page 6, a constant zero offset error, proportional factor error).
Regarding claim 19, WANG further discloses, in response to the body portion being static and the pose confidence not being greater than a confidence threshold, obtaining an additional camera data from an additional camera to fuse with the camera data; and determining the pose confidence of the current pose of the body portion based on the camera data and the additional data (page 6, inertial sensor is the inertial navigation data or the data after being compensated by the inertial navigation data).
Regarding claim 20, WANG further discloses that the body portion is a first joint of the user and the additional body portion is a second joint of the user (page 19, axes overlapped).
Contact information
5. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tung Lau whose telephone number is (571)272-2274, email is Tungs.lau@uspto.gov. The examiner can normally be reached on Tuesday-Friday 7:00 AM-5:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, TURNER SHELBY, can be reached on 571-272-6334. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TUNG S LAU/Primary Examiner, Art Unit 2857
Technology Center 2800
March 11, 2026