DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-6, 8-13, 15, and 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Lang (WO 2019/051464) in view of Davies et al. (US 2014/0118339), Lin et al. (US 2020/0202491), and Yasutake (US 2015/0371447).
With respect to claims 1, 3, and 15, Lang discloses a method for performing surgical imaging based on mixed reality (MR) to be implemented using a system that includes an MR device to be worn by a user (Optical Head Mounted Displays; OHMD; page 29), the MR device including a processor, an infrared (IR) image capturing unit (page 38, lines 22-24 and 30; optical tracking system; 50), a color image capturing unit (page 30, lines 5-6 and 16-22) and a display lens (page 29, lines 1-7), the method comprising: obtaining, by the processor, a three-dimensional (3D) virtual model of a body part of a subject, the 3D virtual model including a plurality of model reference points that are associated with a plurality of marks on the body part, respectively (page 43, line 15 to page 44, line 26; pre-operative data; 16; 40); controlling, by the processor, the IR image capturing unit to continuously capture IR images of the body part of the subject, each of the IR images of the body part including a plurality of IR reference points that correspond in location with the marks on the body part, respectively (page 33, line 14 to page 34, line 6; page 38, line 21 to page 39, line 20); calculating, by the processor, a first projection matrix based on a plurality of mark coordinate sets associated respectively with the plurality of marks on the body part in a global 3D coordinate system, and a plurality of IR coordinate sets associated respectively with the plurality of IR reference points in a first two-dimensional (2D) coordinate system (page 32, lines 19-31; page 38, line 21 to page 40, line 15); controlling, by the processor, the color image capturing unit to continuously capture color images of the body part of the subject, each of the color images of the body part including a plurality of color reference points that correspond in location with the marks on the body part, respectively (page 38, line 21 to page 40, line 15; page 44, lines 11-22; intra-operative; 41; live data); obtaining, 
by the processor, a plurality of color coordinate sets associated respectively with the plurality of color reference points in a second 2D coordinate system (page 27, lines 20-25; page 32, lines 19-31); calculating, by the processor, a second projection matrix based on the plurality of mark coordinate sets and the plurality of color coordinate sets (page 27, lines 20-25; page 32, lines 19-31); controlling, by the processor, the display lens to display a plurality of calibration points, and an instruction for instructing the user to perform a calibration operation with respect to each of the calibration points, to thereby obtain a plurality of screen coordinate sets that are associated respectively with the plurality of calibration points on the display lens and a plurality of calibrated coordinate sets that are associated with the calibration points in the global 3D coordinate system (page 63, line 9 to page 110, line 16; surgeon performs calibration/registration using OHMD, optical markers, anatomical landmarks, etc.); calculating, by the processor, a third projection matrix based on the plurality of screen coordinate sets and the plurality of calibrated coordinate sets (page 27, lines 20-25; page 32, lines 19-31); generating, by the processor, a to-be-projected model by performing a projection operation on the 3D virtual model, the projection operation being performed based on a plurality of original pixel coordinate sets, the first projection matrix, the second projection matrix and the third projection matrix, the plurality of original pixel coordinate sets being associated respectively with a plurality of pixels that constitute the 3D virtual model in the global 3D coordinate system (page 33, lines 1-12); and controlling, by the processor, the display lens to display the to-be-projected model (page 26, lines 5-31; page 32, lines 26-31; page 33, lines 1-12).
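As technical context for the projection-matrix calculations recited above, the following is a minimal illustrative sketch (not drawn from Lang or the other cited references; all names and values are hypothetical) of estimating a 3x4 projection matrix from 3D-2D point correspondences by the standard Direct Linear Transform (DLT):

```python
import numpy as np

def estimate_projection_matrix(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P (up to scale) from n >= 6
    correspondences between 3D points and their 2D image projections,
    using the Direct Linear Transform (DLT)."""
    assert len(points_3d) == len(points_2d) >= 6
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        # Each correspondence contributes two linear constraints on P's entries.
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.array(rows)
    # The solution is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Apply P to a 3D point (in homogeneous form) and dehomogenize to pixels."""
    x = P @ np.append(point_3d, 1.0)
    return x[:2] / x[2]
```

A projection matrix estimated this way is determined only up to scale, which cancels when projected points are dehomogenized.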
Lang discloses normalizing to a common coordinate system (page 32, lines 21-25). Lang discloses the subject matter substantially as claimed except for performing matrix multiplication. However, Davies et al. teaches, in the same field of endeavor, transforming matrices by matrix multiplication. Therefore, it would have been obvious to one of ordinary skill in the art to have modified Lang to perform matrix multiplication, as it is a well-known operation for performing matrix transformations.
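For illustration of the matrix-multiplication step supplied by Davies et al., successive coordinate transformations expressed as 4x4 homogeneous matrices compose into a single transformation by one matrix product; the sketch below uses hypothetical rotation and translation values:

```python
import numpy as np

def homogeneous_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix R
    and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Two successive coordinate changes (e.g., patient-to-camera followed by
# camera-to-display) compose into one by a single matrix multiplication.
R_z90 = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])  # 90-degree rotation about the z axis
T1 = homogeneous_transform(R_z90, [0.0, 0.0, 0.0])
T2 = homogeneous_transform(np.eye(3), [1.0, 2.0, 3.0])
T_combined = T2 @ T1  # applies T1 first, then T2
```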
Lang discloses the subject matter substantially as claimed except for the calibration operation confirming that the calibration points on the display lens overlap with the plurality of object points. However, Lin et al. teaches, in the same field of endeavor, that it is a conventional process to calibrate a head mounted display by aligning a set of matching points ([0004]). Therefore, it would have been obvious to one of ordinary skill in the art to have provided Lang with aligning a set of matching points on the head mounted display with the real world, as such alignment is well known to one of ordinary skill in the art.
Lang discloses the subject matter substantially as claimed except for the projection matrices comprising an intrinsic parameter matrix and an extrinsic parameter matrix including a rotation matrix and a translation vector. However, Yasutake teaches, in the same field of endeavor, a matrix transformation comprising intrinsic and extrinsic parameter matrices, including a rotation matrix and a translation vector ([0121]). Therefore, it would have been obvious to one of ordinary skill in the art to have provided Lang with matrix transformation calculations using intrinsic and extrinsic parameter matrices, as this is well known in the art for conversion of 3D coordinates into pixel coordinates ([0120]).
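The intrinsic/extrinsic formulation taught by Yasutake corresponds to the standard pinhole camera model, in which a pixel coordinate is obtained as K(RX + t) followed by dehomogenization; the sketch below uses hypothetical parameter values, not values from the cited references:

```python
import numpy as np

def world_to_pixel(K, R, t, X):
    """Map a 3D world point X to pixel coordinates using an intrinsic
    parameter matrix K and extrinsic parameters (rotation R, translation t)."""
    x_cam = R @ np.asarray(X, float) + np.asarray(t, float)  # world -> camera
    u, v, w = K @ x_cam                                      # camera -> image plane
    return np.array([u / w, v / w])                          # dehomogenize

# Illustrative intrinsic matrix: focal lengths on the diagonal,
# principal point in the last column.
K = np.array([[100.0,   0.0, 50.0],
              [  0.0, 100.0, 50.0],
              [  0.0,   0.0,  1.0]])
```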
Lang discloses the subject matter substantially as claimed except for the calculation by linear equations and solving for components of the intrinsic and extrinsic parameter matrices. However, the Examiner's position is that it is well within the skill level of one of ordinary skill in the art to perform such matrix calculations and achieve the same end result of calculating matrices that transfer coordinates from one coordinate system to another (page 27, lines 20-25).
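Consistent with the position that such calculations are routine, the components of the intrinsic and extrinsic parameter matrices can be recovered from a projection matrix by standard linear algebra; the following illustrative sketch (hypothetical names and values, not drawn from the cited references) uses an RQ decomposition built from NumPy's QR routine:

```python
import numpy as np

def rq(M):
    """RQ decomposition of a 3x3 matrix, via QR of the row-reversed transpose."""
    Q0, R0 = np.linalg.qr(np.flipud(M).T)
    R = np.fliplr(np.flipud(R0.T))   # upper triangular factor
    Q = np.flipud(Q0.T)              # orthogonal factor
    # Force a positive diagonal on R so it can serve as an intrinsic matrix.
    S = np.diag(np.sign(np.diag(R)))
    return R @ S, S @ Q

def decompose_projection(P):
    """Split a 3x4 projection matrix P = K [R | t] into the intrinsic
    matrix K, rotation R, and translation t."""
    K, R = rq(P[:, :3])
    t = np.linalg.solve(K, P[:, 3])
    return K, R, t
```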
With respect to claim 6, Lang discloses performing rotation-translation operations (page 56, line 24 to page 57, line 14; page 57, line 30 to page 58, line 24; page 294, lines 14-15).
With respect to claims 8 and 10, Lang discloses a mixed reality system for performing surgical imaging, comprising an electronic device (page 22, lines 1-5; page 190, lines 26-31) and an MR device to be worn by a user (Optical Head Mounted Displays; OHMD; page 29), said electronic device including a processor, a data storage, a communication unit, an input interface, and a display screen (page 190, lines 26-31), the MR device including a processor (page 30, line 28 to page 31, line 6), an infrared (IR) image capturing unit (page 38, lines 22-24 and 30; optical tracking system; 50), a color image capturing unit (page 30, lines 5-6 and 16-22) and a display lens (page 29, lines 1-7); wherein the processors are configured to perform the steps as recited in the method (see above).
With respect to claim 13, Lang discloses performing rotation-translation operations (page 56, line 24 to page 57, line 14; page 57, line 30 to page 58, line 24; page 294, lines 14-15).
With respect to claim 17, Lang discloses that the calibration operation comprises instructing the user to perform alignment, matching, or superimposition of virtual data with live data (page 63, lines 14-19).
Response to Arguments
Applicant’s arguments filed 3/13/2026 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PETER LUONG whose telephone number is (571)270-1609. The examiner can normally be reached M-F 9-6.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anhtuan T Nguyen can be reached at (571)272-4963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PETER LUONG/Primary Examiner, Art Unit 3797