DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of the Claims
Currently, claims 1-13 and 15-20 are pending in the application. Claims 1, 9, 13, and 15-19 are amended. Claim 14 is cancelled.
Response to Arguments / Amendments
Applicant’s arguments have been fully considered but are rendered moot in view of the new ground of rejection necessitated by amendments initiated by the applicant.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 5, 8-13 and 15-19 are rejected under 35 U.S.C. 103 as being unpatentable over Jeon (US 20190082114, hereinafter Jeon) in view of CHECHENDAEV et al. (US 20230408253, hereinafter CHECHENDAEV).
Regarding Claim 1, Jeon discloses a stereoscopic camera ([0131] FIG. 7, stereoscopic pair 700 of cameras), comprising:
a first camera assembly comprising a first camera lens and a first camera sensor configured to detect light flowing through the first camera lens at a first region of the first camera sensor; a second camera assembly comprising a second camera lens and a second camera sensor configured to detect light flowing through the second camera lens ([0132] FIG. 7, two cameras, referred to as a left camera 710L and a right camera 710R, respectively; [0135], stereoscopic pair 700 have a predetermined spaced distance between the left camera 710L and the right camera 710R; [0135], cameras 710L and 710R of the stereoscopic pair 700 are spaced apart from each other by the ICS along the axis 712, which corresponds to the line, which connects the cameras 710L and 710R and is generally perpendicular to the orientations 711L and 711R); and
a controller configured to:
receive image data from the first camera sensor and the second camera sensor ([0132] the left camera 710L and the right camera 710R are capable of acquiring (or capturing) images corresponding to the left and right eyes of a person, respectively); and
encode the received image data ([0152], FIG. 11A, each of the plurality of processors 1140 and 1150 is capable of encoding (or image-processing) an electrical brightness signal obtained from a camera (or an image sensor) connected thereto into a digital image).
Jeon does not explicitly disclose a mount configured to secure the first camera lens of the first camera assembly and the second camera lens of the second camera assembly such that the first region of the first camera sensor is offset from a center of the first camera sensor.
CHECHENDAEV teaches a mount configured to secure the first camera lens of the first camera assembly and the second camera lens of the second camera assembly such that the first region of the first camera sensor is offset from a center of the first camera sensor ([0044]–[0045], FIG. 5, a 3D scanner 500 that includes a first camera 504, a second camera 506, and a third camera 508. The first camera includes an optical system 512 and an optical sensor 520. The optical system 512 has an associated optical axis 530. The second camera includes an optical system 514 and an optical sensor 522).
[Annotated figure: media_image1.png (greyscale)]
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the stereoscopic camera system of Jeon with the offset optical sensor arrangement as taught by CHECHENDAEV ([0044]–[0045]) in order to provide systems for improving performance of the three-dimensional scanner by overlapping fields of view of two or more cameras and the projector (CHECHENDAEV, [0006]).
Regarding Claim 2, Jeon in view of CHECHENDAEV discloses the stereoscopic camera of claim 1,
Jeon discloses wherein the first camera lens and the second camera lens have an interaxial distance of 64 mm ([0136], stereoscopic pair 700 have a predetermined spaced distance between the left camera 710L and the right camera 710R (inter-camera spacing (ICS)) of 6 cm to 11 cm. Corresponding to the approximate average inter-pupillary distance (IPD) of a human being, about 6.5 cm, the stereoscopic pair 700 according to various embodiments may be assumed to have an ICS of 6 cm to 7 cm).
Regarding Claim 5, Jeon in view of CHECHENDAEV discloses the stereoscopic camera of claim 1,
Jeon discloses wherein the first camera lens and the second camera lens have a fixed interaxial distance ([0132] FIG. 7, two cameras, referred to as a left camera 710L and a right camera 710R, respectively. [0135], stereoscopic pair 700 have a predetermined spaced distance between the left camera 710L and the right camera 710R).
Regarding Claim 8, Jeon in view of CHECHENDAEV discloses the stereoscopic camera of claim 1,
Jeon discloses wherein the image data includes a readout of an entirety of sensor data from the first camera sensor and the second camera sensor ([0132] the left camera 710L and the right camera 710R are capable of acquiring (or capturing) images corresponding to the left and right eyes of a person, respectively).
Regarding Claim 9, Jeon in view of CHECHENDAEV discloses the stereoscopic camera of claim 1,
CHECHENDAEV teaches the image data includes a readout of a first region of interest of the first camera sensor where the first camera lens overlaps the first camera sensor, and a second region of interest of the second camera sensor where the second camera lens overlaps the second camera sensor ([0037], it is desirable to increase the measurement area of the object (e.g., the area of the object for which usable data are obtained in each image, for the purposes of generating a 3D reconstruction of the object) and increasing the overlap of the fields of view of the cameras (e.g., camera 304 and camera 306) increases the measurement area of the object. When the fields of view fully overlap (e.g., at the best focus plane 324), each of the cameras in the 3D scanner 300 receives reflected light from the same measurement area. Also, when three cameras are included in the 3D scanner, the measured object simultaneously has corresponding imaging points on all three cameras and when four cameras are included in the 3D scanner, the measured object simultaneously has corresponding imaging points on all four cameras).
[Annotated figure: media_image2.png (greyscale)]
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the stereoscopic camera system of Jeon with the readout of a first region of interest as taught by CHECHENDAEV ([0037]) in order to provide systems for improving performance of the three-dimensional scanner by overlapping fields of view of two or more cameras and the projector (CHECHENDAEV, [0006]).
Regarding Claim 10, Jeon in view of CHECHENDAEV discloses the stereoscopic camera of claim 9,
CHECHENDAEV discloses wherein the controller is configured to encode image data within the first and second regions, and omit encoding sensor data outside the first region of interest and the second region of interest ([0037], it is desirable to increase the measurement area of the object (e.g., the area of the object for which usable data are obtained in each image, for the purposes of generating a 3D reconstruction of the object), and increasing the overlap of the fields of view of the cameras (e.g., camera 304 and camera 306) increases the measurement area of the object). The same rationale and motivation for obviousness apply as set forth above for claim 9.
Regarding Claim 11, Jeon in view of CHECHENDAEV discloses the stereoscopic camera of claim 9,
CHECHENDAEV discloses wherein the first region of interest and the second region of interest are adjusted in accordance with the adjusted position ([0037], it is desirable to increase the measurement area of the object (e.g., the area of the object for which usable data are obtained in each image, for the purposes of generating a 3D reconstruction of the object) and increasing the overlap of the fields of view of the cameras). The same rationale and motivation for obviousness apply as set forth above for claim 9.
Regarding Claim 12, Jeon in view of CHECHENDAEV discloses the stereoscopic camera of claim 1,
Jeon discloses wherein each of the first camera lens and the second camera lens is a high-powered lens ([0132] FIG. 7, two cameras, referred to as a left camera 710L and a right camera 710R, respectively).
Regarding Claim 13, Jeon in view of CHECHENDAEV discloses the stereoscopic camera of claim 9,
CHECHENDAEV teaches wherein a distance from an optical center of the first camera lens to an optical center of the second camera lens is different from a second distance measured from a geometric center of the first camera sensor to a geometric center of the second camera sensor ([0041] A field of view of the sensor 402 (e.g., an angle of coverage of sensor 402) depends on a displacement 432 along the y-direction of the center 404 of the sensor 402. The field of view also depends on a distance 420 along the z-direction between the imaging optics 406 and the sensor 402. The distance 420 is a parameter of the device design, and is known. A dimension (e.g., along the y-direction, along the x-direction) of the sensor 402 is also known).
[Annotated figure: media_image3.png (greyscale)]
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the stereoscopic camera system of Jeon with the displaced sensor center as taught by CHECHENDAEV ([0041]) in order to provide systems for improving performance of the three-dimensional scanner by overlapping fields of view of two or more cameras and the projector (CHECHENDAEV, [0006]).
Regarding Claim 15, Jeon in view of CHECHENDAEV discloses the stereoscopic camera of claim 1,
CHECHENDAEV teaches wherein an optical axis of the second camera lens is off-center of the second camera sensor ([0050] The single camera 604 includes an optical system 610 having an associated optical axis 612. The single camera 604 also includes a sensor 624. A center 626 of the sensor 624 is displaced by a distance 630 relative to the optical axis 612, along the y-direction, to the right of FIG. 6. The displacement of the center 626 of the sensor 624 results in a field of view 618 of the single camera 604 that is asymmetric with respect to the optical axis 612. The field of view 616 of the projector 602 and the field of view 618 of the single camera 604 overlap at a plane 614 in an object scene 611. The plane 614 is indicated by a line in the z-y plane. The plane 614 extends in the x-y plane. The optical axis 608 of the projector 602 is substantially parallel to the optical axis 612 of the single camera 604).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the stereoscopic camera system of Jeon with the off-center optical axis as taught by CHECHENDAEV ([0050]) in order to provide systems for improving performance of the three-dimensional scanner by overlapping fields of view of two or more cameras and the projector (CHECHENDAEV, [0006]).
Regarding Claims 16-19, system claims 16-19 recite limitations corresponding to those of stereoscopic camera claims 1-2, 8-11, and 15, and the rejections of those claims are incorporated herein for the same reasons as set forth above.
Claims 3-4, 6-7 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Jeon (US 20190082114, hereinafter Jeon) in view of Burgess (US 20180295335, hereinafter Burgess).
Regarding Claim 3, Jeon discloses the stereoscopic camera of claim 1, wherein the first camera lens and the second camera lens have an interaxial distance between 58 mm and 74 mm ([0136], predetermined spaced distance between the left camera 710L and the right camera 710R (inter-camera spacing (ICS)) of 60 mm to 110 mm. Corresponding to the approximate average inter-pupillary distance (IPD) of a human being, about 6.5 cm (5 to 6.5 cm), the stereoscopic pair 700 according to various embodiments may be assumed to have an ICS of 6 cm to 7 cm).
Jeon does not explicitly disclose the specific interaxial distance between 58 mm and 60 mm.
However, a specific interaxial distance between 58 mm and 60 mm is one of the very well-known techniques in the art. A person of ordinary skill in the art would have recognized that applying the known technique of a specific interaxial distance between 58 mm and 60 mm would have yielded the predictable result of improved image quality.
Furthermore, Burgess teaches stereoscopic images of an object 1312 located a distance 1340 from the cameras, with the cameras separated by a variable stereo base 1330 of variable interaxial distance ([0144], FIG. 13).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the stereoscopic camera system of Jeon with the variable interaxial distance as taught by Burgess ([0144]) in order to provide a system effectively synchronized to the pulse per second (PPS) even with live video, thus eliminating unintended digital artifacts when switching from one camera feed to another camera feed (Burgess, [0037]).
Regarding Claim 4, Jeon in view of Burgess discloses the stereoscopic camera of claim 1,
Jeon discloses wherein the first camera lens and the second camera lens have an interaxial distance between 48 mm and 90 mm ([0136], stereoscopic pair 700 have a predetermined spaced distance between the left camera 710L and the right camera 710R ( inter-camera spacing (ICS)) of 60 mm to 110 mm).
Regarding Claim 6, an analogous rejection to that of Claim 3 applies.
Regarding Claim 7, Jeon in view of Burgess discloses the stereoscopic camera of claim 6,
Burgess discloses wherein the controller is further configured to: move at least one of the first camera lens and the second camera lens to an adjusted position to cause a change in the variable interaxial distance; capture additional image data while the first camera lens and second camera lens are in the adjusted position; and generate a depth map based on the additional image data ([0144], FIG. 13, stereoscopic images of an object 1312 located a distance 1340 from the cameras, with the cameras separated by a variable stereo base 1330 of variable interaxial distance).
The same rationale and motivation for obviousness apply as set forth above for claim 3.
Regarding Claim 20, system claim 20 recites limitations corresponding to those of stereoscopic camera claim 3, and the rejection of claim 3 is incorporated herein for the same reasons as set forth above.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Samuel D Fereja whose telephone number is (469)295-9243. The examiner can normally be reached 8AM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, DAVID CZEKAJ can be reached at (571) 272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SAMUEL D FEREJA/Primary Examiner, Art Unit 2487