DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The arguments proffered in the instant Response to the Office Action of 11/24/2025 have been fully considered but are unpersuasive; the outstanding rejection based on Ishizuka is therefore maintained for the amended claims.
First, in regard to the argument that the image compensation of Ishizuka results in a stitched image, this does not obviate the applicability of Ishizuka to the recited features.
Second, the feature of "wherein objects in the camera view are accurately represented in three dimensions when disparities of the objects are within a predetermined range" is shown at least by the comparison in Figure 5, in which a virtual camera is set with a relative disparity to eliminate double imaging of an object and to correctly represent the object in three relative dimensions: the distance/disparity and x/y axes.
Finally, the feature of "wherein the baseline spacings of the at least two virtual cameras are selected and dynamically adjusted over time so that at least some of the disparities of some objects captured by the virtual cameras are within the predetermined range" has already been cited in paragraphs 0100-0102, in which the baseline spacings are applied to the virtual cameras to achieve a desired disparity so that objects are in range, preventing blind spots and double imaging.
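For context, the relationship among disparity, object distance, and baseline spacing discussed above follows the standard pinhole stereo model; the symbols below (focal length f, baseline B, object distance Z) are conventional stereo-geometry notation and are not drawn from Ishizuka:

```latex
d = \frac{f \cdot B}{Z}
```

where d is the disparity in image coordinates. A larger baseline B, or a nearer object (smaller Z), yields a larger disparity, which is consistent with Ishizuka's setting of virtual cameras at reduced baselines to keep disparities small enough to avoid double imaging.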
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless—
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1-6 and 8-11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ishizuka (WO 2021/075100).
Regarding claim 1, Ishizuka discloses a camera system, in particular a surround view camera system, for a vehicle, (shown in Figure 9B, a surround view camera system for a car) comprising
at least two cameras which generate a camera view from camera images from the cameras, with the cameras being configured to capture objects at different distances from the vehicle, (shown in Figure 5A, the real camera pair 102/104 captures images close to the camera pair and far from the camera pair at 142) and
at least one processor which generates a scene as a function of a disparity of at least one object, with the disparity of the at least one object arising from a distance of the at least one object from the camera and a baseline spacing of the cameras from one another, (shown in Figures 1 and 3, the processor section 118 determines disparity based on the camera pair baseline at 128)
wherein the at least one processor creates at least two virtual cameras with a virtual baseline spacing from one another, the virtual baseline spacing of the virtual cameras and the baseline spacing of the cameras differing to allow for the scene to be captured with different disparities, (shown in Figures 3 and 5A, virtual cameras between the real cameras 102/104 are generated to provide baselines with different disparities to eliminate blind spots and double imaging)
wherein the camera view is generated on the basis of the camera images from the cameras and the virtual cameras, the at least one processor generates a displayed camera image of the camera view being generated from the camera images from the cameras and a perspective being defined by the virtual cameras on the basis of the disparity of the scene; (paragraph 0102, real images and virtual images are combined to eliminate blind spots and double images)
wherein objects in the camera view are accurately represented in three dimensions when disparities of the objects are within a predetermined range; (comparison in Figure 5, in which a virtual camera is set with a relative disparity to eliminate double imaging of an object and correctly represent the object in three relative dimensions: the distance/disparity and x/y axes)
wherein the baseline spacings of the at least two virtual cameras are selected and dynamically adjusted over time so that at least some of the disparities of some objects captured by the virtual cameras are within the predetermined range. (paragraphs 0100-0102, in which the baseline spacings are applied to the virtual cameras to achieve a desired disparity so that objects are in range to prevent blind spots and double imaging)
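By way of illustration only (the following algebra is standard stereo geometry, not a quotation of Ishizuka or of the claims), if objects at distance Z are to have disparities within a predetermined range [d_min, d_max] for cameras of focal length f, the virtual baseline B must satisfy:

```latex
d_{\min} \le \frac{f \cdot B}{Z} \le d_{\max}
\quad\Longleftrightarrow\quad
\frac{Z \, d_{\min}}{f} \le B \le \frac{Z \, d_{\max}}{f}
```

The permissible baseline thus scales linearly with object distance, which accords with dynamically adjusting the baseline spacings over time as object distances change.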
Regarding independent claim 10, claim 10 is a method claim reciting features similar to claim 1, and is therefore also anticipated by Ishizuka for reasons similar to claim 1.
Regarding claim 2, Ishizuka discloses wherein the camera view is a stereoscopic or holographic view of the scene. (paragraph 0035, shown in Figure 2, cameras 102/104 form a stereoscopic camera view)
Regarding claim 3, Ishizuka discloses wherein the virtual baseline spacing of the virtual cameras is selected such that the virtual baseline spacing is larger or smaller than the baseline spacing of the cameras. (shown in Figure 2, virtual cameras have a smaller baseline than real cameras 102/104)
Regarding claim 4, Ishizuka discloses wherein the camera image is divided into various image regions. (paragraphs 0130 and 0133, camera images from different cameras divided into overlap portions)
Regarding claim 5, Ishizuka discloses wherein the baseline spacing of the virtual cameras is dynamically changed depending on a situation for all image regions or all pixels of the camera image. (paragraphs 0101/0102, virtual cameras and the baselines thereof are set for image regions to eliminate blind spots or double images)
Regarding claim 6, Ishizuka discloses wherein a virtual camera of the at least two virtual cameras with a corresponding virtual baseline spacing is calculated for each pixel of the camera image in order to generate an optimal scene. (paragraph 0041 sets a virtual viewpoint for every single pixel width and thus pixel; paragraph 0042 sets a viewpoint position for images, and thus constituent pixels)
Regarding claim 8, Ishizuka discloses wherein a fisheye camera or a stereo camera comprising at least two camera units is provided as the at least two cameras. (shown in Figure 2, stereo camera system)
Regarding claim 9, Ishizuka discloses wherein more than two cameras or more than two virtual cameras are provided. (shown in Figure 3A, three real cameras, three or more virtual cameras)
Regarding claim 11, Ishizuka discloses wherein the camera view is a stereoscopic or holographic view of the scene, which is adaptively adjusted. (shown in Figure 2, stereoscopic image; paragraph 0101, adaptively adjusted per purpose or precision by setting a virtual camera)
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 7 is rejected under 35 U.S.C. 103 as being unpatentable over Ishizuka in view of Imada (US 2011/0149050).
Regarding claim 7, Ishizuka discloses wherein the virtual baseline spacing is calculated from ... the distance of the at least one object to be visualized or the scene to be visualized. (shown Figure 5A, baselines of virtual cameras set to eliminate blind spots or double imaging at specified distances)
Ishizuka fails to disclose wherein the virtual baseline spacing is calculated from lens properties of a camera of the at least two cameras.
However, Imada discloses wherein the virtual baseline spacing is calculated from lens properties of a camera of the at least two cameras. (paragraph 0059, parallax calculated from lens center properties together with distance)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application that the lens properties of the cameras affect the baseline setting, because it is well known in the art that parallax is affected by both distance and lens properties, as evinced by Imada. (paragraph 0059)
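As an illustrative note (standard stereo geometry, not a quotation of either reference), the focal length f of a camera lens enters the disparity relation directly, so a baseline chosen to achieve a target disparity d* at object distance Z depends on the lens:

```latex
B = \frac{d^{*} \cdot Z}{f}
```

This dependence of the baseline on a lens property (the focal length) together with distance is consistent with Imada's calculation of parallax from lens properties together with distance. (paragraph 0059)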
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Yoon (US 11,546,568) is pertinent to baselines of virtual cameras.
Ushiki (US 2013/0038606) is pertinent to virtual viewpoint positions for a desired parallax.
Koo (US 2009/0040295) is pertinent to depth control.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER KINGSBURY GLOVER whose telephone number is (303)297-4401. The examiner can normally be reached Monday-Friday 8-6 MT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jay Patel, can be reached at 571-272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTOPHER KINGSBURY GLOVER/ Examiner, Art Unit 2485
/JAYANTI K PATEL/ Supervisory Patent Examiner, Art Unit 2485 March 4, 2026