DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 04/13/2024 have been considered by the examiner.
Claim Objections
Claim 1 is objected to because of the following informalities:
In claim 1, line 3, the term “(1) providing an” should be changed in order to avoid a typographical issue.
In claim 1, line 8, the term “(2) adjusting a” should be changed in order to avoid a typographical issue.
In claim 1, line 12, the term “(3) evaluating sharpness” should be changed in order to avoid a typographical issue.
In claim 1, line 15, the term “(4) based on” should be changed in order to avoid a typographical issue.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over MARRON et al. (US 7405834 B1), hereinafter referenced as MARRON, in view of BANERJEE et al. (US 10152059 B2), hereinafter referenced as BANERJEE.
Regarding claim 1, MARRON explicitly teaches a passive 3D imaging method based on optical interference computational imaging (Fig. 1; Col. 7, Lines [3-5]-MARRON discloses a method to retrieve high resolution images. Further, in Col. 12, Lines [5-15], MARRON discloses interferometric images.), comprising:
(1) providing an optical interference computational imaging system (Fig. 7A-C; Col. 12, Lines [8-12]-MARRON discloses that interferometric images are shown where the dark and light banding across the individual apertures indicates TTP errors. Note that the interferometer has been adjusted so that the central subaperture has negligible TTP errors and therefore has negligible banding.) with relatively discrete baseline midpoints of an aperture pair array (Fig. 8 illustrates example aperture array geometries. Col. 15, Lines [28-34]-MARRON discloses that the example illustrated the invention under the assumption of a specific geometry of subapertures (hexagonal) as illustrated in FIG. 4. The invention is not dependent on a specific geometry. Examples of geometries are shown in FIGS. 8(a)-8(f). The example used above is shown in FIG. 8(a), showing a front view of 7 densely packed hexagonal subaperture transceivers.) and performing interference recording of mutual intensity of an object (Col. 10, Lines [35-37]-MARRON discloses that intensity data is recorded at the detector array 505, which is then processed digitally to recover images.),
wherein the baseline midpoints formed by each aperture pair of the aperture pair array are not overlapped or at least there are a few midpoints non-overlapping in the optical interference computational imaging systems (Fig. 8 illustrates example aperture array geometries where baseline midpoints do not overlap. Col. 15, Lines [28-34]-MARRON discloses that the example illustrated the invention under the assumption of a specific geometry of subapertures (hexagonal) as illustrated in FIG. 4. The invention is not dependent on a specific geometry. Examples of geometries are shown in FIGS. 8(a)-8(f). The example used above is shown in FIG. 8(a), showing a front view of 7 densely packed hexagonal subaperture transceivers.);
and compensating a phase of mutual intensities of each spatial frequency domain corresponding to the baseline for each aperture pair (Fig. 6. Col. 10. Lines [44-47]-MARRON discloses an intensity image received at the detector array is shown as image 602. Examination of a portion 603 of this image 602 reveals that the speckle pattern is modulated by a spatial carrier frequency), and reconstructing an object image by Fourier transform algorithm to obtain a target image (Col. 10. Lines [35-39]-MARRON discloses intensity data is recorded at the detector array 505, which is then processed digitally to recover images. This is a straightforward process since for coherent imaging the image amplitude can be recovered by a Fourier transform (FT) of the pupil data.).
(3) evaluating sharpness of each reconstructed target image using an image optimization evaluation algorithm (Col. 20. Lines [1-7]-MARRON discloses another class of algorithm for maximizing image sharpness is the simplex method. This method is also an iterative method and involves first computing the sharpness for three sets of parameter values. The sharpness values are then compared and the parameter set that gives the lowest sharpness is then modified; the modification is determined using simple geometric construction.).
MARRON fails to explicitly teach (2) adjusting a reference working distance step by step within a range, and obtaining a reconstructed image with clear scene or locally clear scene and a corresponding reference working distance; (4) based on the reconstructed image with the clear scene or the locally clear scene and the corresponding reference working distance, calculating a relative position and size of an interested object in the image, and reconstructing a 3D image of the object scene to complete a passive 3D imaging and image reconstruction of the object scene.
However, BANERJEE explicitly teaches (2) adjusting a reference working distance step by step within a range (Col. 10. Lines [30-36]- BANERJEE discloses the distance determiner 116 may obtain depth information (and/or other information from which depth information may be determined) from the depth sensor(s) 108 (and/or image sensor(s) 104). The depth information may be obtained at one or multiple samplings over time (e.g., a first sampling, a second sampling, etc.).),
and obtaining a reconstructed image with clear scene or locally clear scene (Col. 16. Lines [5-10]-BANERJEE discloses the processor 112 (e.g., computer vision tracker 110) may create one or more three-dimensional (3D) models of the surrounding environment (e.g., street view, landscapes, etc.). This may involve 3D vector analysis techniques and algebraic computation to achieve location determination (e.g., fast location determination).) and a corresponding reference working distance (Col. 10. Lines [21-27]- BANERJEE discloses the distance determiner 116 may obtain information for determining a distance (e.g., depth information). Depth information may indicate one or more distances to (e.g., depth measurements of, depths of, depth values of, etc.) one or more physical bodies (e.g., objects, faces, terrain, structures, drones, etc.) from the depth sensor(s) 108.);
(4) based on the reconstructed image with the clear scene or the locally clear scene (Col. 16, Lines [5-10]-BANERJEE discloses that in some configurations, the processor 112 (e.g., computer vision tracker 110) may create one or more three-dimensional (3D) models of the surrounding environment (e.g., street view, landscapes, etc.). This may involve 3D vector analysis techniques and algebraic computation to achieve location determination (e.g., fast location determination).) and the corresponding reference working distance (Col. 16, Lines [5-10]-BANERJEE discloses that in some configurations, the processor 112 (e.g., computer vision tracker 110) may create one or more three-dimensional (3D) models of the surrounding environment (e.g., street view, landscapes, etc.). This may involve 3D vector analysis techniques and algebraic computation to achieve location determination (e.g., fast location determination).), calculating a relative position and size of an interested object in the image (Col. 7, Lines [56-63]-BANERJEE discloses that examples of instructions and/or data that may be stored by the memory 120 may include measured information, depth information, depth maps, distance data, image data, object data (e.g., location, size, shape, etc.), movement data, movement instructions, tracking data, image obtainer 118 instructions, distance determiner 116 instructions, computer vision tracker 110 instructions, movement controller 114 instructions, etc.), and reconstructing a 3D image of the object scene to complete a passive 3D imaging and image reconstruction of the object scene (Col. 16, Lines [5-10]-BANERJEE discloses that the processor 112 (e.g., computer vision tracker 110) may create one or more three-dimensional (3D) models of the surrounding environment (e.g., street view, landscapes, etc.). This may involve 3D vector analysis techniques and algebraic computation to achieve location determination (e.g., fast location determination).).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of MARRON of a passive 3D imaging method based on optical interference computational imaging, comprising: (1) providing an optical interference computational imaging system with relatively discrete baseline midpoints of an aperture pair array and performing interference recording of mutual intensity of an object, wherein the baseline midpoints formed by each aperture pair of the aperture pair array are not overlapped or at least there are a few midpoints non-overlapping in the optical interference computational imaging systems; and compensating a phase of mutual intensities of each spatial frequency domain corresponding to the baseline for each aperture pair, and reconstructing an object image by Fourier transform algorithm to obtain a target image; (3) evaluating sharpness of each reconstructed target image using an image optimization evaluation algorithm, with the teachings of BANERJEE of (2) adjusting a reference working distance step by step within a range, and obtaining a reconstructed image with clear scene or locally clear scene and a corresponding reference working distance; (4) based on the reconstructed image with the clear scene or the locally clear scene and the corresponding reference working distance, calculating a relative position and size of an interested object in the image, and reconstructing a 3D image of the object scene to complete a passive 3D imaging and image reconstruction of the object scene.
Wherein MARRON’s high quality imaging method is modified to further include BANERJEE’s (2) adjusting a reference working distance step by step within a range, and obtaining a reconstructed image with clear scene or locally clear scene and a corresponding reference working distance; and (4) based on the reconstructed image with the clear scene or the locally clear scene and the corresponding reference working distance, calculating a relative position and size of an interested object in the image, and reconstructing a 3D image of the object scene to complete a passive 3D imaging and image reconstruction of the object scene.
The motivation behind the modification would have been to obtain high quality 3D imaging that enhances the quality of the images and the efficiency of the system. Both MARRON and BANERJEE relate to imaging objects in the atmosphere: MARRON enables high resolution targeting and imaging in the presence of atmospheric turbulence, platform vibration, and practical space limitations, while BANERJEE improves the quality of the depth information. Please see MARRON et al. (US 7405834 B1), Col. 1, Line 64 - Col. 2, Line 5, and BANERJEE et al. (US 10152059 B2), Col. 11, Lines [38-44].
Conclusion
The prior art made of record and not relied upon, listed below, is considered pertinent to applicant’s disclosure.
YU (US 20240414449 A1) - A method of high-resolution computational imaging through checkerboard constellation-based optical pupil plane interference is provided where a checkerboard constellation-based high-density data acquisition system is adopted to realize high-density sampling of the modulus G of the complex coherence coefficient μ(u, v) of the object, and then the argument angle recovery (also known as phase recovery) algorithms and the image reconstruction algorithms are combined to obtain a clear image. For an optical telescope with an ultra-large aperture much larger than the rocket envelope, low, medium and high-frequency cameras can be placed on different rocket satellite platforms, launched into an orbit in batches and recombined in the orbit to obtain an equivalent large-optical-aperture imaging optics system, and the equivalent aperture of the system breaks through the limitation of the rocket envelope and can be expanded to 10 meters or above.
WU et al. (US 12073578 B2) - A method for a passive single-viewpoint 3D imaging system comprises capturing an image from a camera having one or more phase masks. The method further includes using a reconstruction algorithm, for estimation of a 3D or depth image.
Marron (US 8227735 B1) - A combined active and passive imaging system comprises a radiation source unit configured to output a radiation beam towards a detector and an object in a scene. The system further comprises the detector configured to record a first instance of an intensity pattern of an interference between at least a portion of the radiation beam and at least a portion of a return radiation from the object. The detector may further be configured to record ambient light reflected from the scene. The detector may output a first signal with the recorded ambient light reflected from the scene and a second signal with the recorded intensity pattern. The system further comprises a processor communicatively coupled to the detector and the radiation source unit. The processor may receive the first signal and the second signal and select the object in the scene.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ETHAN N WOLFSON whose telephone number is (571)272-1898. The examiner can normally be reached Monday - Friday 8:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chineyere Wills-Burns can be reached at (571) 272-9752. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ETHAN N WOLFSON/Examiner, Art Unit 2673
/CHINEYERE WILLS-BURNS/Supervisory Patent Examiner, Art Unit 2673