DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This Office action is responsive to the reply filed on 11/25/2025, in which claims 1-10 are pending and ready for examination.
Response to Amendment
Claims 1, 5, and 8 are currently amended.
Response to Arguments
Applicant's arguments filed on 11/25/2025 have been fully considered but they are not persuasive.
With respect to the claims rejected under 35 U.S.C. 103, the Applicant argues (see pp. 7-8 of the Remarks filed on 11/25/2025) that Nakazato does not teach (a) “obtaining a three-dimensional solid structure through filling the obtained three dimensional surface contour image with biological tissue having different optical properties and reconstructing a three-dimensional bioluminescence image or a three-dimensional molecular fluorescence image … wherein the three-dimensional bioluminescence image or the three-dimensional molecular fluorescence image indicates a three-dimensional spatial distribution of bio-optical signals in an organism” and (b) “the two-dimensional bio-optical image of the imaging target obtained in step S1 and the three-dimensional surface contour image …”, asserting that there is no teaching of a 3-D image reflecting the distribution of bio-optical signals in an inner volume of an organism, and no teaching of completing the relevant processes with the same camera.
Examiner cannot concur. The recited features in question only require a 3-D spatial distribution of bio-optical signals in an organism, not volumetric images of an organism. As taught in cited Para. [0165] of Dacosta, 3D bioluminescence images or 3D molecular fluorescence images representing a spatial distribution of bio-optical signals in an organism are generated (see also Para. [0186]).
Furthermore, as taught in Para. [0115-116], Nakazato teaches completing the processes with different units within the same imaging/camera system, as opposed to the processes not being implemented by the same camera, as asserted by the Applicant.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Nakazato (US Pub. 20160104314 A1, IDS filed on 12/03/2024) in view of Dacosta (WO 2020148721 A1).
Regarding claim 1, Nakazato discloses a three-dimensional (3D) optical imaging method, comprising the following steps (Nakazato; Fig. 4, 6, Para. [0115-117]. A 3-D optical imaging system/method is used for imaging.):
S1, obtaining a two-dimensional bio-optical image of an imaging target by a Charge Coupled Device (CCD) camera, the two-dimensional bio-optical image being a bioluminescence image or a molecular fluorescence image (Nakazato; Fig. 4, 6, Para. [0115-117]. A 2-D bio-optical image of a target, e.g. bio objects, is obtained with a CCD camera, also see Para. [0030], wherein the 2-D bio-optical image is a bioluminescence image.);
S2, obtaining a three-dimensional surface contour image of the imaging target by using the CCD camera in step S1 in combination with structured light (Nakazato; Fig. 4, 6, Para. [0115-117]. A 3-D surface image is obtained by using a CCD camera, also see Para. [0030], in combination with structured pattern, also see Para. [0035].);
S3, performing an alignment on the two-dimensional bio-optical image obtained in step S1 and the three-dimensional surface contour image obtained in step S2 (Nakazato; Para. [0116]. A correspondence/alignment between a 2-D image and a 3-D surface image is obtained.); and
S4, obtaining a three-dimensional solid structure (Nakazato; Para. [0117-118, 120]. A 3-D structure is obtained.), and reconstructing a three-dimensional bioluminescence image or a three-dimensional molecular fluorescence image through combining the three-dimensional solid structure with data obtained based on the alignment in step S3 (Nakazato; Para. [0117-118, 120]. A 3-D bioluminescence image is reconstructed by combining the 3-D shape/geometric information with the correspondence information.), wherein the two-dimensional bio-optical image of the imaging target obtained in step S1 and the three-dimensional surface contour image of the imaging target obtained in step S2 are obtained by a same CCD camera (Nakazato; Fig. 4, 6, Para. [0115-117]. The 2-D bio-optical image and the 3-D surface image are obtained by a same CCD camera, also see Para. [0030], wherein the different processes are completed by different units in a same imaging/camera system.).
But Nakazato does not specifically disclose obtaining a three-dimensional solid structure through filling the obtained three-dimensional surface contour image with biological tissues having different optical properties, wherein the three-dimensional bioluminescence image or the three-dimensional molecular fluorescence image indicates a three-dimensional spatial distribution of bio-optical signals in an organism.
However, Dacosta teaches obtaining a three-dimensional solid structure through filling the obtained three-dimensional surface contour image with biological tissues having different optical properties (Dacosta; Fig. 38, Para. [0165]. A 3-D structure image with bio tissues having optical properties.), wherein the three-dimensional bioluminescence image or the three-dimensional molecular fluorescence image indicates a three-dimensional spatial distribution of bio-optical signals in an organism (Dacosta; Fig. 38, Para. [0165]. 3D bioluminescence images or 3D molecular fluorescence images of spatial distribution of bio-optical signals for an organism, also see Para. [0186] are generated.).
Therefore, it would have been obvious to a person having ordinary skill in the pertinent art before the effective filing date of the claimed invention to modify the 3-D imaging system of Nakazato to adopt an image processing approach, by incorporating Dacosta’s teaching wherein a 3-D structure is obtained by using the optical properties of biological tissues with 3-D surface images, with the motivation of obtaining 3-D fluorescence images for lesion identification (Dacosta; Abstract.).
Regarding claim 8, Nakazato discloses a three-dimensional (3D) optical imaging system, wherein the 3D optical imaging system uses a 3D optical imaging method comprising the following steps (Nakazato; Fig. 4, 6, Para. [0115-117]. A 3-D optical imaging system/method is used for imaging.):
S1, obtaining a two-dimensional bio-optical image of an imaging target by a Charge Coupled Device (CCD) camera, the two-dimensional bio-optical image being a bioluminescence image or a molecular fluorescence image;
S2, obtaining a three-dimensional surface contour image of the imaging target by using the CCD camera in step S1 in combination with structured light;
S3, performing an alignment on the two-dimensional bio-optical image obtained in step S1 and the three-dimensional surface contour image obtained in step S2; and
S4, obtaining a three-dimensional solid structure (Nakazato; Para. [0117-118, 120]. A 3-D structure is obtained.), and reconstructing a three-dimensional bioluminescence image or a three-dimensional molecular fluorescence image through combining the three-dimensional solid structure with data obtained based on the alignment in step S3 (Nakazato; Para. [0117-118, 120]. A 3-D bioluminescence image is reconstructed by combining the 3-D shape/geometric information with the correspondence information.), wherein the two-dimensional bio-optical image of the imaging target obtained in step S1 and the three-dimensional surface contour image of the imaging target obtained in step S2 are obtained by a same CCD camera (Nakazato; Fig. 4, 6, Para. [0115-117]. The 2-D bio-optical image and the 3-D surface image are obtained by a same CCD camera, also see Para. [0030], wherein the different processes are completed by different units in a same imaging/camera system.),
wherein the 3D optical imaging system comprises the imaging support and the imaging system (Nakazato; Fig. 3, 5, Para. [0144]. An imaging support and an imaging system, between which an angle is adjustable, are provided.), the imaging system comprising the excitation light source and the CCD camera, the imaging target being fixed at the imaging support (Nakazato; Fig. 3, 5, Para. [0144]. An imaging system includes a light source and a CCD camera, with an object fixed at an imaging support.), the excitation light source and the CCD camera being disposed at a same side of the imaging target or the excitation light source and the CCD camera being disposed at two sides of the imaging target respectively, and the projector being disposed at a side of the imaging target (Nakazato; Fig. 3, 5, Para. [0144]. A light source and a CCD camera are disposed at a same side of an object, or at two sides of an object respectively.).
But Nakazato does not specifically disclose obtaining a three-dimensional solid structure through filling the obtained three-dimensional surface contour image with biological tissues having different optical properties, wherein the three-dimensional bioluminescence image or the three-dimensional molecular fluorescence image indicates a three-dimensional spatial distribution of bio-optical signals in an organism.
However, Dacosta teaches obtaining a three-dimensional solid structure through filling the obtained three-dimensional surface contour image with biological tissues having different optical properties (Dacosta; Fig. 38, Para. [0165]. A 3-D structure image with bio tissues having optical properties.),
wherein the three-dimensional bioluminescence image or the three-dimensional molecular fluorescence image indicates a three-dimensional spatial distribution of bio-optical signals in an organism (Dacosta; Fig. 38, Para. [0165]. 3D bioluminescence images or 3D molecular fluorescence images of spatial distribution of bio-optical signals for an organism, also see Para. [0186] are generated.).
Therefore, it would have been obvious to a person having ordinary skill in the pertinent art before the effective filing date of the claimed invention to modify the 3-D imaging system of Nakazato to adopt an image processing approach, by incorporating Dacosta’s teaching wherein a 3-D structure is obtained by using the optical properties of biological tissues with 3-D surface images, with the motivation of obtaining 3-D fluorescence images for lesion identification (Dacosta; Abstract.).
Claims 2-7 and 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Nakazato (US Pub. 20160104314 A1, IDS filed on 12/03/2024) in view of Dacosta (WO 2020148721 A1) as applied to claim 1 above, and further in view of Zuo (US Pub. 20220221270 A1, IDS filed on 12/03/2024).
Regarding claim 2, modified Nakazato teaches step S2 (Nakazato; See remarks regarding claim 1.).
But it does not specifically disclose that step S2 comprises the following steps: S201, turning on a projector, projecting modulated stripe-patterned structured light onto a surface of the imaging target, and capturing stripes on the surface of the imaging target using the CCD camera; S202, processing the stripes to obtain a phase distribution map of the surface of the imaging target; S203, obtaining a phase-coordinate relationship subsequent to a geometric calibration, and converting the phase distribution into three-dimensional coordinates using the phase-coordinate relationship obtained subsequent to the geometric calibration; S204, adjusting an angle between an imaging support and an imaging system, and repeating steps S201 to S203; and S205, obtaining multi-angle three-dimensional coordinates of the imaging target by adjusting the angle, such that the three-dimensional surface contour image of the imaging target is obtained.
However, Zuo teaches step S2 comprises the following steps:
S201, turning on a projector, projecting modulated stripe-patterned structured light onto a surface of the imaging target, and capturing stripes on the surface of the imaging target using the CCD camera (Zuo; Para. [0004-0010]. A projector is turned on to project stripe-patterned structured light onto the surface of an object, and stripes on the surface of the object are captured using a CCD camera.);
S202, processing the stripes to obtain a phase distribution map of the surface of the imaging target (Zuo; Para. [0004-0010]. Stripes are processed to determine phase distribution map of a surface of an object.);
S203, obtaining a phase-coordinate relationship subsequent to a geometric calibration, and converting the phase distribution into three-dimensional coordinates using the phase-coordinate relationship obtained subsequent to the geometric calibration (Zuo; Para. [0004-0010]. A phase coordinate is obtained subsequent to a geometric calibration and phase distribution is converted into 3-D coordinates using phase coordinate subsequent to geometric calibration.);
S204, adjusting an angle between an imaging support and an imaging system, and repeating steps S201 to S203 (Zuo; Para. [0004-0010]. An angle between an imaging support and an imaging system is adjusted, and steps S201 to S203 are repeated.); and
S205, obtaining multi-angle three-dimensional coordinates of the imaging target by adjusting the angle, such that the three-dimensional surface contour image of the imaging target is obtained (Zuo; Para. [0004-0010]. Multi angle 3-D coordinate is obtained for an object by adjusting an angle, wherein 3-D surface of an object is obtained.).
Therefore, it would have been obvious to a person having ordinary skill in the pertinent art before the effective filing date of the claimed invention to further modify the 3-D imaging system of modified Nakazato to adopt an image processing approach, by incorporating Zuo’s teaching wherein phase images at different projection angles are obtained, with the motivation of performing calibration for a 3-D imaging system (Zuo; Abstract.).
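As context for the structured-light steps recited in claim 2 (in particular S201-S203), a minimal four-step phase-shifting sketch in Python/NumPy is shown below; the fringe model, the 90-degree phase shifts, the phase ramp, and all numeric values are illustrative assumptions for this discussion only and are not taken from Zuo or from the claims:

```python
import numpy as np

def wrapped_phase(I0, I1, I2, I3):
    """Four-step phase shifting: recover the wrapped phase from fringe
    images I_k = A + B*cos(phi + k*pi/2), k = 0..3 (cf. step S202)."""
    return np.arctan2(I3 - I1, I0 - I2)

# Simulate four 90-degree-shifted fringe patterns on a surface whose
# true phase is a known ramp (purely illustrative values).
x = np.linspace(0.0, 4.0 * np.pi, 256)   # "true" phase across the surface
A, B = 0.5, 0.4                          # background and fringe modulation
frames = [A + B * np.cos(x + k * np.pi / 2) for k in range(4)]

phi = wrapped_phase(*frames)             # wrapped into (-pi, pi]
phi_unwrapped = np.unwrap(phi)           # spatial phase unwrapping
# With a calibrated phase-to-coordinate relationship (cf. step S203), the
# unwrapped phase would then be converted to 3-D coordinates; repeating at
# several support angles (S204) would yield the multi-angle contour (S205).
```

The sketch recovers the simulated phase ramp up to a constant offset; in an actual system the unwrapped phase would be fed into the calibrated phase-coordinate relationship rather than compared to a known ramp.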
Regarding claim 3, modified Nakazato teaches that the phase distribution map of the surface of the imaging target is obtained through: obtaining stripe images of different phases contained in images captured each time (Zuo; Para. [0004-0010]. Stripe images of different phases are obtained from the images captured each time.),
obtaining a wrapped phase distribution of the stripes through performing an algebraic operation and a stitching operation on the stripe images (Zuo; Para. [0004-0010]. Wrapped phase maps are obtained by performing an algebraic operation and a stitching operation on the stripe images.), and
performing a spatial phase expansion on wrapped phases based on spatial sequence information of the stripes to obtain the phase distribution map of the surface of the imaging target (Zuo; Para. [0022]. A spatial phase expansion on wrapped phases is used in accordance with spatial information of the stripes to obtain phase maps of an object surface.).
Regarding claim 4, modified Nakazato teaches the geometric calibration of the phase-coordinate relationship is performed by: given phases and CCD camera image coordinates that are known, converting the CCD camera image coordinates into three-dimensional coordinates in a CCD camera coordinate system using a camera parameter (Zuo; Para. [0004-0010]. Given that the phases and CCD camera image coordinates are known, the CCD camera image coordinates are converted into 3-D coordinates in accordance with camera parameters, also see Para. [0019-20, 23].),
converting the three-dimensional coordinates in the CCD camera coordinate system into three-dimensional coordinates in a projector coordinate system based on a relative parameter between the projector and the CCD camera (Zuo; Para. [0004-10]. 3-D coordinates in CCD camera are converted into 3-D coordinates associated with a projector in accordance with parameters associated with a projector and a CCD camera, also see Para. [0019-20, 23].),
converting the three-dimensional coordinates in the projector coordinate system into projector image coordinates based on a projector parameter (Zuo; Para. [0004-10]. The 3-D coordinates associated with a projector are converted into projector image coordinates in accordance with a projector parameter, also see Para. [0019-20, 23].), and
obtaining the phase-coordinate relationship based on a one-to-one correspondence between the phases and the projector image coordinates (Zuo; Para. [0004-10]. A phase-coordinate relationship is obtained in accordance with one-to-one relationship between phases and projector coordinates, also see Para. [0019-20, 23].).
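The coordinate-conversion chain recited in claim 4 (camera image coordinates to camera 3-D coordinates, then to projector 3-D coordinates, then to projector image coordinates) can be sketched with an idealized pinhole model; the intrinsic matrices, relative pose, depth, and pixel values below are invented purely for illustration and are not taken from Zuo:

```python
import numpy as np

# Illustrative pinhole intrinsics for camera and projector (fx, fy, cx, cy).
K_cam = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
K_proj = np.array([[700.0, 0.0, 400.0],
                   [0.0, 700.0, 300.0],
                   [0.0, 0.0, 1.0]])

# Illustrative relative pose of the projector w.r.t. the camera: a pure
# baseline translation along x (no rotation), as in a rectified setup.
R = np.eye(3)
t = np.array([0.1, 0.0, 0.0])  # metres

def cam_pixel_to_cam_point(u, v, depth, K):
    """Back-project a camera pixel at a known depth into camera 3-D coordinates."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return ray * depth / ray[2]

def cam_point_to_proj_pixel(X_cam, R, t, K_proj):
    """Transform a camera-frame point into the projector frame and project it."""
    X_proj = R @ X_cam + t
    uvw = K_proj @ X_proj
    return uvw[:2] / uvw[2]

# One pixel at a known depth, pushed through the whole chain.
X_cam = cam_pixel_to_cam_point(350.0, 260.0, depth=1.5, K=K_cam)
proj_uv = cam_point_to_proj_pixel(X_cam, R, t, K_proj)
# The projector column in proj_uv corresponds one-to-one with the phase of
# the projected fringe at that column, giving the phase-coordinate relationship.
```

The final one-to-one pairing of phases and projector image coordinates is what the claim calls the phase-coordinate relationship; here the whole chain is collapsed into two small helper functions under the stated assumptions.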
Regarding claim 5, modified Nakazato teaches the molecular fluorescence image is obtained through: turning on an excitation light source (Nakazato; Para. [0149]. A fluorescence image is obtained by using/turning on an excitation light source.),
emitting laser light by the excitation light source to irradiate the imaging target (Nakazato; Para. [0149]. Laser light is emitted by the excitation light source to irradiate an object.),
exciting fluorescent molecules carried by an imaging object (Nakazato; Para. [0149]. Fluorescent particles/molecules of an object are excited.), and
generating emission fluorescence (Nakazato; Para. [0149]. Emission fluorescence is generated.); and
obtaining the two-dimensional bio-optical image through collecting and processing, by the CCD camera (Nakazato; Para. [0149]. 2-D bio images are obtained via acquisition and processing with a CCD camera.), the generated emission fluorescence subsequent to the generated emission fluorescence being reflected by a reflection mirror and passing through a filter, or subsequent to the generated emission fluorescence passing through the filter without being reflected (Nakazato; Para. [0149]. The generated emission fluorescence passes through a filter, without being reflected, before being collected.).
Regarding claim 6, modified Nakazato teaches the bioluminescence image is obtained through: releasing a bioluminescence signal in response to a chemical reaction inside an imaging object, and obtaining a two-dimensional biological image through collecting and processing, by the CCD camera (Dacosta; Para. [0165-166, 171]. A bioluminescence image is obtained, with a bioluminescence signal released in response to a chemical reaction of an object and captured in a 2-D bio image via collecting and processing by a CCD camera. Nakazato; Para. [0149]. Nakazato likewise teaches this feature.),
the released bioluminescence signal subsequent to the released bioluminescence signal being reflected by a reflection mirror and passing through a filter, or obtaining the two-dimensional bio-optical image through collecting and processing, by the CCD camera, the released bioluminescence signal subsequent to the released bioluminescence signal passing through the filter without being reflected, or obtaining the two-dimensional bio-optical image through directly collecting and processing, by the CCD camera, the released bioluminescence signal without the released bioluminescence signal being reflected or passing through the filter (Dacosta; Para. [0165-166, 171]. 2-D bio images are obtained through collecting and processing by a CCD camera, with the bioluminescence signal passing through a filter with or without being reflected, or being collected directly. Nakazato; Para. [0165-166, 171]. Nakazato likewise teaches this feature.).
Regarding claim 7, modified Nakazato teaches the data obtained based on the alignment in step S3 described in step S4 comprises a correspondence between points on the two-dimensional bio-optical image and points on the three-dimensional surface contour image and a corresponding optical signal intensity (Nakazato; Para. [0116-118, 120]. Data obtained associated with 2-D and 3-D correspondence includes correspondence between points of a 2-D bio image and points of a 3-D surface image and corresponding signal intensity.).
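The alignment data recited in claim 7 (a point-to-point correspondence between the 2-D bio-optical image and the 3-D surface contour, together with the corresponding optical signal intensity) can be sketched as a simple lookup; the image values, surface points, and correspondence below are invented for illustration and are not drawn from Nakazato:

```python
import numpy as np

# Illustrative 2-D bio-optical image (signal intensities) and a tiny
# 3-D surface contour of three points.
image = np.array([[0.1, 0.9],
                  [0.4, 0.2]])
surface_points = np.array([[0.0, 0.0, 1.0],
                           [0.0, 1.0, 1.1],
                           [1.0, 0.0, 0.9]])

# Correspondence from the alignment (step S3): surface point index -> pixel.
correspondence = {0: (0, 1), 1: (1, 0), 2: (0, 0)}

# Attach the optical signal intensity to each 3-D surface point, producing
# the combined data that step S4 would use for reconstruction.
signal = np.array([image[uv] for _, uv in sorted(correspondence.items())])
textured = np.column_stack([surface_points, signal])
```

Each row of `textured` pairs a 3-D surface point with its aligned 2-D signal intensity, which is the substance of the claimed correspondence-plus-intensity data.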
Regarding claim 9, modified Nakazato teaches a reflection mirror is disposed between the imaging target and the CCD camera, and the projector is disposed at a side of the reflection mirror (Zuo; Para. [0004-0010]. A reflection mirror is placed between an object and a CCD camera, and a projector is placed at a side of a mirror.).
Regarding claim 10, modified Nakazato teaches wherein a filter is disposed between the reflection mirror and the CCD camera (Zuo; Para. [0004-0010]. A filter, e.g. a color filter, is placed between a mirror and a CCD camera.).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Razzaque (US Pat. 8641621 B2) teaches an imaging system for image-guided medical procedures.
Ntziachristos (US Pub. 20040015062 A1) teaches a system for fluorescence-mediated molecular tomography.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALBERT KIR whose telephone number is (571)272-6245. The examiner can normally be reached Monday - Friday, 8:30am - 5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jay Patel, can be reached at (571) 272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ALBERT KIR/ Primary Examiner, Art Unit 2485