Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Applicant’s amendments, see Pages 4-8, Discussion of Claim Rejections under 35 U.S.C. 103, filed 01/08/2026, with respect to the rejection(s) of claim(s) 1-3 under 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of US-2017/0322146-A1.
Response to Arguments
Applicant’s arguments with respect to claim 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Objections
Claim 1 is objected to because of the following informalities: lines 21-23 should read “in the depth image, the plurality of points in the depth direction are represented in association with each of a plurality of measurement positions, wherein the plurality of measurement positions are selected to be aligned on a straight line, and”.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
Determining the scope and contents of the prior art.
Ascertaining the differences between the prior art and the claims at issue.
Resolving the level of ordinary skill in the pertinent art.
Considering objective evidence present in the application indicating obviousness or non-obviousness.
Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over Suh et al. (US 2014/0113283 A1) in view of Murayama et al. (US 2017/0322146 A1).
Regarding independent Claim 1, Suh discloses a Raman microscope (Figure 3; [0056] “the microscope module comprises a motion controller 50 …, a Raman filtration unit 40 …, and a detector 111”) that acquires a Raman spectrum from a sample (Figure 3; [0090] “obtaining one or more Raman images from the sample”) on a stage (Figure 3; [0080] “moving a stage … in the X or Y axis direction”) by condensing laser light (Figure 3: element 30 is a microscope objective lens for focusing the laser beam; [0057]), irradiating the sample with the laser light, and receiving Raman scattered light from the sample by a detector (Figure 3; [0090] “irradiating a laser beam on the sample to generate Raman scattered light, filtering the Raman scattered light through one or more Raman filters to extract a Raman wavelength of interest, and detecting the Raman spectrum using a detector”), but does not specifically teach the Raman microscope comprising:
a depth measurement processor that performs depth measurement by changing a focal position of the laser light along a depth direction of the sample which is an irradiation direction of the laser light with respect to the sample, while acquiring the Raman spectrum of the sample at a plurality of points in the depth direction; and
a display processor that:
causes a display unit to display at least one of the Raman spectrum obtained at the plurality of points by the depth measurement,
causes the display unit to display a depth image representing the plurality of points in the depth direction under a surface of the sample,
receives a user input for selecting one point among the plurality of points in the depth image, and
causes the display unit to display the Raman spectrum which corresponds to the selected point in the depth image, wherein
the depth measurement processor can change the focal position of the laser light along the depth direction while acquiring the Raman spectra at the plurality of points in the depth direction,
in the depth image, the plurality of points in the depth direction are represented in association with each of a plurality of measurement positions, wherein the plurality of measurement positions are selected to be aligned on a straight line, and
in the depth image, the plurality of points in the depth direction are represented in association with each of the plurality of measurement positions by two-axis display of a direction in which the plurality of measurement positions are aligned and the depth direction.
However, Murayama, in the same field of Raman microscopy, teaches a depth measurement processor (Figure 9: element 905 is a controller; [0042]) that performs depth measurement by changing a focal position of the laser light (Figure 9: element 901 is a light source; [0037]) along a depth direction (Figure 9; [0039] “in the direction parallel to the optical axis (Z direction)”) of the sample (Figure 9: element 910 is a specimen; [0037]) which is an irradiation direction of the laser light with respect to the sample (Figure 9: downward arrows indicate an irradiation direction; [0038] “Light from the light source 901 is shaped as appropriate at the optical system 902, and cast upon the specimen 910”), while acquiring the Raman spectrum of the sample at a plurality of points in the depth direction (Figure 9; [0039] “Moving the stage 903 in the Z direction allows multiple so-called Z-stack images to be obtained, and a three-dimensional distribution of spectral information to be obtained”); and
a display processor (Figure 2: element 201 is a central processing unit (CPU); [0044] “CPU 201 also performs later-described computation processing, data processing, and so forth”) that:
causes a display unit to display at least one of the Raman spectrum (Figure 8E; [0105] “a window 18 displaying the three-dimensional form of the second feature region, and a window 19 displaying the Raman spectrum in the second feature region, upon the screen 10”, wherein “screen 10” is a display) obtained at the plurality of points by the depth measurement ([0076] “a space (three-dimensional region) including all data points existing in the depth direction from that plane region” is interpreted as a plurality of points),
causes the display unit (Figure 7A-B, 8A-E: element 10 is a screen; [0070]) to display a depth image ([0039] “multiple so-called Z-stack images”) representing the plurality of points in the depth direction ([0076] “a space (three-dimensional region) including all data points existing in the depth direction from that plane region” is interpreted as a plurality of points in the depth direction) under a surface of the sample (Figure 9: element 910 is a specimen; [0037]),
receives a user input for selecting one point among the plurality of points in the depth image (Figure 2; [0076] “Upon the user selecting a plane region within the PQ plane of the two-dimensional projected image, the CPU 201 (information obtaining unit) obtains the position information of that plane region. The CPU 201 then selects a space (three-dimensional region) including all data points existing in the depth direction from that plane region”, wherein “a space (three-dimensional region) including all data points existing in the depth direction from that plane region” is interpreted as a plurality of points in the depth image), and
causes the display unit to display the Raman spectrum (Figure 8E; [0105] “a window 18 displaying the three-dimensional form of the second feature region, and a window 19 displaying the Raman spectrum in the second feature region, upon the screen 10”, wherein “screen 10” is a display) which corresponds to the selected point in the depth image (Figure 8E; [0103] “this second feature region is the feature region which the user has specified”), wherein
the depth measurement processor (Figure 9: element 905 is a controller; [0042]) can change the focal position of the laser light (Figure 9: element 901 is a light source; [0037]) along the depth direction (Figure 9; [0039] “in the direction parallel to the optical axis (Z direction)”) while acquiring the Raman spectra at the plurality of points in the depth direction (Figure 9; [0039] “Moving the stage 903 in the Z direction allows multiple so-called Z-stack images to be obtained, and a three-dimensional distribution of spectral information to be obtained”),
in the depth image ([0039] “multiple so-called Z-stack images”), the plurality of points in the depth direction ([0076] “a space (three-dimensional region) including all data points existing in the depth direction from that plane region” is interpreted as a plurality of points in the depth direction) are represented in association with each of a plurality of measurement positions ([0076] “a space (three-dimensional region) including all data points existing in the depth direction from that plane region”, wherein “plane region” will comprise a plurality of positions), wherein the plurality of measurement positions are selected to be aligned on a straight line ([0075] “operating a pointer with a mouse, selecting one data point by numeral input or the like” is interpreted as selecting a plurality of points in a straight line), and
in the depth image ([0039] “multiple so-called Z-stack images”), the plurality of points in the depth direction ([0076] “a space (three-dimensional region) including all data points existing in the depth direction from that plane region” is interpreted as a plurality of points in the depth direction) are represented in association with each of the plurality of measurement positions ([0076] “a space (three-dimensional region) including all data points existing in the depth direction from that plane region”, wherein “plane region” will comprise a plurality of positions) by two-axis display of a direction in which the plurality of measurement positions are aligned and the depth direction (Figure 5C; [0063] “a schematic view observing the PQ plane from the side, where the coordinate axis R is equivalent to the depth-direction axis of the screen”, wherein the view has two axes).
Therefore, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the Raman microscope of Suh with a depth measurement processor that performs depth measurement by changing a focal position of the laser light along a depth direction of the sample which is an irradiation direction of the laser light with respect to the sample, while acquiring the Raman spectrum of the sample at a plurality of points in the depth direction; and a display processor that: causes a display unit to display at least one of the Raman spectrum obtained at the plurality of points by the depth measurement, causes the display unit to display a depth image representing the plurality of points in the depth direction under a surface of the sample, receives a user input for selecting one point among the plurality of points in the depth image, and causes the display unit to display the Raman spectrum which corresponds to the selected point in the depth image, wherein the depth measurement processor can change the focal position of the laser light along the depth direction while acquiring the Raman spectra at the plurality of points in the depth direction, in the depth image, the plurality of points in the depth direction are represented in association with each of a plurality of measurement positions, wherein the plurality of measurement positions are selected to be aligned on a straight line, and in the depth image, the plurality of points in the depth direction are represented in association with each of the plurality of measurement positions by two-axis display of a direction in which the plurality of measurement positions are aligned and the depth direction, as taught by Murayama, because “according to the microscope system of the present embodiment, depth-wise information is added to the two-dimensional projected image, so three-dimensional forms and overlapping feature regions can be identified. 
Accordingly, appropriate specification of feature regions can be easily performed, and analysis precision is improved.” (Murayama, [0106])
Further, “Displaying such spectral information included in the feature region enables spectral information to be visually configured in addition to the form information of the feature region, whereby analysis throughput can be improved.” (Murayama, [0113])
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Akbar H Rizvi whose telephone number is (571) 272-5085. The examiner can normally be reached Monday - Friday, 9:30 am - 6:30 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tarifur R Chowdhury can be reached at (571) 272-2287. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AKBAR H. RIZVI/
Examiner, Art Unit 2877
/TARIFUR R CHOWDHURY/Supervisory Patent Examiner, Art Unit 2877