DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Preliminary Remarks
This is a reply to the application filed on 12/18/2024, in which claims 1-15 remain pending, with claims 1, 14, and 15 being independent claims.
When making claim amendments, the applicant is encouraged to consider the references in their entireties, including those portions that have not been cited by the examiner, and their equivalents, as they may most broadly and appropriately apply to any particular anticipated claim amendments.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on December 18, 2024 and June 12, 2025 are in compliance with the provisions of 37 CFR 1.97 and have been considered by the examiner.
Claim Objections
Claim 8 is objected to because of the following informalities:
The limitation, “wherein the one or more processors the emission intensity of the plurality of light emitting devices arranged in an array and the composite ratio of the first and second images are varied according to the area of the composite image” in lines 1-4 should apparently be changed to be grammatically correct. Appropriate correction is required.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. - An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
Use of the word “device” (or “step for”, “unit”, “element”, “mechanism”, “module”, “means”, “engine”, “component”, “member”, “apparatus”, “machine”, “system”, “assembly”, “portion”) in a claim with functional language creates a rebuttable presumption that the claim element is to be treated in accordance with 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) is invoked is rebutted when the function is recited with sufficient structure, material, or acts within the claim itself to entirely perform the recited function.
Absence of the word “device” in a claim creates a rebuttable presumption that the claim element is not to be treated in accordance with 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) is not invoked is rebutted when the claim element recites function but fails to recite sufficiently definite structure, material or acts to perform that function.
The following claim limitation uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier:
the plurality of light emitting devices arranged in an array… in claim 8.
Because this claim limitation is being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it is being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this limitation interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation to avoid it being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation recites sufficient structure to perform the claimed function so as to avoid it being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. (FP 7.30.06).
For more information, see MPEP § 2173 et seq. and Supplementary Examination Guidelines for Determining Compliance With 35 U.S.C. 112 and for Treatment of Related Issues in Patent Applications, 76 FR 7162, 7167 (Feb. 9, 2011).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-15 are rejected under 35 U.S.C. 103 as being unpatentable over Oda et al. (US 20120249782 A1, hereinafter referred to as “Oda”) in view of Feder et al. (US 20210314507 A1, hereinafter referred to as “Feder”).
Regarding claim 1, Oda discloses a camera system comprising:
radiate terahertz waves (see Oda, paragraph [0051]: “THz wave 2 emitted by THz light source 1 is radiated to sample (measurement subject) 3 and then detected as a reflected wave or transmitted wave and captured as an image by THz camera”);
acquire a first signal at a timing at which the terahertz waves are radiated and acquire a second signal different from the first signal at a timing at which the terahertz waves are not radiated (see Oda, paragraph [0052]: “Sync signal (image capturing timing signal) 5 that represents an image capturing timing of THz camera 4 is input from THz camera” and paragraph [0055]: “THz camera 4 causes the ON/OFF periods of THz light source 1 to synchronize with image capturing timings of THz camera 4 so as to perform lock-in image capturing for sample”);
acquire a difference between the first and second signals (see Oda, paragraph [0110]: “a control unit that integrates a plurality of frame images captured by said camera during an ON period of said light source, integrates a plurality of frame images captured by said camera during an OFF period of said light source, the number of frame images captured by said camera during the ON period being the same as the number of frame images captured by said camera during the OFF period, subtracts an integrated image of the frame images captured by said camera during the OFF period of said light source from an integrated image of the frame images captured by said camera during the ON period of said light source, and obtains the difference between their images”); and
generate a first image from the difference between the first and second signals (see Oda, paragraph [0045]: “FIG. 9 is a schematic diagram showing a difference in an image in which an integrated image of a plurality of images of the sample shown in FIG. 8 captured during an OFF period of a THz light source is subtracted from an integrated image of a plurality of images of the sample shown in FIG. 8 captured during an ON period thereof for an image capturing device”) and generate a second image different from the first image from the second signal (see Oda, paragraph [0043]: “FIG. 7F is a schematic diagram describing a method for an image capturing device according to an exemplary embodiment of the present invention in which images captured during an ON period of a THz light source and images captured during an OFF period thereof are stored”).
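For illustration only (not part of the record), the lock-in capture scheme quoted from Oda, paragraph [0110] — integrating frames captured during the ON period, integrating an equal number of frames captured during the OFF period, and subtracting the OFF integral from the ON integral — can be sketched as follows, using simulated frame data with assumed background and signal levels:

```python
import numpy as np

# Illustrative sketch of Oda's difference-image scheme: the background
# (ambient) signal is present in every frame, while the source-induced
# signal appears only in frames captured during the ON period, so the
# subtraction cancels the background and retains only the signal.
rng = np.random.default_rng(0)

H, W, N = 4, 4, 8        # frame size and frame count per ON/OFF period
background = 100.0       # ambient level present in every frame (assumed)
signal = 5.0             # source-induced level, ON frames only (assumed)

# N frames with the source ON and N with it OFF, plus zero-mean noise
on_frames = background + signal + rng.normal(0.0, 1.0, (N, H, W))
off_frames = background + rng.normal(0.0, 1.0, (N, H, W))

# Integrate each period, then subtract the OFF integral from the ON integral
difference_image = on_frames.sum(axis=0) - off_frames.sum(axis=0)

# The background cancels; each pixel is approximately N * signal = 40
print(difference_image.mean())
```

The same-frame-count requirement in the quoted passage matters because it makes the two integrated background contributions equal, so they cancel exactly in expectation.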
Regarding claim 1, Oda discloses all the claimed limitations with the exception of a camera system comprising: one or more memories storing instructions; and one or more processors executing the instructions.
Feder from the same or similar fields of endeavor discloses a camera system comprising:
one or more memories storing instructions (see Feder, paragraph [0075]: “Processor subsystem 360 may include, without limitation, one or more central processing unit (CPU) cores 370, a memory interface 380, input/output interfaces unit 384, and a display interface unit 382, each coupled to an interconnect”); and
one or more processors executing the instructions (see Feder, paragraph [0075]: “The one or more CPU cores 370 may be configured to execute instructions residing within memory subsystem”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Feder with those of Oda. The motivation for doing so would be to provide the system of Oda with the features disclosed in Feder: one or more CPU cores configured to execute instructions residing within a memory subsystem; a non-transitory computer-readable medium configured to include programming instructions for execution by one or more processing units; accumulated curve-fitting statistics for a linear or quadratic curve fit used to implement white-balance correction on an image; a blending technique (e.g., bilateral filtering) for performing an image blend operation; and presentation of an image stack, or a synthetic image generated from the image stack, through a display unit, where the programming instructions may also implement one or more modules for merging or aligning images within the image stack. The combined system would thus comprise one or more memories storing instructions and one or more processors executing the instructions; store a computer program on a non-transitory computer-readable storage medium; apply a digital gain or a gamma curve to at least one of the first and second signals when the first image is generated; apply a lowpass filter or a bilateral filter to at least one of the first and second signals when the first image is generated; and generate a third image obtained by composing the first and second images and cause a display device to display the generated third image. This would enable an image correction process that changes lightness or contrast of the image by applying a digital gain or a gamma curve, and a noise reduction process by applying a lowpass filter or a bilateral filter, so that it is possible to improve the quality of the captured images.
Regarding claim 2, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors acquire the first and second signals based on a control signal output in synchronization with a radiation unit radiating the terahertz waves and a camera unit detecting the terahertz waves (see Oda, paragraph [0051]: “THz wave 2 emitted by THz light source 1 is radiated to sample (measurement subject) 3 and then detected as a reflected wave or transmitted wave and captured as an image by THz camera 4. According to this exemplary embodiment, it is assumed that THz wave 2 is detected as a reflected wave”).
The motivation for combining the references has been discussed in claim 1 above.
Regarding claim 3, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 2, wherein ON and OFF times of the control signal are equivalent to each other (see Oda, paragraph [0071]: “In a first method shown in FIG. 7A, three types of buffers X, Y, and Z are prepared. Eight pieces of image data captured during an ON period of THz light source 1 are stored in eight buffer memories X1˜X8 that compose buffer X; eight pieces of image data captured during an OFF period thereof are stored in eight buffer memories Y1˜Y8 that compose buffer Y; and then image data representing the differences between their images are stored in eight buffer memories Z1˜Z8 that compose buffer Z”).
The motivation for combining the references has been discussed in claim 1 above.
Regarding claim 4, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors apply a digital gain or a gamma curve to at least one of the first and second signals when the first image is generated (see Feder, paragraph [0083]: “The sum of color channel intensities may then be used to perform a white-balance color correction on an associated image, according to a white-balance model such as a gray-world white-balance model. In other embodiments, curve-fitting statistics are accumulated for a linear or a quadratic curve fit used for implementing white-balance correction on an image”).
The motivation for combining the references has been discussed in claim 1 above.
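For illustration only (not part of the record), the claim 4 limitation — applying a digital gain or a gamma curve to a signal before the first image is generated — can be sketched as follows, assuming signal values normalized to [0, 1] and arbitrary gain/gamma parameters:

```python
import numpy as np

def apply_gain_and_gamma(signal, gain=1.5, gamma=2.2):
    """Scale the signal by a digital gain, clip to [0, 1],
    then apply a gamma curve to adjust lightness/contrast."""
    scaled = np.clip(signal * gain, 0.0, 1.0)
    return scaled ** (1.0 / gamma)

# A normalized signal; gain brightens it, the gamma curve lifts midtones
signal = np.array([0.0, 0.25, 0.5, 1.0])
corrected = apply_gain_and_gamma(signal)
```

The gain and gamma values here are placeholders; the claim recites only that a digital gain or a gamma curve is applied, not any particular parameters.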
Regarding claim 5, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors apply a lowpass filter or a bilateral filter to at least one of the first and second signals when the first image is generated (see Feder, paragraph [0055]: “synthetic image 250 includes overall image detail, as well as image detail from high brightness region 220 and low brightness region 222. Image blend operation 240 may implement any technically feasible operation for blending an image stack. For example, any high dynamic range (HDR) blending technique may be implemented to perform image blend operation 240. Exemplary blending techniques known in the art include bilateral filtering”).
The motivation for combining the references has been discussed in claim 1 above.
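For illustration only (not part of the record), the claim 5 limitation — applying a lowpass filter to a signal for noise reduction — can be sketched with a simple 3-tap moving average standing in for the lowpass filter (the bilateral filtering cited from Feder is an edge-preserving variant of the same smoothing idea):

```python
import numpy as np

def lowpass_3tap(signal):
    """Smooth a 1-D signal with a 3-tap moving average
    (edge samples replicated so the output length matches)."""
    padded = np.pad(signal, 1, mode="edge")
    return (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0

# A single-sample noise spike is spread out and attenuated
noisy = np.array([1.0, 1.0, 4.0, 1.0, 1.0])
smoothed = lowpass_3tap(noisy)
```

The averaging kernel preserves the total signal energy while reducing the peak amplitude of the noise spike, which is the noise reduction effect the combination relies on.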
Regarding claim 6, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors separately acquire a period in which the difference between the first and second signals is acquired and a period in which the second signal is acquired (see Oda, paragraph [0069]: “phase compensation circuit 16 compensates the difference between the phase of sync signal 5 and the phase of the external output image data. Specifically, phase compensation circuit 16 generates image obtaining timing signal 23 such that sync signal 5 delays for a predetermined time {(compensation process time of compensation circuit 19)+(update time of buffer 20)}. CPU 14 obtains external output image data from THz camera 4 based on image obtaining timing signal”).
The motivation for combining the references has been discussed in claim 1 above.
Regarding claim 7, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors generate a third image by composing the first and second images (see Feder, paragraph [0054]: “FIG. 2 illustrates generating a synthetic image 250 from an image stack 200, according to one embodiment of the present invention. As shown, image stack 200 includes images 210, 212, and 214 of a photographic scene comprising a high brightness region 220 and a low brightness region 222”).
The motivation for combining the references has been discussed in claim 1 above.
Regarding claim 8, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 7, wherein the one or more processors the emission intensity of the plurality of light emitting devices arranged in an array and the composite ratio of the first and second images are varied according to the area of the composite image (see Feder, paragraph [0092]: “one or more parameters computed from one or more specified subsets of pixel information sampled from pixel array 410. One exemplary parameter defines a subset of pixels to be a two-dimensional contiguous region of pixels associated with a desired exposure point. Here, an exposure parameter may be computed, for example, as a median intensity value for the region, or as a count of pixels exceeding a threshold brightness for the region. For example, a rectangular region corresponding to an exposure point may be defined within an image associated with the pixel array, and a median intensity may be generated for the rectangular region, given certain exposure parameters such as exposure time and ISO sensitivity”).
The motivation for combining the references has been discussed in claim 1 above.
Regarding claim 9, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors display at least one of the first and second images on a display device (see Oda, paragraph [0076]: “CPU 14 integrates image data representing the difference between images obtained as described above and displays integrated image data representing the difference between images”).
The motivation for combining the references has been discussed in claim 1 above.
Regarding claim 10, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors cause a display device to display at least one of the first and second images (see Oda, paragraph [0077]: “Alternatively, CPU 14 may perform a method in which image data are integrated before differences between images are obtained instead of a method in which image data are integrated after image data representing differences between images are obtained”).
The motivation for combining the references has been discussed in claim 1 above.
Regarding claim 11, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors generate a third image obtained by composing the first and second images and cause a display device to display the generated third image (see Feder, paragraph [0069]: “one or more modules for sampling an image stack through camera module 330, one or more modules for presenting the image stack or synthetic image generated from the image stack through display unit 312. The programming instructions may also implement one or more modules for merging images or portions of images within the image stack, aligning at least portions of each image within the image stack”).
The motivation for combining the references has been discussed in claim 1 above.
Regarding claim 12, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors temporally switch between the first and second images and cause the display device to display the first and second images (see Oda, paragraph [0078]: “CPU 14 integrates a plurality of pieces of image data captured during an ON period of THz light source (QCL) 1, integrates a plurality of pieces of image data captured during an OFF period thereof, calculates the differences between the integrated image data of the plurality of pieces of image data captured during the ON period and the integrated image data of the plurality of pieces of image data captured during the OFF period, and then displays the difference between images of the integrated image data”).
The motivation for combining the references has been discussed in claim 1 above.
Regarding claim 13, the combination teachings of Oda and Feder as discussed above also disclose the camera system according to claim 1, wherein the one or more processors highlight and display at least one of the first and second images when the display device is caused to display the first and second images (see Oda, paragraph [0076]: “CPU 14 integrates image data representing the difference between images obtained as described above and displays integrated image data representing the difference between images”).
The motivation for combining the references has been discussed in claim 1 above.
Claim 14 is rejected for the same reasons as discussed in claim 1 above.
Claim 15 is rejected for the same reasons as discussed in claim 1 above. In addition, the combination teachings of Oda and Feder as discussed above also disclose a non-transitory computer-readable storage medium configured to store a computer program comprising instructions for executing (see Feder, paragraph [0069]: “NV memory 316 comprises a non-transitory computer-readable medium, which may be configured to include programming instructions for execution by one or more processing units”).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NIENRU YANG whose telephone number is (571)272-4212. The examiner can normally be reached Monday-Friday 10AM-6PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, THAI TRAN can be reached at 571-272-7382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
NIENRU YANG
Examiner
Art Unit 2484
/NIENRU YANG/Examiner, Art Unit 2484
/THAI Q TRAN/Supervisory Patent Examiner, Art Unit 2484