DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 04/04/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Objections
Claims 4 and 9 are objected to because of the following informalities:
In Claim 4, the phrase “a second matched filter which is different from the first matched filter” should be “a second matched filter which is different from the first matched filter is calculated.”
In Claim 9, the phrase “a improved” should be “an improved.”
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims
particularly pointing out and distinctly claiming the subject matter which the
inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out
and distinctly claiming the subject matter which the applicant regards as his
invention.
Claims 6 and 8-9 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding Claim 6, the claim recites the limitation “a distance set with respect to the … matched filter.” It is unclear if “a distance set” refers to a set of distances (e.g., a group or plurality of distances) or to a distance setting related to a matched filter. Based on paragraph [0131] of the specification, the limitation is interpreted as a distance setting related to a matched filter.
Regarding Claim 8, the claim recites the limitation “the second radar image is a radar image with a degraded signal to noise ratio of the first radar image.” It is unclear if the second radar image is a degraded version of the first radar image, or if the second radar image is a separate image that has a SNR that is degraded relative to the SNR of the first radar image.
Regarding Claim 9, the claim recites the limitation “the second radar image is a radar image with a improved signal to noise ratio of the first radar image.” It is unclear if the second radar image is an improved version of the first radar image, or if the second radar image is a separate image that has a SNR that is improved relative to the SNR of the first radar image.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in
public use, on sale, or otherwise available to the public before the effective filing
date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or
in an application for patent published or deemed published under section 122(b),
in which the patent or application, as the case may be, names another inventor and
was effectively filed before the effective filing date of the claimed invention.
Claims 1-3, 8-11, and 13-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Tomoji (JP 2022078753 A).
Regarding Claim 1, Tomoji discloses: An information processing apparatus comprising a processor ([0013]: “signal processing unit 15”; “processor”) configured to:
acquire a first observation signal from a radar device with a plurality of antennas configured to transmit a radar signal and to receive a radar echo based on a reflected wave of the radar signal, the first observation signal being based on the radar signal and the radar echo ([0010]: “antenna 13 is a phased array antenna consisting of multiple antenna elements”; [0011]: “Antenna 13 radiates radar waves transmitted from the transmitter 11…”; “…captures the arriving signal, including echoes reflected from the target.”);
generate a first radar image by executing a signal process based on a predetermined first condition, with respect to the first observation signal ([0014]: “creates image data”; [0023]: “the input SAR image data and ISAR image data are first processed using IFFT (Inverse Fast Fourier Transform) to convert them into IQ data before SAR image processing and ISAR image processing (step S02)”; [0031]: “all of the obtained IQ data are subjected to FFT processing again to obtain SAR image data and ISAR image data (step S04)”); and
generate a second radar image by executing a signal process based on a second condition which is different from the first condition, with respect to the first observation signal ([0019]: “creating new image data”; [0023]: “step S02”; [0024]: “various processes are performed on this IQ data to increase the number of images (step S03)”; [0025-0030]; [0031]: “step S04”), wherein
the first and second radar images are used as training data for a learning model to detect an object to which the radar signal is transmitted ([0040]: “these increased SAR image data and ISAR image data … are output to the learning unit 15B (step S07)”; [0041]: “deep learning is performed using a convolutional neural network with the input image data as training data”).
Regarding Claim 2, Tomoji discloses: wherein the first radar image is generated using the first observation signal ([0012]: “outputs the obtained received signal to the signal processing unit 15”; [0022-0040]), and
the second radar image is generated using a third observation signal generated by adding a second observation signal which is different from the first observation signal to the first observation signal ([0027]: “generating an image that averages multiple low-resolution images”; [0039]: “adding noise to image data”).
Regarding Claim 3, Tomoji discloses: wherein the first radar image is generated using the first observation signal ([0012]: “outputs the obtained received signal to the signal processing unit 15”; [0022-0040]), and
the second radar image is generated using a second observation signal which is a part of the first observation signal ([0026]: “Adjusting the synthetic aperture time allows for the simulation of resolution degradation by adjusting the amount of data used in SAR image processing”).
Regarding Claim 8, Tomoji discloses: wherein the second radar image is a radar image with a degraded signal to noise ratio of the first radar image ([0033]: “superimposes speckle noise specific to radar images”; [0039]: “adding noise to image data”).
Regarding Claim 9, Tomoji discloses: wherein the second radar image is a radar image with a improved signal to noise ratio of the first radar image ([0027]: “Multi-look simulation can simulate the effects of multi-look processing for noise reduction”).
Regarding Claim 10, Tomoji discloses: wherein the first radar image is generated based on a first dynamic range ([0039]: “adjusting the brightness of the image.”; Examiner note: the image data that is unaltered would have a first dynamic range), and
the second radar image is generated based on a second dynamic range which is different from the first dynamic range ([0039]: “adjusting the brightness of the image.”).
Regarding Claim 11, Tomoji discloses: wherein the training data includes the first and second radar images, and label information representing an object included in the first radar image output from the learning model by inputting the first radar image to the learning model, or label information representing the object designated by a user ([0040]: “these increased SAR image data and ISAR image data, along with the classification information of the target objects located in each image, are output to the learning unit 15B (step S07)”).
Regarding Claim 13, Tomoji discloses: A system comprising:
an information processing apparatus of claim 1 ([0013]; [0022-0040]); and
the radar device ([0009]).
Regarding Claim 14, Tomoji discloses: A method executed by an information processing apparatus ([0013]: “signal processing unit 15”; “processor”), comprising:
acquiring a first observation signal from a radar device with a plurality of antennas configured to transmit a radar signal and to receive a radar echo based on a reflected wave of the radar signal, the first observation signal being based on the radar signal and the radar echo ([0010]: “antenna 13 is a phased array antenna consisting of multiple antenna elements”; [0011]: “Antenna 13 radiates radar waves transmitted from the transmitter 11…”; “…captures the arriving signal, including echoes reflected from the target.”);
generating a first radar image by executing a signal process based on a predetermined first condition, with respect to the first observation signal ([0014]: “creates image data”; [0023]: “the input SAR image data and ISAR image data are first processed using IFFT (Inverse Fast Fourier Transform) to convert them into IQ data before SAR image processing and ISAR image processing (step S02)”; [0031]: “all of the obtained IQ data are subjected to FFT processing again to obtain SAR image data and ISAR image data (step S04)”); and
generating a second radar image by executing a signal process based on a second condition which is different from the first condition, with respect to the first observation signal ([0019]: “creating new image data”; [0023]: “step S02”; [0024]: “various processes are performed on this IQ data to increase the number of images (step S03)”; [0025-0030]; [0031]: “step S04”), wherein
the first and second radar images are used as training data for a learning model to detect an object to which the radar signal is transmitted ([0040]: “these increased SAR image data and ISAR image data … are output to the learning unit 15B (step S07)”; [0041]: “deep learning is performed using a convolutional neural network with the input image data as training data”).
Regarding Claim 15, Tomoji discloses: A non-transitory computer-readable storage medium having stored thereon a program which is executed by a computer ([0013]), the program comprising instructions capable of causing the computer to execute functions of:
acquiring a first observation signal from a radar device with a plurality of antennas configured to transmit a radar signal and to receive a radar echo based on a reflected wave of the radar signal, the first observation signal being based on the radar signal and the radar echo ([0010]: “antenna 13 is a phased array antenna consisting of multiple antenna elements”; [0011]: “Antenna 13 radiates radar waves transmitted from the transmitter 11…”; “…captures the arriving signal, including echoes reflected from the target.”);
generating a first radar image by executing a signal process based on a predetermined first condition, with respect to the first observation signal ([0014]: “creates image data”; [0023]: “the input SAR image data and ISAR image data are first processed using IFFT (Inverse Fast Fourier Transform) to convert them into IQ data before SAR image processing and ISAR image processing (step S02)”; [0031]: “all of the obtained IQ data are subjected to FFT processing again to obtain SAR image data and ISAR image data (step S04)”); and
generating a second radar image by executing a signal process based on a second condition which is different from the first condition, with respect to the first observation signal ([0019]: “creating new image data”; [0023]: “step S02”; [0024]: “various processes are performed on this IQ data to increase the number of images (step S03)”; [0025-0030]; [0031]: “step S04”), wherein
the first and second radar images are used as training data for a learning model to detect an object to which the radar signal is transmitted ([0040]: “these increased SAR image data and ISAR image data … are output to the learning unit 15B (step S07)”; [0041]: “deep learning is performed using a convolutional neural network with the input image data as training data”).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the
claimed invention is not identically disclosed as set forth in section 102, if the
differences between the claimed invention and the prior art are such that the
claimed invention as a whole would have been obvious before the effective filing
date of the claimed invention to a person having ordinary skill in the art to which
the claimed invention pertains. Patentability shall not be negated by the manner in
which the invention was made.
Claims 4-7 are rejected under 35 U.S.C. 103 as being unpatentable over Tomoji (JP 2022078753 A), as applied to Claim 1 above, as evidenced by MATLAB (MATLAB Documentation, “Radar Pulse Compression,” 2021).
Regarding Claim 4, Tomoji teaches: wherein the first radar image is generated by executing a synthetic aperture process by which convolution integral of the first observation signal and a predetermined first matched filter is calculated ([0022-0031]: Steps S01-S04, with unmodified IQ data in step S03), and
the second radar image is generated by a synthetic aperture process by which convolution integral of the first observation signal and a second matched filter which is different from the first matched filter ([0022-0031]: Steps S01-S04, with modified IQ data in step S03).
Tomoji does not explicitly use the terms “convolution integral” and “matched filter,” as recited in Claim 4. However, the SAR image generation via FFT processing taught by Tomoji is a matched filtering operation, as evidenced by MATLAB (MATLAB [pg. 7]: “More commonly, radar systems employ a similar process in the digital domain called matched filtering, where the received signal is convolved with a time-reversed version of the transmitted pulse. Matched filtering is often done in the frequency domain because convolution in the time domain is equivalent to multiplication in the frequency domain, making the process faster.”).
It would have been obvious to one of ordinary skill in the art that Tomoji’s SAR image generation via FFT processing constitutes executing a synthetic aperture process by which a convolution integral of the observation signal and a matched filter is calculated, as evidenced by MATLAB. Matched filtering, which includes calculating convolution integrals, is well-known in the art and is the standard method for generating SAR images.
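As a purely illustrative aside on the equivalence described in the MATLAB quotation above (convolving the received signal with a time-reversed version of the transmitted pulse yields the same result as multiplying in the frequency domain), the relationship can be sketched numerically; the pulse, delay, and noise parameters below are hypothetical and are not drawn from Tomoji or the instant application:

```python
import numpy as np

# Hypothetical chirp pulse (parameters chosen for illustration only)
rng = np.random.default_rng(0)
n = np.arange(64)
tx = np.exp(1j * np.pi * 0.005 * n**2)

# Received signal: an echo of the pulse delayed by 32 samples, plus noise
rx = np.concatenate([np.zeros(32), tx, np.zeros(32)])
rx = rx + 0.1 * (rng.standard_normal(128) + 1j * rng.standard_normal(128))

# Time domain: convolution integral with the matched filter h[k] = conj(tx[-k])
h = np.conj(tx[::-1])
out_time = np.convolve(rx, h)

# Frequency domain: multiply the spectra, then inverse-transform
m = len(rx) + len(h) - 1
out_freq = np.fft.ifft(np.fft.fft(rx, m) * np.fft.fft(h, m))

assert np.allclose(out_time, out_freq)   # the two implementations agree
peak = int(np.argmax(np.abs(out_time)))  # peak index locates the echo delay
```

In this sketch the peak of the matched-filter output falls at sample 95 (delay of 32 plus pulse length of 64, minus 1), which is how pulse compression localizes the echo.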
Regarding Claim 5, Tomoji teaches: wherein the second matched filter is generated by shifting, reversing, rotating, enlarging, or reducing the first matched filter ([0028]: “Phase shift simulation”; [0029]: “Range shift simulation”). The rationale that Tomoji’s FFT processing constitutes matched filtering carries over from Claim 4 above.
Regarding Claim 6, Tomoji teaches: wherein a distance set with respect to the second matched filter is different from a distance set with respect to the first matched filter ([0029]: “Range shift simulation”). The rationale that Tomoji’s FFT processing constitutes matched filtering carries over from Claim 4 above.
Regarding Claim 7, Tomoji teaches: wherein resolution of the second matched filter is different from resolution of the first matched filter ([0026]: “synthetic aperture time allows for the simulation of resolution degradation”). The rationale that Tomoji’s FFT processing constitutes matched filtering carries over from Claim 4 above.
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Tomoji (JP 2022078753 A), as applied to Claim 1 above, in view of Sankar (US 2012/0209113).
Regarding Claim 12, Tomoji does not explicitly teach – but Sankar teaches: wherein the radar signal is transmitted based on a frequency modulated continuous wave (FMCW) method (Sankar [0057]: “FMCW RADAR system”; [0082]: “produce higher solution in SAR images”).
It would have been obvious to one of ordinary skill in the art to modify Tomoji and transmit FMCW radar signals, as taught by Sankar. FMCW is well-known in the art, and transmitting FMCW signals is beneficial for improving the resolution of SAR images (Sankar [0082]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NOAH Y. ZHU whose telephone number is (571)270-0170. The examiner can normally be reached Monday-Friday, 8AM-4PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William J. Kelleher can be reached on (571) 272-7753. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NOAH YI MIN ZHU/Examiner, Art Unit 3648
/William Kelleher/Supervisory Patent Examiner, Art Unit 3648