DETAILED ACTION
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-4, 9-14, and 19-21 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Gou (US 11,830,167).
Regarding claim 1, Gou teaches a method in a computer system for delineating agricultural fields using satellite images, comprising:
obtaining at least one multitemporal, multispectral satellite image sequence (101)(200 in Fig. 2);
pre-processing (702, 703) the images in the at least one multitemporal, multispectral satellite image sequence (101) to generate a pre-processed image sequence (303) of multitemporal multispectral images covering a specific geographical region (204 in Fig. 2);
using a super-resolution method (705) on the images in the pre-processed image sequence to generate a high-resolution image sequence (403) of multitemporal multispectral images where corresponding pixel positions in images in the sequence relate to the same geographical ground position (206 in Fig. 2);
using a delineating artificial neural network to classify (706) pixel positions in the high-resolution image sequence (403) (Fig. 3A, 3B) as being associated with a geographical ground position that is part of an agricultural field (104) or not being part of an agricultural field (Col. 4, lines 30-40, "methods described herein can be applied to generate high-resolution remote sensing images with an improved quality to assist agricultural applications").
Regarding claim 2, Gou teaches a method according to claim 1, wherein the super-resolution method (705) is performed by a resolution enhancing convolutional neural network (401) (Col. 11, lines 5-15, "the single-image neural network model to generate the high-resolution remote sensing image with the second output resolution").
Regarding claim 3, Gou teaches a method according to claim 1, wherein the pre-processing (702, 703) includes at least one of, for pixels in the images in the at least one obtained satellite image sequence (101): converting (702) received pixel values to bottom of atmosphere (BOA) reflectance (Col. 4, lines 20-35, "an influence of weather conditions such as occlusion by clouds, haze, fog, etc."), performing data assimilation, and performing geo-referencing (703).
Regarding claim 4, Gou teaches a method according to claim 1, wherein the at least one multitemporal, multispectral satellite image sequence (101) is a plurality of such image sequences (300 in Fig. 3A), the method further comprising using a fusion artificial neural network (302) (304 in Fig. 3A) to receive the plurality of multitemporal, multispectral imaging sequences (101) as input and produce a single consolidated multi-temporal multi-spectral imaging data sequence (301) as output (Fig. 3A).
Regarding claim 9, Gou teaches a method according to claim 1, further comprising:
performing a quality assessment (707) of the classification of pixel positions by the delineating artificial neural network (501), and, if the quality assessment produces a result that fails to satisfy a predetermined quality requirement, manually annotating agricultural fields in images of one or more high-resolution image sequences (403) that cover areas for which quality assessment has failed, and retraining the delineating artificial neural network (501) with the manually annotated high-resolution image sequences (403).
Regarding claim 10, Gou teaches a method according to claim 1, further comprising:
combining the classification of pixel positions from the delineating artificial neural network (501) with an image representing a corresponding geographical area; and storing or transmitting the result as an image where agricultural fields are delineated.
Claims 11-14 and 19-20 recite the system performing the method of claims 1-4 and 9-10. Since Gou also teaches a system (Fig. 1), those claims are also rejected on the same grounds.
Claim 21 recites the medium for the method of claim 1 and is also rejected on the same grounds.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 5-6 and 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over Gou in view of DUTTA (US 2020/0302223).
Regarding claim 5, Gou teaches a method according to claim 1.
Gou does not expressly teach wherein the delineating artificial neural network (501) is trained to generate two output masks, wherein a first output mask (602) classifies pixel positions as field or background, a second output mask (604) classifies pixel positions as field boundary or not field boundary, and a combined output mask (605) where pixel positions classified as field in the first output mask (602) are re-classified as background if they are classified as field boundary in the second output mask (604).
However, DUTTA teaches an artificial neural network (501) (Fig. 26) trained to generate two output masks ([0255], ternary map), wherein a first output mask (602) classifies pixel positions as field or background ([0255], "a first output value corresponds to a classification label or score for a background class"), a second output mask (604) classifies pixel positions as field boundary or not field boundary ([0255], "a third output value corresponds to a classification label or score for a cluster/cluster interior class"), and a combined output mask (605) where pixel positions classified as field in the first output mask (602) are re-classified as background if they are classified as field boundary in the second output mask (604) ([0305], "classifies, or can reclassify a first subset of the units 1712 as 'background units' 1804 depicting the surrounding background of the clusters and 'non-background units'"; [0808], "subpixels classified as analyte interior are all assigned a same third predetermined class score").
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Gou and DUTTA by substituting the neural network taught by Gou with that taught by DUTTA, with the motivation of "processing input image data derived from a sequence of image sets through a neural network and generating an alternative representation of the input image data, the input image data has an array of units that depicts analytes and their surrounding background" (DUTTA, Abstract).
Regarding claim 6, Gou in view of DUTTA teaches a method according to claim 1, wherein the delineating artificial neural network (501) generates probability scores for the pixel positions in the high-resolution image sequence (403) and classification of pixel positions is performed based on whether respective probability scores are above or below a predetermined threshold value (DUTTA, [0305], "the thresholder 1802 thresholds output values of the units 1712 and classifies, or can reclassify a first subset of the units 1712 as 'background units' 1804 depicting the surrounding background of the clusters and 'non-background units'").
Claims 15-16 recite the system performing the method of claims 5-6. Since Gou also teaches a system (Fig. 1), those claims are also rejected on the same grounds.
Allowable Subject Matter
Claims 7 and 17 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Claims 8 and 18 are also objected to since they recite the limitations of claims 7 and 17, respectively.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JIANGENG SUN whose telephone number is (571)272-3712. The examiner can normally be reached 8am to 5pm, EST, M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Randolph Vincent, can be reached at (571) 272-8243. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
JIANGENG SUN
Examiner
Art Unit 2661
/Jiangeng Sun/Examiner, Art Unit 2671