DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
A rejection based on double patenting of the “same invention” type finds its support in the language of 35 U.S.C. 101 which states that “whoever invents or discovers any new and useful process... may obtain a patent therefor...” (Emphasis added). Thus, the term “same invention,” in this context, means an invention drawn to identical subject matter. See Miller v. Eagle Mfg. Co., 151 U.S. 186 (1894); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Ockert, 245 F.2d 467, 114 USPQ 330 (CCPA 1957).
A statutory type (35 U.S.C. 101) double patenting rejection can be overcome by canceling or amending the claims that are directed to the same invention so they are no longer coextensive in scope. The filing of a terminal disclaimer cannot overcome a double patenting rejection based upon 35 U.S.C. 101.
Claims 1-20 of application 18/417,234 are directed to the same invention as that of claims 1-20 of commonly assigned application 18/417,278. Under 35 U.S.C. 101, more than one patent may not be issued on the same invention.
The USPTO may not institute a derivation proceeding in the absence of a timely filed petition. The U.S. Patent and Trademark Office normally will not institute a derivation proceeding between applications or a patent and an application having common ownership (see 37 CFR 42.411). The applicant should amend or cancel claims such that the reference and the instant application no longer contain claims directed to the same invention.
Application 18/417,234 and Application 18/417,278
(For each claim number below, the first recitation is from application 18/417,234 and the second is from application 18/417,278.)
Claim 1:
A computer-implemented method of estimating evapotranspiration (ET) using thermal images and optical images, comprising:
acquiring a color image that captures a target of interest, wherein the target of interest is associated with an agricultural crop; acquiring a thermal image that captures the target of interest at substantially the same time as the color image;
extracting one or more features from the color image, wherein extracting one or more features from the color image includes extracting color features and/or texture features;
segmenting the color image into surface temperature components based on the one or more features extracted from the color image, wherein the surface temperature components are selected from the group consisting of sunlit soil, sunlit residue, sunlit vegetation, sunlit snow, shaded soil, shaded residue, shaded vegetation, and shaded snow;
co-registering the color image and the thermal image to provide a registered thermal image;
assigning component temperatures by applying component masks to the registered thermal image to assign a component temperature to each of the surface temperature components;
estimating ET based on the component temperatures using an energy balance model or other ET model.
Claim 1:
A computer-implemented method of estimating evapotranspiration (ET) using thermal images and optical images, comprising:
acquiring a color image that captures a target of interest, wherein the target of interest is associated with an agricultural crop; acquiring a thermal image that captures the target of interest at substantially the same time as the color image;
extracting one or more features from the color image, wherein extracting one or more features from the color image includes extracting color features and/or texture features;
segmenting the color image into surface temperature components based on the one or more features extracted from the color image, wherein the surface temperature components are selected from the group consisting of sunlit soil, sunlit residue, sunlit vegetation, sunlit snow, shaded soil, shaded residue, shaded vegetation, and shaded snow;
co-registering the color image and the thermal image to provide a registered thermal image;
assigning component temperatures by applying component masks to the registered thermal image to assign a component temperature to each of the surface temperature components;
estimating ET based on the component temperatures using an energy balance model or other ET model.
Claim 2:
The method of claim 1,
wherein the color image and the thermal image are acquired every about 0.5-1 h to provide a color image time series and a thermal image time series capturing the target of interest over time.
Claim 2:
The method of claim 1,
wherein the color image and the thermal image are acquired every about 0.5-1 h to provide a color image time series and a thermal image time series capturing the target of interest over time.
Claim 3:
The method of claim 2,
wherein assigning component temperatures includes constructing a time series TS-Component for each of the surface temperature components.
Claim 3:
The method of claim 2,
wherein assigning component temperatures includes constructing a time series TS-Component for each of the surface temperature components.
Claim 4:
The method of claim 1,
wherein extracting features includes extracting, for each pixel, features including an individual pixel value, Gaussian blur at two scales, standard deviation at three scales, linear binary pattern at three scales, and Laplacian at three scales in each of nine bands, wherein the nine bands include red, green, blue, hue, saturation, value, and L*a*b* colorspace components.
Claim 4:
The method of claim 1,
wherein extracting features includes extracting, for each pixel, features including an individual pixel value, Gaussian blur at two scales, standard deviation at three scales, linear binary pattern at three scales, and Laplacian at three scales in each of nine bands, wherein the nine bands include red, green, blue, hue, saturation, value, and L*a*b* colorspace components.
Claim 5:
The method of claim 1,
wherein segmenting the color image includes performing classification by inputting a reduced feature set into a multi-layer perceptron.
Claim 5:
The method of claim 1,
wherein segmenting the color image includes performing classification by inputting a reduced feature set into a multi-layer perceptron.
Claim 6:
The method of claim 5,
further comprising training and testing the multi-layer perceptron using images of a cropping system representative of the agricultural crop.
Claim 6:
The method of claim 5,
further comprising training and testing the multi-layer perceptron using images of a cropping system representative of the agricultural crop.
Claim 7:
The method of claim 1,
wherein co-registering the color image and the thermal image includes using a thermally reflective object that appears in the color image and the thermal image, matching a plurality of features of the thermally reflective object between the color image and the thermal image, and using the plurality of features of the thermally reflective object to calculate an affine transform to warp the thermal image to match the color image.
Claim 7:
The method of claim 1,
wherein co-registering the color image and the thermal image includes using a thermally reflective object that appears in the color image and the thermal image, matching a plurality of features of the thermally reflective object between the color image and the thermal image, and using the plurality of features of the thermally reflective object to calculate an affine transform to warp the thermal image to match the color image.
Claim 8:
The method of claim 1,
wherein the energy balance model is a two-source energy balance (TSEB) model.
Claim 8:
The method of claim 1,
wherein the energy balance model is a two-source energy balance (TSEB) model.
Claim 9:
The method of claim 1,
wherein the energy balance model is an extension of a two-source energy balance (TSEB) model extended to include at least one additional source.
Claim 9:
The method of claim 1,
wherein the energy balance model is an extension of a two-source energy balance (TSEB) model extended to include at least one additional source.
Claim 10:
The method of claim 1,
wherein the other ET model utilizes a recurrent neural network (RNN) that estimates ET based on the component temperatures and local meteorological time series.
Claim 10:
The method of claim 1,
wherein the other ET model utilizes a recurrent neural network (RNN) that estimates ET based on the component temperatures and local meteorological time series.
Claim 11:
A system for estimating evapotranspiration (ET) using thermal and optical images, comprising:
an optical camera for acquiring a color image that captures a target of interest, wherein the target of interest is associated with an agricultural crop;
a thermal camera for acquiring a thermal image that captures the target of interest at substantially the same time as the color image;
a computing device operatively connected to the optical camera and the thermal camera to receive the color image and the thermal image and to perform a method comprising:
extracting one or more features from the color image, wherein extracting one or more features from the color image includes extracting color features and/or texture features;
segmenting the color image into surface temperature components based on the one or more features extracted from the color image, wherein the surface temperature components are selected from the group consisting of sunlit soil, sunlit residue, sunlit vegetation, sunlit snow, shaded soil, shaded residue, shaded vegetation, and shaded snow; co-registering the color image and the thermal image to provide a registered thermal image;
assigning component temperatures by applying component masks to the registered thermal image to assign a component temperature to each of the surface temperature components;
estimating ET based on the component temperatures using an energy balance model or other ET model.
Claim 11:
A system for estimating evapotranspiration (ET) using thermal and optical images, comprising:
an optical camera for acquiring a color image that captures a target of interest, wherein the target of interest is associated with an agricultural crop;
a thermal camera for acquiring a thermal image that captures the target of interest at substantially the same time as the color image;
a computing device operatively connected to the optical camera and the thermal camera to receive the color image and the thermal image and to perform a method comprising:
extracting one or more features from the color image, wherein extracting one or more features from the color image includes extracting color features and/or texture features;
segmenting the color image into surface temperature components based on the one or more features extracted from the color image, wherein the surface temperature components are selected from the group consisting of sunlit soil, sunlit residue, sunlit vegetation, sunlit snow, shaded soil, shaded residue, shaded vegetation, and shaded snow; co-registering the color image and the thermal image to provide a registered thermal image;
assigning component temperatures by applying component masks to the registered thermal image to assign a component temperature to each of the surface temperature components;
estimating ET based on the component temperatures using an energy balance model or other ET model.
Claim 12:
The system of claim 11,
wherein the color image and the thermal image are acquired every about 0.5-1 h to provide a color image time series and a thermal image time series capturing the target of interest over time.
Claim 12:
The system of claim 11,
wherein the color image and the thermal image are acquired every about 0.5-1 h to provide a color image time series and a thermal image time series capturing the target of interest over time.
Claim 13:
The system of claim 12,
wherein assigning component temperatures includes constructing a time series TS-Component for each of the surface temperature components.
Claim 13:
The system of claim 12,
wherein assigning component temperatures includes constructing a time series TS-Component for each of the surface temperature components.
Claim 14:
The system of claim 11,
wherein extracting features includes extracting, for each pixel, features including an individual pixel value, Gaussian blur at two scales, standard deviation at three scales, linear binary pattern at three scales, and Laplacian at three scales in each of nine bands, wherein the nine bands include red, green, blue, hue, saturation, value, and L*a*b* colorspace components.
Claim 14:
The system of claim 11,
wherein extracting features includes extracting, for each pixel, features including an individual pixel value, Gaussian blur at two scales, standard deviation at three scales, linear binary pattern at three scales, and Laplacian at three scales in each of nine bands, wherein the nine bands include red, green, blue, hue, saturation, value, and L*a*b* colorspace components.
Claim 15:
The system of claim 11,
wherein segmenting the color image includes performing classification by inputting a reduced feature set into a multi-layer perceptron trained on images of a cropping system representative of the agricultural crop.
Claim 15:
The system of claim 11,
wherein segmenting the color image includes performing classification by inputting a reduced feature set into a multi-layer perceptron trained on images of a cropping system representative of the agricultural crop.
Claim 16:
The system of claim 11,
wherein co-registering the color image and the thermal image includes using a thermally reflective object that appears in the color image and the thermal image, matching a plurality of features of the thermally reflective object between the color image and the thermal image, and using the plurality of features of the thermally reflective object to calculate an affine transform to warp the thermal image to match the color image.
Claim 16:
The system of claim 11,
wherein co-registering the color image and the thermal image includes using a thermally reflective object that appears in the color image and the thermal image, matching a plurality of features of the thermally reflective object between the color image and the thermal image, and using the plurality of features of the thermally reflective object to calculate an affine transform to warp the thermal image to match the color image.
Claim 17:
The system of claim 11,
wherein the optical camera, the thermal camera, and the computing device are packaged together in a single housing.
Claim 17:
The system of claim 11,
wherein the optical camera, the thermal camera, and the computing device are packaged together in a single housing.
Claim 18:
The system of claim 11,
wherein the energy balance model is a two-source energy balance (TSEB) model.
Claim 18:
The system of claim 11,
wherein the energy balance model is a two-source energy balance (TSEB) model.
Claim 19:
The system of claim 11,
wherein the energy balance model is an extension of a two-source energy balance (TSEB) model that is extended to include at least one additional source.
Claim 19:
The system of claim 11,
wherein the energy balance model is an extension of a two-source energy balance (TSEB) model that is extended to include at least one additional source.
Claim 20:
A computer program product for estimating evapotranspiration (ET) using thermal images and optical images, the computer program product comprising a computer readable storage medium having program code embodied therewith, the program code executable by one or more processors, to perform a method comprising:
acquiring a color image that captures a target of interest, wherein the target of interest is associated with an agricultural crop;
acquiring a thermal image that captures the target of interest at substantially the same time as the color image;
extracting one or more features from the color image, wherein extracting one or more features from the color image includes extracting color features and/or texture features; segmenting the color image into surface temperature components based on the one or more features extracted from the color image, wherein the surface temperature components are selected from the group consisting of sunlit soil, sunlit residue, sunlit vegetation, sunlit snow, shaded soil, shaded residue, shaded vegetation, and shaded snow;
co-registering the color image and the thermal image to provide a registered thermal image;
assigning component temperatures by applying component masks to the registered thermal image to assign a component temperature to each of the surface temperature components;
estimating ET based on the component temperatures using a two-source energy balance (TSEB) model or other ET model.
Claim 20:
A computer program product for estimating evapotranspiration (ET) using thermal images and optical images, the computer program product comprising a computer readable storage medium having program code embodied therewith, the program code executable by one or more processors, to perform a method comprising:
acquiring a color image that captures a target of interest, wherein the target of interest is associated with an agricultural crop;
acquiring a thermal image that captures the target of interest at substantially the same time as the color image;
extracting one or more features from the color image, wherein extracting one or more features from the color image includes extracting color features and/or texture features; segmenting the color image into surface temperature components based on the one or more features extracted from the color image, wherein the surface temperature components are selected from the group consisting of sunlit soil, sunlit residue, sunlit vegetation, sunlit snow, shaded soil, shaded residue, shaded vegetation, and shaded snow;
co-registering the color image and the thermal image to provide a registered thermal image;
assigning component temperatures by applying component masks to the registered thermal image to assign a component temperature to each of the surface temperature components;
estimating ET based on the component temperatures using a two-source energy balance (TSEB) model or other ET model.
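For context, the co-registration step recited in claims 7 and 16 (matching features of a thermally reflective object visible in both images and calculating an affine transform to warp the thermal image onto the color image) can be illustrated with a minimal sketch. This is an examiner-side illustration only, not the applicant's disclosed implementation; the point coordinates and helper names are hypothetical.

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine transform mapping src_pts -> dst_pts.

    src_pts, dst_pts: (N, 2) arrays of matched feature locations
    (e.g., corners of a thermally reflective object found in both
    the thermal image and the color image). Requires N >= 3.
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    # Build an [x, y, 1] design matrix and solve for the affine matrix.
    ones = np.ones((src.shape[0], 1))
    design = np.hstack([src, ones])                    # (N, 3)
    A, *_ = np.linalg.lstsq(design, dst, rcond=None)   # (3, 2)
    return A.T                                         # (2, 3)

def apply_affine(A, pts):
    """Apply a 2x3 affine transform to (N, 2) points."""
    pts = np.asarray(pts, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    return np.hstack([pts, ones]) @ A.T

# Hypothetical matched features of a thermally reflective object:
# its corners as seen in the thermal image vs. in the color image.
thermal_pts = np.array([[10.0, 10.0], [90.0, 12.0], [50.0, 80.0], [15.0, 70.0]])
color_pts = thermal_pts * 1.5 + np.array([5.0, -3.0])  # scaled and shifted view

A = estimate_affine(thermal_pts, color_pts)
warped = apply_affine(A, thermal_pts)
print(np.allclose(warped, color_pts, atol=1e-6))  # -> True
```

In practice the estimated transform would be used to resample the full thermal raster onto the color image's pixel grid, yielding the "registered thermal image" to which the component masks are applied.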
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 20 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. Claim 20 is non-statutory because the claimed computer program product is not positively disclosed in the specification as a statutory-only embodiment and is not limited to non-transitory media. The broadest reasonable interpretation of a claim drawn to a computer readable medium (also called a machine readable medium, among other variations) typically covers both non-transitory tangible media and transitory propagating signals per se, in view of the ordinary and customary meaning of computer readable media, particularly when the specification is silent. See MPEP 2111.01. When the broadest reasonable interpretation of a claim covers a signal per se, the claim must be rejected under 35 U.S.C. § 101 as covering non-statutory subject matter. See In re Nuijten, 500 F.3d 1346, 1356-57 (Fed. Cir. 2007) (transitory embodiments are not directed to statutory subject matter); Interim Examination Instructions for Evaluating Subject Matter Eligibility Under 35 U.S.C. § 101, Aug. 24, 2009, p. 2. To overcome this rejection, the claim may be amended to recite "a non-transitory computer readable medium."
Allowable Subject Matter
Claims 1-20 are allowable over the prior art of record. However, claims 1-20 remain rejected on the ground of double patenting, and claim 20 remains rejected under 35 U.S.C. 101, as set forth above. As allowable subject matter has been indicated, applicant's reply must either comply with all formal requirements or specifically traverse each requirement not complied with. See 37 CFR 1.111(b) and MPEP § 707.07(a).
Regarding claims 1, 11, and 20,
Zheng (US 2025/0078500) teaches, see Fig. 3, a decision-making process for water and fertilizer stress of crops that uses a visible light image and a thermal image. The visible light image is used to determine a crop coefficient, and the thermal image, together with a feature extracted from the visible light image, is used to extract a canopy temperature.
Iravantchi (US 2024/0111898), ¶84, teaches creating a mask based on the body temperature of humans to segment images.
Moshelion (US 2017/0042098) teaches a system for characterizing a plant. The system uses sensors to acquire a transpiration rate from the plant and an evaporation rate from a wick. The system uses a thermal image to estimate a cooling rate of the plant, which is used as a proxy for the transpiration rate of the plant (¶83).
No prior art of record explicitly discloses segmenting the color image into surface temperature components based on the one or more features extracted from the color image, wherein the surface temperature components are selected from the group consisting of sunlit soil, sunlit residue, sunlit vegetation, sunlit snow, shaded soil, shaded residue, shaded vegetation, and shaded snow;
co-registering the color image and the thermal image to provide a registered thermal image;
assigning component temperatures by applying component masks to the registered thermal image to assign a component temperature to each of the surface temperature components;
estimating ET based on the component temperatures using an energy balance model or other ET model.
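The component-temperature assignment recited in the limitations above (applying per-component masks derived from the segmented color image to the registered thermal image) can be sketched as follows. This is an illustrative examiner-side sketch, not the applicant's disclosed implementation; the component subset, label encoding, and array shapes are hypothetical.

```python
import numpy as np

# Hypothetical subset of the claimed surface temperature components,
# indexed by the integer labels produced by segmenting the color image.
COMPONENTS = ["sunlit_soil", "sunlit_vegetation", "shaded_soil", "shaded_vegetation"]

def component_temperatures(labels, thermal):
    """Mean temperature of each surface temperature component.

    labels:  (H, W) integer array from segmentation; labels[i, j]
             indexes COMPONENTS and acts as the component mask source.
    thermal: (H, W) registered thermal image on the same pixel grid.
    Returns {component_name: mean_temperature}; components absent from
    the scene map to None.
    """
    temps = {}
    for idx, name in enumerate(COMPONENTS):
        mask = labels == idx  # boolean component mask
        temps[name] = float(thermal[mask].mean()) if mask.any() else None
    return temps

# Toy 2x2 scene: top row sunlit soil (label 0), bottom row shaded vegetation (label 3).
labels = np.array([[0, 0], [3, 3]])
thermal = np.array([[35.0, 37.0], [24.0, 26.0]])  # degrees C
print(component_temperatures(labels, thermal))
# -> {'sunlit_soil': 36.0, 'sunlit_vegetation': None,
#     'shaded_soil': None, 'shaded_vegetation': 25.0}
```

The resulting per-component temperatures would then be supplied to an energy balance model (e.g., TSEB) or other ET model, per the final limitation.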
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DUSTIN BILODEAU whose telephone number is (571)272-1032. The examiner can normally be reached 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Mehmood, can be reached at (571) 272-2976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DUSTIN BILODEAU/Examiner, Art Unit 2664
/JENNIFER MEHMOOD/Supervisory Patent Examiner, Art Unit 2664