DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claims 1-19 are objected to because of the following informalities:
Claim 1:
Claim language “a simulation unit that determines the yield represented by each pixel in each image based on the agronomic rules and the analysis of each tile“ should read “a simulation unit that determines the yield represented by each pixel in each image based on the agronomic rules and the analysis of the each tile” in order to comply with the antecedent basis requirement.
Claim 2:
Claim language “each tile is a four channel image having red, blue, green and NIR reflectance “ should read “the each tile is a four channel image having red, blue, green and NIR reflectance” in order to comply with the antecedent basis requirement.
Claim 3:
Claim language “a mean square error, mean absolute error and mean absolute precent error are calculated for the each tile “ should read “a mean square error, mean absolute error and mean absolute percent error are calculated for the each tile” in order to comply with the antecedent basis requirement and provide proper spelling for the word “percent”.
Claim 4:
Claim language “only the areas of each tile that are managed ae used in the analysis “ should read “only [[the]] areas of the each tile that are managed are used in the analysis” in order to comply with the antecedent basis requirement and provide proper spelling for the word “are”.
Claim 5:
Claim language “each tile is scaled to bring a value of the pixel in the tile to between 0-2“ should read “the each tile is scaled to bring a value of a [[the]] pixel in the tile to between 0-2” in order to comply with the antecedent basis requirement.
Claim 6:
Claim language “an encoder/decoder analyzes the pixel density for each image “ should read “an encoder/decoder analyzes [[the]] pixel density for each image” in order to comply with the antecedent basis requirement.
Claim 7:
Claim language “the encoder/decoder analyzes shades of each pixel “ should read “the encoder/decoder analyzes shades of the each pixel” in order to comply with the antecedent basis requirement.
Claim 8:
Claim language “erosion and blurring are applied to each tile“ should read “erosion and blurring are applied to the each tile” in order to comply with the antecedent basis requirement.
Claim 9:
Claim language “using the stress levels of each area and the pixel and the
yield density of each pixel, the encoder/decoder calculates the predicted yield of each area of the field based on the image data only “ should read “using the stress levels of each area and the pixel and a [[the]] yield density of the each pixel, the encoder/decoder calculates a [[the]] predicted yield of each area of the field based on the image data only” in order to comply with the antecedent basis requirement.
Claim 10:
Claim language “each area in the image is classified based on the severity levels“ should read “each area in the image is classified based on [[the]] severity levels” in order to comply with the antecedent basis requirement.
Claim 11:
Claim language “retrieving a plurality of images of a field“ should read “retrieving a plurality of images of [[a]] the field”; claim language “determining the yield represented by each pixel in each image based on the at least one agronomic rules and the analysis of each tile” should read “determining the yield represented by each pixel in the each image based on the at least one agronomic rules and the analysis of the each tile” in order to comply with the antecedent basis requirement.
Claim 12:
Claim language “each tile is a four channel image“ should read “the each tile is a four channel image” in order to comply with the antecedent basis requirement.
Claim 13:
Claim language “a mean square error, mean absolute error and mean absolute precent error are calculated for each tile“ should read “a mean square error, mean absolute error and mean absolute percent error are calculated for the each tile” in order to comply with the antecedent basis requirement and provide proper spelling for the word “percent”.
Claim 14:
Claim language “only the areas of each tile that are managed ae used in the analysis” should read “only [[the]] areas of the each tile that are managed are used in the analysis” in order to comply with the antecedent basis requirement and provide proper spelling for the word “are”.
Claim 15:
Claim language “each tile is scaled to bring a value of a pixel in the tile to between 0-2 “ should read “the each tile is scaled to bring a value of a pixel in the tile to between 0-2” in order to comply with the antecedent basis requirement.
Claim 16:
Claim language “an encoder/decoder analyzes the pixel density for each image “ should read “an encoder/decoder analyzes [[the]] pixel density for each image” in order to comply with the antecedent basis requirement.
Claim 17:
Claim language “the encoder/decoder analyzes shades of each pixel“ should read “the encoder/decoder analyzes shades of the each pixel” in order to comply with the antecedent basis requirement.
Claim 18:
Claim language “erosion and blurring are applied to each tile“ should read “erosion and blurring are applied to the each tile” in order to comply with the antecedent basis requirement.
Claim 19:
Claim language “using the stress levels of each area and the pixel and the yield density of each pixel, the encoder/decoder calculates the predicted yield“ should read “using the stress levels of each area and the pixel and a [[the]] yield density of each pixel, the encoder/decoder calculates a [[the]] predicted yield” in order to comply with the antecedent basis requirement.
Appropriate correction is required.
Claim Rejections – 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claims recite an abstract idea as discussed below. This abstract idea is not integrated into a practical application for the reasons discussed below. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception for the reasons discussed below.
Step 1 of the 2019 Guidance requires the examiner to determine whether the claims are directed to one of the statutory categories of invention. Applied to the present application, the claims fall within the statutory categories of a machine (system claims) and a process (method claims). The claim reproduced below falls within a statutory category (machine).
Step 2A of the 2019 Guidance is divided into two Prongs. Prong One requires the examiner to determine whether the claims recite an abstract idea, and further requires that the abstract idea fall into one of three enumerated groupings: mathematical concepts, mental processes, and certain methods of organizing human activity.
Independent Claim 1 is copied below, with the limitations belonging to an abstract idea highlighted in bold; the remaining limitations are “additional elements”.
A yield prediction system including:
an information gathering unit that retrieves a plurality of images of a field over a time period;
an information analysis unit that divides each image into a plurality of tiles;
a pixel analysis unit that gathers at least one agronomic rule to each tile; and
a simulation unit that determines the yield represented by each pixel in each image based on the agronomic rules and the analysis of each tile.
Under Step 2A, Prong One, we consider whether the claim recites a judicial exception (abstract idea). In the above claim, the highlighted portion constitutes an abstract idea because, under the broadest reasonable interpretation in light of the specification, it recites limitations that fall into the abstract idea exceptions. Specifically, under the 2019 Revised Patent Subject Matter Eligibility Guidance, it falls into the groupings of subject matter that, when recited as such in a claim limitation, cover mathematical concepts (mathematical relationships, mathematical formulas or equations, mathematical calculations) and mental processes (concepts performed in the human mind including an observation, evaluation, judgment, and/or opinion).
The step of “divides each image into a plurality of tiles” is treated by the Examiner as belonging to the mathematical concepts grouping.
The step of “determines the yield represented by each pixel in each image based on the agronomic rules and the analysis of each tile” is treated by the Examiner as belonging to a combination of the mental processes and mathematical concepts groupings.
With regards to the mental steps, according to the 2019 PEG: “If a claim, under its broadest reasonable interpretation, covers performance in the mind but for the recitation of generic computer components, then it is still in the mental processes category unless the claim cannot practically be performed in the mind. See Intellectual Ventures I LLC v. Symantec Corp., 838 F.3d 1307, 1318 (Fed. Cir. 2016) (‘‘[W]ith the exception of generic computer implemented steps, there is nothing in the claims themselves that foreclose them from being performed by a human, mentally or with pen and paper.”); Mortg. Grader, Inc. v. First Choice Loan Servs. Inc., 811 F.3d. 1314, 1324 (Fed. Cir. 2016) (holding that computer-implemented method for ‘‘anonymous loan shopping” was an abstract idea because it could be ‘‘performed by humans without a computer”); Versata Dev. Grp. v. SAP Am., Inc., 793 F.3d 1306, 1335 (Fed. Cir. 2015) (‘‘Courts have examined claims that required the use of a computer and still found that the underlying, patent-ineligible invention could be performed via pen and paper or in a person's mind.”).”
Prong 2 of Step 2A of the 2019 Guidance requires the examiner to determine if the claims recite additional elements or a combination of additional elements which integrate the abstract idea into a practical application. This requires additional elements in the claim to apply, rely on, or use the abstract idea in a manner that imposes a meaningful limit on the abstract idea, such that the claim is more than a drafting effort designed to monopolize the abstract idea.
In this step, we evaluate whether the claim recites additional elements that integrate the exception into a practical application of that exception.
The limitations “retrieves a plurality of images of a field over a time period” and “gathers at least one agronomic rule to each tile” are treated as extra-solution activities recited at a high level of generality (e.g., mere data gathering), such that substantially all practical applications of the judicial exception(s) are covered.
The additional elements: “an information gathering unit”, “plurality of images”, “field”, “time period”, “an information analysis unit”, “a pixel analysis unit”, “agronomic rule”, “tile”, and “a simulation unit” add extra-solution activities (i.e., mere data gathering, source/type of data to be manipulated) using elements recited at a high level of generality (see MPEP 2106.05(g)); generally link the use of the judicial exception to a particular technological environment or field of use (see MPEP 2106.05(h)); and add the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea (see MPEP 2106.05(f)).
The preamble of Claim 1: “A yield prediction system including” is a generically recited preamble.
Various considerations are used to determine whether the additional elements are sufficient to integrate the abstract idea into a practical application. In this particular case, the claim does not recite a particular machine applying or being used by the abstract idea. The claim does not effect a real-world transformation or reduction of any particular article to a different state or thing. (Manipulating data from one form to another or obtaining a mathematical answer using input data does not qualify as a transformation in the sense of Prong 2.) The claim does not contain additional elements which describe the functioning of a computer, or which describe a particular technology or technical field, being improved by the use of the abstract idea. (This is understood in the sense of the claimed invention from Diamond v Diehr, in which the claim as a whole recited a complete rubber-curing process including a rubber-molding press, a timer, a temperature sensor adjacent the mold cavity, and the steps of closing and opening the press, in which the recited use of a mathematical calculation served to improve that particular technology by providing a better estimate of the time when curing was complete. Here, the claim does not recite carrying out any comparable particular technological process).
Therefore, the claim is directed to a judicial exception and requires further analysis under Step 2B.
Step 2B of the 2019 Guidance requires the examiner to determine whether the additional elements cause the claim to amount to significantly more than the abstract idea itself. The considerations for this particular claim are essentially the same as the considerations for Prong 2 of Step 2A, and the same analysis leads to the conclusion that the claim does not amount to significantly more than the abstract idea.
Essentially, the above claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception (Step 2B analysis) because the additional elements are well-understood, routine, and conventional in the relevant art, as evidenced by US20200380617 to Murr et al. (hereinafter Murr) and US20250245984 to Melnitchouck et al. (hereinafter Melnitchouck).
Therefore, claim 1 is rejected under 35 U.S.C. 101 as directed to an abstract idea without significantly more.
Similar analysis has been applied to independent Claim 11. The independent claims, therefore, are not patent eligible.
With regard to the dependent claims, Claims 2-10 and 12-20 merely add limitations which further detail the abstract idea, namely further mathematical steps detailing how the data processing algorithm is implemented, i.e., additional limitations corresponding to the mathematical concepts grouping. These limitations do not help to integrate the claims into a practical application or make them significantly more than the abstract idea (the abstract idea is recited in slightly more detail, but not in enough detail to narrow the claims to a particular practical application).
The dependent claims are, therefore, also ineligible.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 2, 4, 5, 11, 12, 14, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over US20230292647A1 to Bainbridge et al. (hereinafter Bainbridge) in view of US20200380617A1 to Murr et al. (hereinafter Murr).
Regarding Claim 1: Bainbridge discloses:
“A yield prediction system including” (para 0052 – “there is provided a crop monitoring system”; para 0101 – “accurate crop yield predictions, crop population and dimension statistics, crop-loss/disease diagnosis, spatially resolved maps of crop attributes and intervention instructions”);
“an information gathering unit that retrieves a plurality of images of a field over a time period” (para 0011 – “The images are high resolution (HR) digital aerial images of the area of interest (AOI) (i.e. field, added by examiner). The images may be image frames/still images. The image data may comprise video data from which the plurality of images are extracted as image frames/still images. The images may be referred to as “field” images generated by a “field” camera. The field camera (i.e. information gathering unit, added by examiner) may be or comprise a high resolution multi-spectral camera.”; para 0040 – “The image data may comprise a plurality or series of images of the crops taken at different points in time, e.g. in the crop growth or cycle”);
“an information analysis unit that divides each image into a plurality of tiles” (para 0044 – “The method may comprise generating the image data using a satellite, or at least one camera mounted to a drone, or other image capture technology (e.g. digital camera or smart phone)… The method may comprise mapping each image to a different geolocation in the AOI, and/or generating an orthomosaic map (i.e. having a plurality of tiles, added by examiner) of the AOI from the plurality images”; para 0045 – “The orthomosaic map may contain a reference tile positioned within the AOI with a predefined size (e.g. one meter square) in order to adjust the images/map for variations in altitude and colour”).
Bainbridge does not specifically disclose:
“a pixel analysis unit that gathers at least one agronomic rule to each tile; and a simulation unit that determines the yield represented by each pixel in each image based on the agronomic rules and the analysis of each tile”.
However, Murr discloses:
“a pixel analysis unit that gathers at least one agronomic rule to each tile” (Figs. 5 and 6, where areas shown are interpreted as tiles; para 0048 – “Image data database 302 includes image data corresponding to an area of land for which crop yield is to be estimated. The image data may be provided via a satellite imaging platform (interpreted as the pixel analysis unit, added by examiner)… The image data comprises pixels, each of which includes respective data values for frequency (color) and intensity (brightness)”; para 0049 – “there may be training data for a pixel associated with an image of an area of a corn crop and different training data for a pixel associated with an area of a wheat crop. Training data may be provided by government entities, e.g., the U.S. Department of Agriculture, or from the farmers of the farms themselves (i.e. providers of the agronomic rules, added by examiner)”; para 0062 – “satellite image 500 includes a crop of red peppers 502 as imaged in RGB, a crop of green beans 504 as imaged in RGB, a crop of corn 506 as imaged in RGB, a crop of broccoli 508 as imaged in RGB, a crop of green beans 510 as imaged in RGB, a crop of broccoli 512 as imaged in RGB, a crop of red peppers 514 as imaged in RGB, a crop of green beans 504 as imaged in RGB, a crop of corn 518 as imaged in RGB, a crop of broccoli 520 as imaged in RGB, a crop of corn 522 as imaged in RGB and a crop of red peppers 524 as imaged in RGB (i.e. different crops, associated with different tiles, present different agronomic rules, added by examiner)”); and
“a simulation unit that determines the yield represented by each pixel in each image based on the agronomic rules and the analysis of each tile” (paras 0027 and 0074 – “Each vegetation index generating component in vegetation index generation component 114 provides a unique vegetation index, each of which will be used to predict the aspects of the crops (interpreted as yield, added by examiner). As such, an array of vegetation indices is generated by vegetation index generation component 114. Each pixel of the image has data associated with a vegetation index as generated by each of the vegetation index generating component in vegetation index generation component 114”; para 0092 – “predictive component 120 (i.e. simulation unit, added by examiner) uses the array vegetation indices by boundary from zonal statistics component 118, the weather data, the historical crop yield data and the demographic/economic/regional data from accessing component 110 and generates a predicted crop yield. In particular, each pixel will have a weighting factor for each of the vegetation indexes in the array of vegetation indices by boundary”; see also paras 0048, 0062 and 0078).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge, as taught by Murr, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
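For illustration only and not as part of the record: a minimal sketch, assuming a four-channel (R, G, B, NIR) image and a hypothetical agronomic rule expressed as a simple NDVI scale factor, of how the claimed tiling and per-pixel yield determination could be implemented.

```python
# Minimal illustrative sketch (assumptions: 4-channel R,G,B,NIR image; the
# "agronomic rule" is modeled as a hypothetical NDVI-to-yield scale factor).
import numpy as np

def split_into_tiles(image, tile_size=64):
    """Divide an H x W x C image into non-overlapping tiles."""
    h, w = image.shape[:2]
    return [image[r:r + tile_size, c:c + tile_size]
            for r in range(0, h, tile_size)
            for c in range(0, w, tile_size)]

def per_pixel_yield(tile, rule_scale):
    """Hypothetical agronomic rule: map a per-pixel vegetation index to a yield estimate."""
    red = tile[..., 0].astype(float)
    nir = tile[..., 3].astype(float)
    ndvi = (nir - red) / (nir + red + 1e-9)   # avoid division by zero
    return rule_scale * ndvi                  # per-pixel yield proxy

image = np.random.randint(0, 255, (256, 256, 4), dtype=np.uint8)  # R, G, B, NIR
tile_yields = [per_pixel_yield(t, rule_scale=10.0) for t in split_into_tiles(image)]
```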
Regarding Claim 2: Bainbridge/Murr combination discloses the yield prediction system of Claim 1.
Bainbridge does not specifically disclose:
“wherein each tile is a four channel image having red, blue, green and NIR reflectance”.
However, Murr discloses:
“wherein each tile is a four channel image having red, blue, green and NIR reflectance” (para 0048 – “Image data database 302 includes image data corresponding to an area of land for which crop yield is to be estimated (interpreted as the tile, added by examiner)… image data may include 4-band image data, which include red, green, blue and near infrared bands (RGB-NIR) of the same area of land for which crop yield is to be estimated”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge, as taught by Murr, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Regarding Claim 4: Bainbridge/Murr combination discloses the yield prediction system of Claim 1.
Bainbridge does not specifically disclose:
“wherein only the areas of each tile that are managed ae used in the analysis”.
However, Murr discloses:
“wherein only the areas of each tile that are managed ae used in the analysis” (para 0048 – “Image data database 302 includes image data corresponding to an area of land for which crop yield is to be estimated (interpreted as the tile, added by examiner)”; para 0049 – “training data for a 4-band image may include specific 4-band pixels data values associated with each type of crop. In other words, there may be training data for a pixel associated with an image of an area of a corn crop and different training data for a pixel associated with an area of a wheat crop.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge, as taught by Murr, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Regarding Claim 5: Bainbridge/Murr combination discloses the yield prediction system of Claim 1.
Bainbridge does not specifically disclose:
“wherein each tile is scaled to bring a value of the pixel in the tile to between 0-2”.
However, Murr discloses:
“wherein each tile is scaled to bring a value of the pixel in the tile to between 0-2” (para 0092 – “predictive component 120 uses the array vegetation indices (interpreted as the analogy to tiles in the image, added by examiner) by boundary from zonal statistics component 118, the weather data, the historical crop yield data and the demographic/economic/regional data from accessing component 110 and generates a predicted crop yield. In particular, each pixel will have a weighting factor (interpreted as the value of the pixel, added by examiner) for each of the vegetation indexes in the array of vegetation indices by boundary”; para 0093 – “The weighting factors for each of the vegetation indices and for the weather data, the historical crop yield data and the demographic/economic/regional data may be set in any known manner”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge, as taught by Murr, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
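For illustration only (an assumed implementation; neither Bainbridge nor Murr describes this step): a per-tile min-max rescaling that brings pixel values into the claimed 0-2 range.

```python
# Illustrative only: min-max rescaling of a tile into the claimed 0-2 range
# (assumed implementation; the cited references do not describe this step).
import numpy as np

def scale_tile(tile, lo=0.0, hi=2.0):
    t = tile.astype(float)
    t_min, t_max = t.min(), t.max()
    if t_max == t_min:                       # constant tile: map everything to lo
        return np.full_like(t, lo)
    return lo + (hi - lo) * (t - t_min) / (t_max - t_min)

tile = np.random.randint(0, 4096, (64, 64, 4))   # e.g. 12-bit reflectance counts
scaled = scale_tile(tile)                         # values now lie in [0, 2]
```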
Regarding Claim 11: Bainbridge discloses:
“A method of predicting a yield of a field including the steps of:” (para 0052 – “there is provided a crop monitoring system”; para 0101 – “accurate crop yield predictions, crop population and dimension statistics, crop-loss/disease diagnosis, spatially resolved maps of crop attributes and intervention instructions”);
“retrieving a plurality of images of a field over a time period via an information gathering unit” (para 0011 – “The images are high resolution (HR) digital aerial images of the area of interest (AOI) (i.e. field, added by examiner). The images may be image frames/still images. The image data may comprise video data from which the plurality of images are extracted as image frames/still images. The images may be referred to as “field” images generated by a “field” camera. The field camera (i.e. information gathering unit, added by examiner) may be or comprise a high resolution multi-spectral camera.”; para 0040 – “The image data may comprise a plurality or series of images of the crops taken at different points in time, e.g. in the crop growth or cycle”);
“dividing each image into a plurality of tiles via an information analysis unit” (para 0044 – “The method may comprise generating the image data using a satellite, or at least one camera mounted to a drone, or other image capture technology (e.g. digital camera or smart phone)… The method may comprise mapping each image to a different geolocation in the AOI, and/or generating an orthomosaic map (i.e. having a plurality of tiles, added by examiner) of the AOI from the plurality images”; para 0045 – “The orthomosaic map may contain a reference tile positioned within the AOI with a predefined size (e.g. one meter square) in order to adjust the images/map for variations in altitude and colour”).
Bainbridge does not specifically disclose:
“gathering at least one agronomic rule to each tile via a pixel analysis unit; and determining the yield represented by each pixel in each image based on the at least one agronomic rules and the analysis of each tile a simulation unit”.
However, Murr discloses:
“gathering at least one agronomic rule to each tile via a pixel analysis unit” (Figs. 5 and 6, where areas shown are interpreted as tiles; para 0048 – “Image data database 302 includes image data corresponding to an area of land for which crop yield is to be estimated. The image data may be provided via a satellite imaging platform (interpreted as the pixel analysis unit, added by examiner)… The image data comprises pixels, each of which includes respective data values for frequency (color) and intensity (brightness)”; para 0049 – “there may be training data for a pixel associated with an image of an area of a corn crop and different training data for a pixel associated with an area of a wheat crop. Training data may be provided by government entities, e.g., the U.S. Department of Agriculture, or from the farmers of the farms themselves (i.e. providers of the agronomic rules, added by examiner)”; para 0062 – “satellite image 500 includes a crop of red peppers 502 as imaged in RGB, a crop of green beans 504 as imaged in RGB, a crop of corn 506 as imaged in RGB, a crop of broccoli 508 as imaged in RGB, a crop of green beans 510 as imaged in RGB, a crop of broccoli 512 as imaged in RGB, a crop of red peppers 514 as imaged in RGB, a crop of green beans 504 as imaged in RGB, a crop of corn 518 as imaged in RGB, a crop of broccoli 520 as imaged in RGB, a crop of corn 522 as imaged in RGB and a crop of red peppers 524 as imaged in RGB (i.e. different crops, associated with different tiles, present different agronomic rules, added by examiner)”); and
“determining the yield represented by each pixel in each image based on the at least one agronomic rules and the analysis of each tile a simulation unit” (paras 0027 and 0074 – “Each vegetation index generating component in vegetation index generation component 114 provides a unique vegetation index, each of which will be used to predict the aspects of the crops (interpreted as yield, added by examiner). As such, an array of vegetation indices is generated by vegetation index generation component 114. Each pixel of the image has data associated with a vegetation index as generated by each of the vegetation index generating component in vegetation index generation component 114”; para 0092 – “predictive component 120 (i.e. simulation unit, added by examiner) uses the array vegetation indices by boundary from zonal statistics component 118, the weather data, the historical crop yield data and the demographic/economic/regional data from accessing component 110 and generates a predicted crop yield. In particular, each pixel will have a weighting factor for each of the vegetation indexes in the array of vegetation indices by boundary”; see also paras 0048, 0062 and 0078).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge, as taught by Murr, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Regarding Claim 12: Bainbridge/Murr combination discloses the method of Claim 11.
Bainbridge does not specifically disclose:
“wherein each tile is a four channel image having red, blue, green and NIR reflectance”.
However, Murr discloses:
“wherein each tile is a four channel image having red, blue, green and NIR reflectance” (para 0048 – “Image data database 302 includes image data corresponding to an area of land for which crop yield is to be estimated (interpreted as the tile, added by examiner)… image data may include 4-band image data, which include red, green, blue and near infrared bands (RGB-NIR) of the same area of land for which crop yield is to be estimated”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge, as taught by Murr, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Regarding Claim 14: Bainbridge/Murr combination discloses the method of Claim 11.
Bainbridge does not specifically disclose:
“wherein only the areas of each tile that are managed ae used in the analysis”.
However, Murr discloses:
“wherein only the areas of each tile that are managed ae used in the analysis” (para 0048 – “Image data database 302 includes image data corresponding to an area of land for which crop yield is to be estimated (interpreted as the tile, added by examiner)”; para 0049 – “training data for a 4-band image may include specific 4-band pixels data values associated with each type of crop. In other words, there may be training data for a pixel associated with an image of an area of a corn crop and different training data for a pixel associated with an area of a wheat crop.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge, as taught by Murr, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Regarding Claim 15: Bainbridge/Murr combination discloses the method of Claim 11.
Bainbridge does not specifically disclose:
“wherein each tile is scaled to bring a value of the pixel in the tile to between 0-2”.
However, Murr discloses:
“wherein each tile is scaled to bring a value of the pixel in the tile to between 0-2” (para 0092 – “predictive component 120 uses the array vegetation indices (interpreted as the analogy to tiles in the image, added by examiner) by boundary from zonal statistics component 118, the weather data, the historical crop yield data and the demographic/economic/regional data from accessing component 110 and generates a predicted crop yield. In particular, each pixel will have a weighting factor (interpreted as the value of the pixel, added by examiner) for each of the vegetation indexes in the array of vegetation indices by boundary”; para 0093 – “The weighting factors for each of the vegetation indices and for the weather data, the historical crop yield data and the demographic/economic/regional data may be set in any known manner”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge, as taught by Murr, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Claims 3 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Bainbridge in view of Murr and in further view of US20200342226A1 to Bengtson et al. (hereinafter Bengtson).
Regarding Claim 3: Bainbridge/Murr combination discloses the yield prediction system of Claim 1.
Bainbridge does not specifically disclose:
“wherein a mean square error, mean absolute error and mean absolute precent error are calculated for each tile”.
However, Bengtson discloses:
“wherein a mean square error, mean absolute error and mean absolute precent error are calculated for each tile” (para 0059 - ”Imagery data 111 may consist of an image or photograph taken from a remote sensing platform (airplane, satellite, or drone), or imagery as a raster data set, each raster being comprised of pixels. Each pixel has a specific pixel value (or values) that represents a ground characteristic”; para 0122 – “After the selected features are produced for the models, a K-fold cross-validation is performed to select the models for each crop. The accuracy of the models is evaluated using the Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE) and Median Absolute Percentage Error (MedianAPE). The best-tested models are used for training the crop-specific models.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge/Murr combination, as taught by Bengtson, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
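For background only (standard definitions, not drawn from Bengtson), the error metrics recited in Claims 3 and 13 are conventionally computed over n observed yields y_i and predicted yields ŷ_i as follows.

```latex
\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - \hat{y}_i\bigr)^2,\qquad
\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\bigl|y_i - \hat{y}_i\bigr|,\qquad
\mathrm{MAPE} = \frac{100\%}{n}\sum_{i=1}^{n}\left|\frac{y_i - \hat{y}_i}{y_i}\right|
```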
Regarding Claim 13: Bainbridge/Murr combination discloses the method of Claim 11.
Bainbridge does not specifically disclose:
“wherein a mean square error, mean absolute error and mean absolute precent error are calculated for each tile”.
However, Bengtson discloses:
“wherein a mean square error, mean absolute error and mean absolute precent error are calculated for each tile” (para 0059 - ”Imagery data 111 may consist of an image or photograph taken from a remote sensing platform (airplane, satellite, or drone), or imagery as a raster data set, each raster being comprised of pixels. Each pixel has a specific pixel value (or values) that represents a ground characteristic”; para 0122 – “After the selected features are produced for the models, a K-fold cross-validation is performed to select the models for each crop. The accuracy of the models is evaluated using the Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE) and Median Absolute Percentage Error (MedianAPE). The best-tested models are used for training the crop-specific models.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge/Murr combination, as taught by Bengtson, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Claims 6 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Bainbridge in view of Murr and in further view of US20230108422 to Brauer et al. (hereinafter Brauer).
Regarding Claim 6: Bainbridge/Murr combination discloses the yield prediction system of Claim 5.
Bainbridge does not specifically disclose:
“wherein an encoder/decoder analyzes the pixel density for each image”.
However, Brauer discloses:
“wherein an encoder/decoder analyzes the pixel density for each image” (para 0061 – “FIG. 2B illustrates an example architecture 250 that may be used in implementing (e.g., via the computing device 102, etc.) the denoising diffusion model described herein. As shown, the architecture 250 includes an encoder 252 and a decoder 254. The encoder is configured, for example, to condition input satellite image data (i.e. analyze, added by examiner) (e.g., a sat_image, etc.) as generally described herein. And, the decoder 254 then is configured to generate the defined spatial resolution images based on the input satellite image data. More particularly, the encoder 252 is configured to receive the satellite image data and generate (or create) a representation (or representations) in semantic latent space 256 that is used by the decoder 254 to generate a corresponding defined spatial resolution image (or images’; para 0081 – “The crop characteristics may include, for example, yield, harvest date, harvest moisture, etc. In various embodiments, the computing device 102 relies on the one or more environmental metrics, temporally dense NDVI (interpreted as the pixel density in the image, added by examiner) and/or image values from the generated images”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge/Murr combination, as taught by Brauer, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
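For background only (a standard definition, not specific to Brauer), the NDVI that the Examiner reads on the claimed pixel density is computed per pixel from red and near-infrared reflectance.

```latex
\mathrm{NDVI} = \frac{\rho_{\mathrm{NIR}} - \rho_{\mathrm{Red}}}{\rho_{\mathrm{NIR}} + \rho_{\mathrm{Red}}}
```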
Regarding Claim 16: Bainbridge/Murr combination discloses the method of Claim 15.
Bainbridge does not specifically disclose:
“wherein an encoder/decoder analyzes the pixel density for each image”.
However, Brauer discloses:
“wherein an encoder/decoder analyzes the pixel density for each image” (para 0061 – “FIG. 2B illustrates an example architecture 250 that may be used in implementing (e.g., via the computing device 102, etc.) the denoising diffusion model described herein. As shown, the architecture 250 includes an encoder 252 and a decoder 254. The encoder is configured, for example, to condition input satellite image data (i.e. analyze, added by examiner) (e.g., a sat_image, etc.) as generally described herein. And, the decoder 254 then is configured to generate the defined spatial resolution images based on the input satellite image data. More particularly, the encoder 252 is configured to receive the satellite image data and generate (or create) a representation (or representations) in semantic latent space 256 that is used by the decoder 254 to generate a corresponding defined spatial resolution image (or images’; para 0081 – “The crop characteristics may include, for example, yield, harvest date, harvest moisture, etc. In various embodiments, the computing device 102 relies on the one or more environmental metrics, temporally dense NDVI (interpreted as the pixel density in the image, added by examiner) and/or image values from the generated images”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge/Murr combination, as taught by Brauer, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Claims 7-10 and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Bainbridge in view of Murr, in further view of Brauer, in further view of US20200226375 to Albrecht et al. (hereinafter Albrecht), and in further view of US20220067614 to Guan et al. (hereinafter Guan).
Regarding Claim 7: Bainbridge/Murr/Brauer combination discloses the yield prediction system of Claim 6.
Bainbridge does not specifically disclose:
“wherein the encoder/decoder analyzes shades of each pixel to determine a stress level of all areas of the field ranging from no stress to high stress”.
However, Albrecht discloses:
“wherein the encoder/decoder analyzes shades of each pixel” (para 0048 – “the satellite data 406 includes pixel values for each pixel within each satellite image. For example, pixels can exhibit different values in resolution, such as brightness, contrast and/or color. In color images, separate colors (e.g., red, green and blue components) are specified for each pixel. For grayscale images, the pixel value can be a single number representative of the brightness of the pixel. As an example, a pixel value of zero can be black and a pixel value of 255 can be white, with values in between forming different shades of gray. The pixel analyzer 418 can detect pixel values for each pixel”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge/Murr/Brauer combination, as taught by Albrecht, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Bainbridge/Murr/Brauer combination does not specifically disclose:
“to determine a stress level of all areas of the field ranging from no stress to high stress”.
However, Guan discloses:
“to determine a stress level of all areas of the field ranging from no stress to high stress” (para 0036 – “the key satellite observations (multispectral surface reflectance as well as derived indicators such as vegetation indices) provide an approximation of aboveground biomass, crop stress, or other crop-condition-related indicators; while environmental variables (e.g., weather, soil, and/or satellite-based stress terms) determines processes that are not captured by the satellite vegetation data, for example, the reproductive process, some biotic stresses, and/or local soil conditions”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge/Murr/Brauer/Albrecht combination, as taught by Guan, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
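For illustration only (hypothetical thresholds, not drawn from Albrecht or Guan): a sketch of binning grayscale pixel shades (0-255, per Albrecht's para 0048) into stress classes ranging from high stress to no stress.

```python
# Illustrative only: hypothetical mapping of pixel shades to stress classes
# (the thresholds are invented for the example; neither reference specifies them).
import numpy as np

def stress_classes(gray_tile):
    """0 = high stress, 3 = no stress (brighter pixels taken as healthier vegetation)."""
    thresholds = np.array([64, 128, 192])     # hypothetical shade cut-offs
    return np.digitize(gray_tile, thresholds)

gray = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
classes = stress_classes(gray)                # values in {0, 1, 2, 3}
```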
Regarding Claim 8: Bainbridge/Murr/Brauer/Albrecht/Guan combination discloses the yield prediction system of Claim 7.
Bainbridge does not specifically disclose:
“wherein erosion and blurring are applied to each tile to remove noise from the image by the pixel analysis unit”.
However, Guan discloses:
“wherein erosion and blurring are applied to each tile to remove noise from the image by the pixel analysis unit” (para 0042 – “ Methods of dimensionality reduction and feature composition, including principal component analysis and autoencoder (interpreted as the analogues of erosion and blurring, added by examiner), can be applied to denoise the data and to simplify the model by compressing input features. ”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge/Murr/Brauer/Albrecht/Guan combination, as taught by Guan, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
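For illustration only: classical erosion followed by Gaussian blurring, as recited in Claim 8, might look like the OpenCV sketch below; the cited Guan passage instead relies on PCA/autoencoder denoising, which the Examiner treats as analogous.

```python
# Illustrative only: morphological erosion and Gaussian blurring applied to a tile
# to suppress pixel-level noise (assumed implementation using OpenCV).
import cv2
import numpy as np

tile = np.random.randint(0, 255, (64, 64), dtype=np.uint8)   # single-band tile
kernel = np.ones((3, 3), np.uint8)

eroded = cv2.erode(tile, kernel, iterations=1)               # remove bright specks
denoised = cv2.GaussianBlur(eroded, (5, 5), 0)               # smooth remaining speckle
```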
Regarding Claim 9: Bainbridge/Murr/Brauer/Albrecht/Guan combination discloses the yield prediction system of Claim 8.
Bainbridge does not specifically disclose:
“wherein using the stress levels of each area and the pixel and the yield density of each pixel, the encoder/decoder calculates the predicted yield of each area of the field based on the image data only”.
However, Guan discloses:
“wherein using the stress levels of each area and the pixel and the yield density of each pixel” (para 0036 – “the key satellite observations (multispectral surface reflectance as well as derived indicators such as vegetation indices) provide an approximation of aboveground biomass, crop stress, or other crop-condition-related indicators; while environmental variables (e.g., weather, soil, and/or satellite-based stress terms) determines processes that are not captured by the satellite vegetation data, for example, the reproductive process, some biotic stresses, and/or local soil conditions”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge/Murr/Brauer/Albrecht/Guan combination, as taught by Guan, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Bainbridge/Murr/Brauer/Albrecht/Guan combination does not specifically disclose:
“the encoder/decoder calculates the predicted yield of each area of the field based on the image data only”.
However, Brauer discloses:
“the encoder/decoder calculates the predicted yield of each area of the field based on the image data only” (para 0061 – “FIG. 2B illustrates an example architecture 250 that may be used in implementing (e.g., via the computing device 102, etc.) the denoising diffusion model described herein. As shown, the architecture 250 includes an encoder 252 and a decoder 254. The encoder is configured, for example, to condition input satellite image data (i.e. analyze, added by examiner) (e.g., a sat_image, etc.) as generally described herein. And, the decoder 254 then is configured to generate the defined spatial resolution images based on the input satellite image data”; para 0068 – “the computing device 102 is configured to determine one or more crop characteristics, at the plot level, for example, from the index values, the environmental metrics, and/or the aggregate value thereof, for the crop(s) identified in (or represented by) the input sat_images. The crop characteristics, for example, may include a yield prediction for the crop(s)”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the yield prediction system, disclosed by Bainbridge/Murr/Brauer/Albrecht/Guan combination, as taught by Brauer, in order to obtain the detailed information regarding the state of the crop and to predict the yield with better accuracy.
Regarding Claim 10: Bainbridge/Murr/Brauer/Albrecht/Guan combination discloses the yield prediction system of Claim 9.
Bainbridge further discloses:
“wherein each area in the image is classified based on the severity levels” (para 0029 – “Determining the one or more crop features attributes may comprise determining, or detecting and classifying, one or more secondary crop feature attributes for each identified crop feature … a dataset of crop images to determine the one or more secondary crop feature attributes based, at least in part, on one or more image features extracted from each respective image and/or based at least in part on the determined primary crop features attributes. The one or more secondary crop feature attributes may include one or more of: diseased and disease type, pest-ridden and pest type, weed-ridden and weed type, healthy, and unhealthy (interpreted as the severity level, added by examiner). The primary crop features attributes may be used and be helpful for detecting secondary crop feature attributes. For example, black grass (a secondary attribute) is clearly visible once a kernel can be detected on a wheat plant. So detecting the kernel (primary attribute) helps to identify (i.e. ) the secondary attribute