DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This action is in reply to the application filed on 9/16/2024.
No claims have been amended.
No claims have been added.
No claims have been cancelled.
Claims 1-20 are currently pending and have been examined.
Information Disclosure Statement
The information disclosure statement(s) (IDS(s)) submitted on 9/16/2024 and 2/25/2025 have been received and considered.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Claim Objections
Claims 14 and 15 are objected to because of the following informalities: “processor configured” appears to be missing the word “is” to read “processor is configured.” Appropriate correction is required.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
“Working unit configured to perform […] collecting, processing, and forwarding harvested material” in claims 1, 2, 4, 5, 7-9, and 18-20.
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function, here using the term “unit;”
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”, here using the term “configured to”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function, here reciting only “collecting, processing, and forwarding,” without reciting structure, material, or acts for performing the collecting, processing, or forwarding.
Therefore, the element “working unit configured to perform […] collecting, processing, and forwarding harvested material” will be interpreted under 112(f) according to the specification ¶ 0014 as “Such a working unit may be any one, any combination, or all of an attachment attachable to the work machine, an inclined conveyor of the work machine, a threshing device of the work machine, a separating device of the work machine, a cleaning device of the work machine, a grain elevator of the work machine, a chaff auger of the work machine, a grain tank emptying device of the work machine, a post-accelerator of the work machine, a grain auger or a grain tank auger.”
Particularly regarding Claim 18, the adjustment of the working unit “so that the adjustment results in a predetermined harvested material flow being adjusted” is not modified by sufficient structure, material, or acts for performing the claimed function. Within the specification, ¶ 0041 describes “modify operation of the working unit(s) in order to change the proportion of the available harvested material to be closer to the preset proportion of the harvested material flow,” which similarly does not provide sufficient structure, material, or acts for performing the claimed function.
Particularly regarding Claim 19, the adjustment of the working unit “so that the adjustment results in reduction in a loss of the harvested material” is not modified by sufficient structure, material, or acts for performing the claimed function. Within the specification, ¶ 0042 describes “a throughput of a harvested material flow through the self-propelled agricultural work machine is controlled with reduced or minimized harvested material losses using the adjustment of at least one working unit,” which similarly does not provide sufficient structure, material, or acts for performing the claimed function.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112(b)
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 3, 4, 6, 7, 18, and 19 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
The terms “closer area,” “remote area,” and “middle area” in claims 3, 4, 6, and 7 are relative terms which render the claims indefinite. The terms “closer,” “remote,” and “middle” are not defined by the claims, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The terms are defined only in relation to one another, without any objective scale for establishing their degree.
The terms “more precise” and “less precise” in claims 4 and 7 are relative terms which render the claims indefinite. The terms “more precise” and “less precise” are not defined by the claims, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The terms are defined only in relation to one another, without any objective scale for establishing their degree.
The terms “accurate adjustment” and “precise adjustment” in claim 9 are relative terms which render the claim indefinite. The terms “accurate adjustment” and “precise adjustment” are not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The terms are defined only in relation to one another, without any objective scale for establishing their degree.
The terms “immediate” adjustment and “short-term” adjustment in claim 9 are relative terms which render the claim indefinite. The terms “immediate” adjustment and “short-term” adjustment are not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The terms are defined only in relation to one another, without any objective scale for establishing their degree.
Claims 18 and 19 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA ), second paragraph, as being incomplete for omitting essential elements, such omission amounting to a gap between the elements. See MPEP § 2172.01. The omitted elements are:
Claim 18 recites “adjustment results in a predetermined harvested material flow being adjusted,” which recites a desired outcome of adjustment of a material flow without reciting a mechanism for achieving the desired result of adjustment, and furthermore is redundant in reciting the outcome of “adjustment results in […] adjustment.”
Claim 19 recites “adjustment results in reduction of a loss of the harvested material,” which recites a desired outcome of the adjustment without reciting a mechanism for achieving the desired result of a reduction in loss of harvested material.
The claims are generally narrative and indefinite, failing to conform with current U.S. practice. They appear to be a literal translation into English from a foreign document and are replete with grammatical and idiomatic errors. The instances of 112(f) invocation and the specific indefiniteness issues listed above suggest that the claims may have been machine translated, and the Examiner requests that the claims be reviewed for proper intent.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1, 2, 5, 8, 10, 16, 17, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Anderson et al. (EP 3991539, hereinafter “Anderson”).
Regarding Claim 1, Anderson teaches:
A self-propelled agricultural work machine comprising: one or more working units configured to perform one or more of collecting, processing and forwarding harvested material of a crop of an agricultural field; (Anderson ¶ 101 lines 56-1 “FIG. 2 is a side view of an agricultural vehicle 220 (e.g., harvester, combine […]) that incorporates or comprises a system 11 for estimating yield of standing crop in a field,”)
[Image: media_image1.png (greyscale), reproducing Anderson FIG. 2]
at least one drive motor configured to drive the one or more working units and to propel the self-propelled agricultural working machine at a travel speed along the agricultural field; (Anderson ¶ 0155 lines 19-24 “The agricultural harvesting vehicle 1000 is supported on two front wheels 1030 that are driven by an engine (not shown) and two rear wheels 1050 that are steerable by a steering actuator. As the agricultural harvesting vehicle 1000 (e.g., combine 1020) travels through the field harvesting rows of crop,”)
at least one camera device (Anderson ¶ 0013 “In one configuration, the imaging device 110 comprises a stereo vision imaging device 110 or digital stereo vision camera with image data processing. Further, the imaging device 110 may operate in one or more frequency spectrums or bandwidths, such as one or more of the following: humanly visible light, near-infrared light, infra-red light and ultraviolet light. The imaging device 110 is coupled to one or more data ports 112. In an alternate embodiment, the imaging device 110 may comprise a monocular imaging device 110, a video imaging device 110,”)
configured to capture one or more images of the crop of the agricultural field in an environment of the self-propelled agricultural work machine; (Anderson ¶ 0020 “In step S102, an imaging device 110 (e.g., imaging sensors) is configured to obtain or to collect image data associated with one or more target plants (210a, 210b, 210c and 210d in FIG. 2) in one or more rows of the standing crop in the field,”)
at least one processor configured to: automatically determine a proportion of the harvested material present in the crop using the one or more images of the camera device; (Anderson ¶ 0025 lines 3-10 “In one embodiment in step S112, a data processor 120 or yield estimator 119 is configured to provide the estimated or detected size of the harvestable plant component (e.g., grain bearing portion or ear of a crop plant) on a user interface 118 (e.g., an electronic display 142) for the one or more target plants as an indicator of yield of the one or more plants or standing crop in the field,”)
and automatically adjust, based on the proportion of the harvested material present in the crop, one or both of: (Anderson ¶ 0030 “Further, in an alternate embodiment, the electronic data processor 120 may generate […] command or control signal based on the estimated yield metric meeting a threshold yield metric or not meeting a threshold yield metric. In one example, if the electronic data processor 120 generates […] command or control signal (e.g., for output or display by the user interface 118) based on the estimated yield metric meets or exceeds a threshold yield metric, locations or positions of the plants contributing to the observed yield metric may be used to generate a geofenced boundary or region that has a high priority for harvesting by a harvester or a combine. Conversely, if the electronic data processor 120 generates […] command or control signal based on the estimated yield metric does not meet or exceed a threshold yield metric, locations or positions of the plants contributing to the observed yield metric may be used to generate a geofenced boundary or region (e.g., for output or display by the user interface 118 or for input to a vehicle guidance system that directs the steering, heading or yaw of the vehicle during harvesting operations) that has a low priority for harvesting by a harvester or a combine, or that is not harvested,” teaching control signals based on yield values)
at least one working unit of the one or more working units; or the travel speed of the self-propelled agricultural work machine. (Anderson ¶ 0139 “In step S1300, the electronic data processor 120 controls the harvester or combine based on the revised grain yield estimate, such as sending a data message to the user interface 118 to provide the end user (or harvester operator) with an alert and an option to discontinue harvesting of the field or a portion of the field where a yield metric falls below a defined threshold. Further, the end user or harvester operator may have the option of designating the portion of the field that was abandoned for harvesting as grazing land for cattle, cows, sheep, chickens, goats or other domestic farm animals,” teaching the adjustment of the working units by discontinuing harvesting based on the yield estimate)
Regarding Claim 2, Anderson teaches the elements of Claim 1 as described above and further teaches:
wherein the at least one processor is configured to determine in a plurality of different areas of the one or more images the proportion of harvested material in the crop in each respective area of the plurality of different areas; (Anderson ¶ 0028 lines 41-44 “the data processor 120 or yield estimator 119 may provide […] a sectional yield for one or more rows of harvester or combine; or even an aggregate yield,”)
and wherein the at least one processor is configured to automatically adjust, based on the proportion of the harvested material present in the crop in at least one respective area of the plurality of different areas, (Anderson ¶ 0030 “Further, in an alternate embodiment, the electronic data processor 120 may generate […] command or control signal based on the estimated yield metric meeting a threshold yield metric or not meeting a threshold yield metric. In one example, if the electronic data processor 120 generates […] command or control signal (e.g., for output or display by the user interface 118) based on the estimated yield metric meets or exceeds a threshold yield metric, locations or positions of the plants contributing to the observed yield metric may be used to generate a geofenced boundary or region that has a high priority for harvesting by a harvester or a combine. Conversely, if the electronic data processor 120 generates […] command or control signal based on the estimated yield metric does not meet or exceed a threshold yield metric, locations or positions of the plants contributing to the observed yield metric may be used to generate a geofenced boundary or region (e.g., for output or display by the user interface 118 or for input to a vehicle guidance system that directs the steering, heading or yaw of the vehicle during harvesting operations) that has a low priority for harvesting by a harvester or a combine, or that is not harvested,” teaching a control signal based on prioritization of areas of high or low yield)
by commanding one or both of: at least one working unit to adjust the at least one of the one or more working units; or the at least one drive motor to adjust the travel speed of the self-propelled agricultural work machine. (Anderson ¶ 0139 as above)
Regarding Claim 5, Anderson teaches the elements of Claim 1 as described above and further teaches:
wherein the at least one processor is configured to determine in a plurality of different areas of the one or more images the proportion of harvested material in the crop in each respective area of the plurality of different areas; (Anderson ¶ 0028 lines 41-44 “the data processor 120 or yield estimator 119 may provide […] a sectional yield for one or more rows of harvester or combine; or even an aggregate yield,” and ¶ 0029 lines 5-14 “The per-plant yield or another yield metric may be associated with a corresponding position or location in the field, such as a relative position or an absolute position (e.g., three-dimensional geographic coordinates from the location the determining receiver 130). For example, the data processor 120 or yield estimator 119 may generate or provide a yield map versus three-dimensional position (GPS coordinates or a georeferenced yield map) that provides individual plant yield data or that aggregates yield data from multiple plants,” a yield map teaching the determination of yield over a plurality of different areas)
and wherein the at least one processor is configured to automatically adjust, based on the proportion of the harvested material present in the crop in each respective area of the plurality of different areas, (Anderson ¶ 0030 “Further, in an alternate embodiment, the electronic data processor 120 may generate […] command or control signal based on the estimated yield metric meeting a threshold yield metric or not meeting a threshold yield metric. In one example, if the electronic data processor 120 generates […] command or control signal (e.g., for output or display by the user interface 118) based on the estimated yield metric meets or exceeds a threshold yield metric, locations or positions of the plants contributing to the observed yield metric may be used to generate a geofenced boundary or region that has a high priority for harvesting by a harvester or a combine. Conversely, if the electronic data processor 120 generates […] command or control signal based on the estimated yield metric does not meet or exceed a threshold yield metric, locations or positions of the plants contributing to the observed yield metric may be used to generate a geofenced boundary or region (e.g., for output or display by the user interface 118 or for input to a vehicle guidance system that directs the steering, heading or yaw of the vehicle during harvesting operations) that has a low priority for harvesting by a harvester or a combine, or that is not harvested,” teaching a control signal based on prioritization of areas of high or low yield)
by commanding one or both of: the at least one working unit; or the at least one drive motor to adjust the travel speed of the self-propelled agricultural work machine. (Anderson ¶ 0139 as above)
Regarding Claim 8, Anderson teaches the elements of Claim 1 as described above and further teaches:
further comprising at least one sensor device configured to detect an operating state of the self-propelled agricultural work machine; (Anderson ¶ 0016 “In still other alternate embodiments, the system 11 may comprise one or more supplemental sensors 131, such as a yield monitor 133 (e.g., secondary yield sensor), a moisture sensor 135, or both. For example, the yield monitor 133 may detect aggregate yield of harvester or combine for a set of rows or row units (e.g., all row units). Further, a moisture sensor 135 supports the yield monitor 133 such that the yield of harvested crop can be estimated, corrected or augmented by moisture sensor data associated with soil moisture, crop leaf moisture, or harvested plant components (e.g., ears of maize, corn, pods of legumes, or bolls of cotton),” emphasis added, teaching that the operating state of the harvested material is detected and used to augment or correct the detected yield value, which feeds into the harvester control as previously established)
and wherein the at least one processor is configured to adjust one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on both of the operating state detected by the at least one sensor device (Anderson ¶ 0030 as above, using the augmented or corrected yield values using the secondary yield and moisture sensors)
and the proportion of the harvested material that is available. (Anderson ¶ 0139 as above)
Regarding Claim 10, Anderson teaches the elements of Claim 1 as described above and further teaches:
wherein complete plants of the crop each have a peduncle and an infructescence or inflorescence; wherein incomplete plants of the crop have at most one peduncle; wherein the at least one processor is configured to: detect one or more image areas indicative of the inflorescences of the crop within the one or more images (Anderson ¶ 0021 “In step S104, an electronic data processor 120 is configured to estimate a spatial region of plant pixels of one or more target plants in the obtained image data for a harvestable plant component (212a, 212b, 212c and 212d in FIG. 2) and its associated component pixels of the harvestable plant component. For example, the harvestable plant component may comprise a grain bearing portion (GBP) of a plant, an ear, a grain head for a grain, corn, maize, wheat, rye, oats, rice, sorghum, cereal or quasi-cereal plant,” the GBPs being equivalent to the claimed inflorescences)
and to determine the proportion of the harvested material present based on the one or more image areas indicative of the inflorescences; (Anderson ¶ 0025 lines 3-10 “In one embodiment in step S112, a data processor 120 or yield estimator 119 is configured to provide the estimated or detected size of the harvestable plant component (e.g., grain bearing portion or ear of a crop plant) on a user interface 118 (e.g., an electronic display 142) for the one or more target plants as an indicator of yield of the one or more plants or standing crop in the field,”)
or detect one or more image areas indicative of the peduncles of the crop within the one or more images and to determine a non-existent proportion of the harvested material based on the one or more image areas indicative of the peduncles. (Anderson ¶ 0033 lines 6-11 “the electronic data processor 120 may be capable of identifying additional background pixels, such as plant pixels from adjacent crop plants, as opposed to weeds, where such plant pixels comprise of pixels that represent crop leaves, stalk, or ears from an adjacent or next plant in a row or adjoining row,” teaching that the system distinguishes between the harvestable material (inflorescence) and the non-harvestable material such as stalks (peduncles))
Regarding Claim 16, Anderson teaches the elements of Claim 1 as described above and further teaches:
wherein the at least one processor is configured to use one or more of harvested material information, (Anderson ¶ 0016 “In still other alternate embodiments, the system 11 may comprise one or more supplemental sensors 131, such as a yield monitor 133 (e.g., secondary yield sensor), a moisture sensor 135, or both. For example, the yield monitor 133 may detect aggregate yield of harvester or combine for a set of rows or row units (e.g., all row units). Further, a moisture sensor 135 supports the yield monitor 133 such that the yield of harvested crop can be estimated, corrected or augmented by moisture sensor data associated with soil moisture, crop leaf moisture, or harvested plant components (e.g., ears of maize, corn, pods of legumes, or bolls of cotton),” emphasis added, teaching that the operating state of the harvested material is detected and used to augment or correct the detected yield value, which feeds into the harvester control as previously established)
weather information, (Anderson ¶ 0086 line 39 - ¶ 0087 line 10 “Under a first technique, the electronic data processor 120 or yield estimator 119 is configured to estimate a yield reduction to the yield metric (e.g., a per-plant yield or plant row yield, sectional yield, or aggregate yield) by color differentiation of exposed grain seeds (e.g., kernels) of the target ear; wherein the providing the yield metric comprises a yield-adjusted yield metric (e.g., a per-plant yield or a row yield, sectional yield, or aggregate yield) of the one or more plants or the standing crop in the field. […] Under second technique, the electronic data processor 120 yield estimator 119 is configured to determine a potential cause of the yield reduction to the yield metric based on color differentiation of the exposed grain seeds (e.g., exposed kernels) at an outer end (e.g., top end) opposite the base end of the ear of target corn. For example, the yield reduction of an estimated yield metric may be caused, without limitation, by any of the following factors: […] and environmental stress, such as climate; severe weather, such as hail, high winds, frost, freezing or low temperatures; drought, excess of water or rain; exceptional heat,” teaching yield estimate adjustment based on weather)
location information, (Anderson ¶ 0098 “Under a ninth technique, which can complement or augment the eighth technique, the electronic data processor 120 or yield estimator 119 is configured to estimate a second yield reduction component to the aggregate yield derived from fungus, mold or plant disease data for the growing season in the same geographic region or county as the field, wherein the providing of the aggregate yield comprises a yield-reduced aggregate yield of the one or more plants or the standing crop in the field derived from or based on the first yield reduction component and the second yield reduction component,” teaching yield estimate adjustment based on location)
satellite-based information (Anderson ¶ 0028 lines 41-44 “the data processor 120 or yield estimator 119 may provide […] a sectional yield for one or more rows of harvester or combine; or even an aggregate yield,” and ¶ 0029 lines 5-14 “The per-plant yield or another yield metric may be associated with a corresponding position or location in the field, such as a relative position or an absolute position (e.g., three-dimensional geographic coordinates from the location the determining receiver 130). For example, the data processor 120 or yield estimator 119 may generate or provide a yield map versus three-dimensional position (GPS coordinates or a georeferenced yield map) that provides individual plant yield data or that aggregates yield data from multiple plants,” a yield map teaching the determination of yield over a plurality of different areas, according to location and GPS (satellite-based information) coordinates)
or field information in determining the proportion of the harvested material. (Anderson ¶ 0088 lines 13-22 “Under a third technique, the electronic data processor 120 or yield estimator 119 is configured: (a)) to sampling multiple target ears throughout the field to determine the potential cause of the yield reduction to the yield metric and (b) to estimate or to facilitate estimation or recording of a geographic position in two-dimensional or three dimensional coordinates of each one of the sampled target ears throughout the field to determine an aggregate yield reduction associated with the yield-reduced aggregate yield,” teaching yield estimation reduction based on observed data about the aggregate field.)
Regarding Claim 17, Anderson teaches the elements of Claim 1 as described above and further teaches:
wherein the at least one processor is configured to use each of harvested material information, (Anderson ¶ 0016 “In still other alternate embodiments, the system 11 may comprise one or more supplemental sensors 131, such as a yield monitor 133 (e.g., secondary yield sensor), a moisture sensor 135, or both. For example, the yield monitor 133 may detect aggregate yield of harvester or combine for a set of rows or row units (e.g., all row units). Further, a moisture sensor 135 supports the yield monitor 133 such that the yield of harvested crop can be estimated, corrected or augmented by moisture sensor data associated with soil moisture, crop leaf moisture, or harvested plant components (e.g., ears of maize, corn, pods of legumes, or bolls of cotton),” emphasis added, teaching that the operating state of the harvested material is detected and used to augment or correct the detected yield value, which feeds into the harvester control as previously established)
weather information, (Anderson ¶ 0086 line 39 - ¶ 0087 line 10 “Under a first technique, the electronic data processor 120 or yield estimator 119 is configured to estimate a yield reduction to the yield metric (e.g., a per-plant yield or plant row yield, sectional yield, or aggregate yield) by color differentiation of exposed grain seeds (e.g., kernels) of the target ear; wherein the providing the yield metric comprises a yield-adjusted yield metric (e.g., a per-plant yield or a row yield, sectional yield, or aggregate yield) of the one or more plants or the standing crop in the field. […] Under second technique, the electronic data processor 120 yield estimator 119 is configured to determine a potential cause of the yield reduction to the yield metric based on color differentiation of the exposed grain seeds (e.g., exposed kernels) at an outer end (e.g., top end) opposite the base end of the ear of target corn. For example, the yield reduction of an estimated yield metric may be caused, without limitation, by any of the following factors: […] and environmental stress, such as climate; severe weather, such as hail, high winds, frost, freezing or low temperatures; drought, excess of water or rain; exceptional heat,” teaching yield estimate adjustment based on weather)
location information, (Anderson ¶ 0098 “Under a ninth technique, which can complement or augment the eighth technique, the electronic data processor 120 or yield estimator 119 is configured to estimate a second yield reduction component to the aggregate yield derived from fungus, mold or plant disease data for the growing season in the same geographic region or county as the field, wherein the providing of the aggregate yield comprises a yield-reduced aggregate yield of the one or more plants or the standing crop in the field derived from or based on the first yield reduction component and the second yield reduction component,” teaching yield estimate adjustment based on location)
satellite-based information (Anderson ¶ 0028 lines 41-44 “the data processor 120 or yield estimator 119 may provide […] a sectional yield for one or more rows of harvester or combine; or even an aggregate yield,” and ¶ 0029 lines 5-14 “The per-plant yield or another yield metric may be associated with a corresponding position or location in the field, such as a relative position or an absolute position (e.g., three-dimensional geographic coordinates from the location the determining receiver 130). For example, the data processor 120 or yield estimator 119 may generate or provide a yield map versus three-dimensional position (GPS coordinates or a georeferenced yield map) that provides individual plant yield data or that aggregates yield data from multiple plants,” a yield map teaching the determination of yield over a plurality of different areas, according to location and GPS (satellite-based information) coordinates)
or field information in determining the proportion of the harvested material. (Anderson ¶ 0088 lines 13-22 “Under a third technique, the electronic data processor 120 or yield estimator 119 is configured: (a)) to sampling multiple target ears throughout the field to determine the potential cause of the yield reduction to the yield metric and (b) to estimate or to facilitate estimation or recording of a geographic position in two-dimensional or three dimensional coordinates of each one of the sampled target ears throughout the field to determine an aggregate yield reduction associated with the yield-reduced aggregate yield,” teaching yield estimation reduction based on observed data about the aggregate field.)
Regarding Claim 20, Anderson teaches:
A method for controlling a self-propelled agricultural work machine comprising: operating the self-propelled agricultural work machine, the self-propelled agricultural work machine comprising: one or more working units configured to perform one or more of collecting, processing and forwarding harvested material of a crop of an agricultural field; (Anderson ¶ 0101 lines 56-1 “FIG. 2 is a side view of an agricultural vehicle 220 (e.g., harvester, combine […]) that incorporates or comprises a system 11 for estimating yield of standing crop in a field,”)
at least one drive motor configured to drive the one or more working units and to propel the self-propelled agricultural working machine at a travel speed along the agricultural field; (Anderson ¶ 0155 lines 19-24 “The agricultural harvesting vehicle 1000 is supported on two front wheels 1030 that are driven by an engine (not shown) and two rear wheels 1050 that are steerable by a steering actuator. As the agricultural harvesting vehicle 1000 (e.g., combine 1020) travels through the field harvesting rows of crop,”)
at least one camera device (Anderson ¶ 0013 “In one configuration, the imaging device 110 comprises a stereo vision imaging device 110 or digital stereo vision camera with image data processing. Further, the imaging device 110 may operate in one or more frequency spectrums or bandwidths, such as one or more of the following: humanly visible light, near-infrared light, infra-red light and ultraviolet light. The imaging device 110 is coupled to one or more data ports 112. In an alternate embodiment, the imaging device 110 may comprise a monocular imaging device 110, a video imaging device 110,”)
configured to capture one or more images of the crop of the agricultural field in an environment of the self-propelled agricultural work machine; (Anderson ¶ 0020 “In step S102, an imaging device 110 (e.g., imaging sensors) is configured to obtain or to collect image data associated with one or more target plants (210a, 210b, 210c and 210d in FIG. 2) in one or more rows of the standing crop in the field,”)
automatically determining a proportion of the harvested material present in the crop using the one or more images of the camera device; (Anderson ¶ 0025 lines 3-10 “In one embodiment in step S112, a data processor 120 or yield estimator 119 is configured to provide the estimated or detected size of the harvestable plant component (e.g., grain bearing portion or ear of a crop plant) on a user interface 118 (e.g., an electronic display 142) for the one or more target plants as an indicator of yield of the one or more plants or standing crop in the field,”)
and automatically adjusting, based on the proportion of the harvested material present in the crop, one or both of: (Anderson ¶ 0030 “Further, in an alternate embodiment, the electronic data processor 120 may generate […] command or control signal based on the estimated yield metric meeting a threshold yield metric or not meeting a threshold yield metric. In one example, if the electronic data processor 120 generates […] command or control signal (e.g., for output or display by the user interface 118) based on the estimated yield metric meets or exceeds a threshold yield metric, locations or positions of the plants contributing to the observed yield metric may be used to generate a geofenced boundary or region that has a high priority for harvesting by a harvester or a combine. Conversely, if the electronic data processor 120 generates […] command or control signal based on the estimated yield metric does not meet or exceed a threshold yield metric, locations or positions of the plants contributing to the observed yield metric may be used to generate a geofenced boundary or region (e.g., for output or display by the user interface 118 or for input to a vehicle guidance system that directs the steering, heading or yaw of the vehicle during harvesting operations) that has a low priority for harvesting by a harvester or a combine, or that is not harvested,” teaching control signals based on yield values)
at least one working unit of the one or more working units; or the travel speed of the self-propelled agricultural work machine. (Anderson ¶ 0139 “In step S1300, the electronic data processor 120 controls the harvester or combine based on the revised grain yield estimate, such as sending a data message to the user interface 118 to provide the end user (or harvester operator) with an alert and an option to discontinue harvesting of the field or a portion of the field where a yield metric falls below a defined threshold. Further, the end user or harvester operator may have the option of designating the portion of the field that was abandoned for harvesting as grazing land for cattle, cows, sheep, chickens, goats or other domestic farm animals,” teaching the adjustment of the working units by discontinuing harvesting based on the yield estimate)
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 3, 4, 6, 7, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Anderson in view of Middelberg et al. (US 20170118915, hereinafter “Middelberg”).
Regarding Claim 3, Anderson teaches the elements of Claim 2 as described above and further teaches:
[…] wherein the at least one processor is configured to automatically adjust based on the proportion of the harvested material present in the crop in at least one of the […] (Anderson ¶ 0028 lines 41-44 “the data processor 120 or yield estimator 119 may provide […] a sectional yield for one or more rows of harvester or combine; or even an aggregate yield,” and ¶ 0030 as described above)
Anderson does not teach:
wherein the plurality of different areas comprise at least one of: a closer area of the environment of the self-propelled agricultural work machine;
a remote area of the environment of the self-propelled agricultural work machine;
or a middle area of the environment of the self-propelled agricultural work machine between the closer area and the remote area;
and […] closer area, the remote area, or the middle area.
Within the same field of endeavor as Anderson, Middelberg teaches:
wherein the plurality of different areas comprise at least one of: a closer area of the environment of the self-propelled agricultural work machine; a remote area of the environment of the self-propelled agricultural work machine; or a middle area of the environment of the self-propelled agricultural work machine between the closer area and the remote area; and […] closer area, the remote area, or the middle area. (Middelberg ¶ 0015 lines 1-5 “In another particularly preferred feature of the agricultural work machine, the surroundings detected in sections by the surroundings detection device comprises a close-, mid- and long-range, and zero, one or more scanning planes are assigned to each region,” the closer, middle, and remote areas corresponding respectively to the close-, mid-, and long-range sections.)
Anderson and Middelberg are considered analogous because they both relate to visual yield detection of crops on harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the sectional yield determination of different crop areas of Anderson with the simple addition of the close-, mid-, and long-range detection sections of Middelberg. This modification would be made with a reasonable expectation of success as motivated by an improved ability to adapt precisely in nearly real-time to conditions in the immediate vicinity and improve work efficiency (Middelberg ¶ 0021).
Regarding Claim 4, the combination of Anderson and Middelberg teaches the elements of Claim 3 as described above. Anderson does not teach:
wherein the at least one processor is configured to perform one or both of: control a more precise adjustment of one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on the proportion of harvested material in the closer area; or control a less precise adjustment of one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on the proportion of harvested material in one or both of the middle area or the remote area.
Within the same field of endeavor as Anderson, Middelberg teaches:
wherein the at least one processor is configured to perform one or both of: control a more precise adjustment of one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on the proportion of harvested material in the closer area; or control a less precise adjustment of one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on the proportion of harvested material in one or both of the middle area or the remote area. (Middelberg ¶ 0021 lines 1-6 “In another particularly preferred feature of the agricultural work machine, there is a short-term precise control and evaluation of the scanning planes in the close-range, and an early rough control of the working elements through evaluation of one or more scanning planes lying in the medium and/or long range.”)
Anderson and Middelberg are considered analogous because they both relate to visual yield detection of crops on harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the sectional yield determination of different crop areas of Anderson with the simple addition of the close-, mid-, and long-range detection sections and short-term precise control based on scanning in the close range and early rough control based on the medium and/or long range of Middelberg. This modification would be made with a reasonable expectation of success as motivated by an improved ability to adapt precisely in nearly real-time to conditions in the immediate vicinity and improve work efficiency (Middelberg ¶ 0021).
Regarding Claim 6, Anderson teaches the elements of Claim 5 as described above. Anderson further teaches:
[…] wherein the at least one processor is configured to automatically adjust
based on the proportion of the harvested material present in the crop in each of […] , (Anderson ¶ 0029 lines 5-14 teaching a yield map over a plurality of areas and ¶ 0030 teaching a control signal based on prioritization of crop areas based on their yield, in combination teaching control based on prioritization of each area.)
Anderson does not teach:
wherein the plurality of different areas comprise: a closer area of the environment of the self-propelled agricultural work machine; a remote area of the environment of the self-propelled agricultural work machine; or a middle area of the environment of the self-propelled agricultural work machine in between the closer area and the remote area;
and […] the closer area, the remote area, or the middle area.
Within the same field of endeavor as Anderson, Middelberg teaches:
wherein the plurality of different areas comprise: a closer area of the environment of the self-propelled agricultural work machine; a remote area of the environment of the self-propelled agricultural work machine; or a middle area of the environment of the self-propelled agricultural work machine in between the closer area and the remote area; and […] the closer area, the remote area, or the middle area. (Middelberg ¶ 0015 lines 1-5 “In another particularly preferred feature of the agricultural work machine, the surroundings detected in sections by the surroundings detection device comprises a close-, mid- and long-range, and zero, one or more scanning planes are assigned to each region,” the closer, middle, and remote areas corresponding respectively to the close-, mid-, and long-range sections.)
Anderson and Middelberg are considered analogous because they both relate to visual yield detection of crops on harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the sectional yield determination of different crop areas of Anderson with the simple addition of the close-, mid-, and long-range detection sections of Middelberg. This modification would be made with a reasonable expectation of success as motivated by an improved ability to adapt precisely in nearly real-time to conditions in the immediate vicinity and improve work efficiency (Middelberg ¶ 0021).
Regarding Claim 7, the combination of Anderson and Middelberg teaches the elements of Claim 6 as described above. Anderson does not teach:
wherein the at least one processor is configured to: control a more precise adjustment of one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on the proportion of harvested material in the closer area; and control a less precise adjustment of one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on the proportion of harvested material in one or both of the middle area or the remote area.
Within the same field of endeavor as Anderson, Middelberg teaches:
wherein the at least one processor is configured to: control a more precise adjustment of one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on the proportion of harvested material in the closer area; and control a less precise adjustment of one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on the proportion of harvested material in one or both of the middle area or the remote area. (Middelberg ¶ 0021 lines 1-6 “In another particularly preferred feature of the agricultural work machine, there is a short-term precise control and evaluation of the scanning planes in the close-range, and an early rough control of the working elements through evaluation of one or more scanning planes lying in the medium and/or long range.”)
Anderson and Middelberg are considered analogous because they both relate to visual yield detection of crops on harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the sectional yield determination of different crop areas of Anderson with the simple addition of the close-, mid-, and long-range detection sections and short-term precise control based on scanning in the close range and early rough control based on the medium and/or long range of Middelberg. This modification would be made with a reasonable expectation of success as motivated by an improved ability to adapt precisely in nearly real-time to conditions in the immediate vicinity and improve work efficiency (Middelberg ¶ 0021).
Regarding Claim 9, Anderson teaches the elements of Claim 8 as described above. Anderson further teaches:
wherein the at least one processor is configured to make an immediate accurate adjustment
of one or both of the at least one working unit or the travel speed of the self-propelled agricultural work machine depending on the operating state of the self-propelled agricultural work machine; and wherein the immediate accurate adjustment comprises a smaller adjustment parameter range than a short-term precise adjustment. (Anderson ¶ 0139 “In step S1300, the electronic data processor 120 controls the harvester or combine based on the revised grain yield estimate, such as sending a data message to the user interface 118 to provide the end user (or harvester operator) with an alert and an option to discontinue harvesting of the field or a portion of the field where a yield metric falls below a defined threshold,” emphasis added, teaching adjusting based on the revised estimate, established as incorporating results from the secondary yield sensor of ¶ 0016, to discontinue harvesting, being an immediate adjustment with a small (binary) adjustment parameter range of on/off)
Claim(s) 11-15 are rejected under 35 U.S.C. 103 as being unpatentable over Anderson in view of Ellaboudy et al. (US 20210000006, hereinafter “Ellaboudy”).
Regarding Claim 11, Anderson teaches the elements of Claim 10 as described above and further teaches:
wherein the at least one processor is configured to use one or more […] to detect the one or more image areas relating to the inflorescences of the crop. (Anderson ¶ 0022 “In step S106, the electronic data processor 120 or image processing module 115 is configured to identify the component pixels of a harvestable plant component within the obtained image data of plant pixels of the one or more target plants. For example, the component pixels refer to pixels of the harvestable plant component that are identified or classified by color differentiation or other image processing techniques with respect to background pixels, leaf pixels, stem or stalk pixels, or other portions of the crop plant. Other image processing techniques may include classification of pixels by any of the following: image segmentation, clustering analysis of point clouds of pixels or constellations of pixels in three dimensional spatial representation, edge detection, size differentiation, and shape differentiation, and neural networks or artificial intelligence algorithms that use any of the foregoing image processing techniques” detecting an area of the image representative of the harvestable plant components by a variety of techniques including image segmentation and shape differentiation being analogous to the use of border polygons to detect inflorescences)
Anderson does not teach:
[…] border polygons […]
Within the same field of endeavor as Anderson, Ellaboudy teaches:
wherein the at least one processor is configured to use one or more border polygons to detect the one or more image areas relating to the inflorescences of the crop. (Ellaboudy ¶ 0207 “FIG. 28 is flow chart of an example of a process 2800 for detecting crop row by identifying bounding boxes for plants of the crop row,” teaches the use of bounding boxes for detecting plants in a crop)
Anderson and Ellaboudy are considered analogous because they both relate to visual yield detection of crops on harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the visual identification of harvestable material using various techniques of Anderson with the simple substitution of the boundary polygon identification of Ellaboudy as a method of harvestable material area identification. This modification would be made with a reasonable expectation of success as motivated by the simple substitution of one known element for another to obtain predictable results (MPEP 2143(I)(B)): The prior art of Anderson’s harvestable material identification differs from the claimed device by the substitution of the use of border polygons for identification. The use of a boundary polygon for crop identification was known in the art as evidenced by Ellaboudy, and one of ordinary skill in the art could have substituted the bounding box identification of Ellaboudy into the list of various image identification methods of Anderson to obtain the predictable result of identifying harvestable material by shape differentiation using an outline of boundary polygons.
Regarding Claim 12, the combination of Anderson and Ellaboudy teaches the elements of Claim 11 as described above. Anderson further teaches:
wherein the at least one processor is configured to use a number of pixels of the one or more image areas within the border polygons with respect to the inflorescences of the crop within the one or more images in order to determine the proportion of the harvested material. (Anderson ¶ 0022 “In step S106, the electronic data processor 120 or image processing module 115 is configured to identify the component pixels of a harvestable plant component within the obtained image data of plant pixels of the one or more target plants. For example, the component pixels refer to pixels of the harvestable plant component that are identified or classified by color differentiation or other image processing techniques with respect to background pixels, leaf pixels, stem or stalk pixels, or other portions of the crop plant. Other image processing techniques may include classification of pixels by any of the following: image segmentation, clustering analysis of point clouds of pixels or constellations of pixels in three dimensional spatial representation, edge detection, size differentiation, and shape differentiation, and neural networks or artificial intelligence algorithms that use any of the foregoing image processing techniques” emphasis added, as applies to the combination of Anderson and Ellaboudy previously established)
Regarding Claim 13, the combination of Anderson and Ellaboudy teaches the elements of Claim 12 as described above. Anderson further teaches:
wherein the at least one processor is configured to use one or more […] to detect the one or more image areas relating to the inflorescences of the crop. (Anderson ¶ 0022 “In step S106, the electronic data processor 120 or image processing module 115 is configured to identify the component pixels of a harvestable plant component within the obtained image data of plant pixels of the one or more target plants. For example, the component pixels refer to pixels of the harvestable plant component that are identified or classified by color differentiation or other image processing techniques with respect to background pixels, leaf pixels, stem or stalk pixels, or other portions of the crop plant. Other image processing techniques may include classification of pixels by any of the following: image segmentation, clustering analysis of point clouds of pixels or constellations of pixels in three dimensional spatial representation, edge detection, size differentiation, and shape differentiation, and neural networks or artificial intelligence algorithms that use any of the foregoing image processing techniques” detecting an area of the image representative of the harvestable plant components by a variety of techniques including image segmentation and shape differentiation being analogous to the use of border polygons to detect inflorescences)
Anderson does not teach:
[…] rectangular border polygons […]
Within the same field of endeavor as Anderson, Ellaboudy teaches:
wherein the at least one processor is configured to use one or more rectangular border polygons to detect the one or more image areas relating to the inflorescences of the crop. (Ellaboudy ¶ 0101 “The process 500 includes receiving 510 boundary data specifying an area within a map. For example, the boundary data may include a sequence of vertices of a polygon (e.g., a rectangle or hexagon) corresponding to the area in a two-dimensional representation of the map. In some implementations, the polygon may be specified in a plane of a two-dimensional slice or projection of a three-dimensional map,” emphasis added, teaches the use of boundary polygons while detecting crop areas, and ¶ 0207 “FIG. 28 is flow chart of an example of a process 2800 for detecting crop row by identifying bounding boxes for plants of the crop row,” teaches the use of bounding boxes for detecting plants in a crop)
Anderson and Ellaboudy are considered analogous because they both relate to visual yield detection of crops on harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the visual identification of harvestable material using various techniques of Anderson with the simple substitution of the boundary polygon identification of Ellaboudy as a method of harvestable material area identification. This modification would be made with a reasonable expectation of success as motivated by the simple substitution of one known element for another to obtain predictable results (MPEP 2143(I)(B)): The prior art of Anderson’s harvestable material identification differs from the claimed device by the substitution of the use of border polygons for identification. The use of a boundary polygon for crop identification was known in the art as evidenced by Ellaboudy, and one of ordinary skill in the art could have substituted the boundary polygon identification of Ellaboudy into the list of various image identification methods of Anderson to obtain the predictable result of identifying harvestable material by shape differentiation using an outline of boundary polygons.
Regarding Claim 14, the combination of Anderson and Ellaboudy teaches the elements of Claim 12 as described above. Anderson does not teach:
wherein the at least one processor configured to combine overlapping image areas present in the one or more images with respect to the inflorescences of the crop to form a respective image area.
Within the same field of endeavor as Anderson, Ellaboudy teaches:
wherein the at least one processor configured to combine overlapping image areas present in the one or more images with respect to the inflorescences of the crop to form a respective image area. (Ellaboudy ¶ 0219 lines 10-18 “the one or more image sensors may include two image sensors with overlapping fields of view, and a distance from the vehicle to a plant in the crop row may be determined based on stereoscopic signal processing of image data from the two image sensors depicting the plant. For example, detecting 3020 a crop row based on the image data to obtain position data for the crop row may include determining a bounding box for a plant of the crop row based on the image data;” teaches the use of multiple overlapping image areas from a stereoscopic camera in detecting plant data)
Anderson and Ellaboudy are considered analogous because both relate to visual yield detection of crops on harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the visual identification of harvestable material using various techniques of Anderson with the simple substitution of the stereoscopic image analysis of Ellaboudy as a method of harvestable material area identification. Although this use is not explicitly called out in Anderson, it is clearly within the scope of Anderson's mention of stereo image use for proximity/distance filtering (Anderson ¶ 0141 lines 51-55). This modification would have been made with a reasonable expectation of success, as motivated by the use of a known technique (stereo camera analysis) to improve similar devices (the harvestable material identification devices of Anderson and Ellaboudy) in the same way (use of multiple overlapping images in the plant identification analysis) (MPEP 2143(I)(C)).
Regarding Claim 15, the combination of Anderson and Ellaboudy teaches the elements of Claim 14 as described above. Anderson does not teach:
wherein the at least one processor configured to combine overlapping border polygons present in the one or more images with respect to the inflorescences of the crop to form a border polygon.
Within the same field of endeavor as Anderson, Ellaboudy teaches:
wherein the at least one processor configured to combine overlapping border polygons present in the one or more images with respect to the inflorescences of the crop to form a border polygon. (Ellaboudy ¶ 0218 lines 1-17 “In some implementations, the vehicle is also connected to a distance sensor and information from captured range data (e.g., a point cloud or a voxelized occupancy grid) is fused with the image data to detect 3020 the crop row and determine obtain position data for the crop row. For example, the distance sensor may be a radar sensor. For example, the distance sensor may be a lidar sensor. In some implementations, detecting 3020 a crop row based on the image data to obtain position data for the crop row includes accessing range data captured using the distance sensor, and determining a position of a plant in the crop row based on the image data and the range data. For example, image data and range data may be combined or fused to detect 3020 a plant in the crop row by inputting the image data and the range data to a neural network (e.g., a convolutional neural network) to obtain position data for the plant (e.g., a bounding box for the plant)” teaches the combination of fused image data in order to determine a bounding box for the plant)
Anderson and Ellaboudy are considered analogous because both relate to visual yield detection of crops on harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the visual identification of harvestable material using various techniques of Anderson with the simple substitution of the fused image and range data analysis of Ellaboudy as a method of harvestable material area identification. Although this use is not explicitly called out in Anderson, it is clearly within the scope of Anderson's mention of stereo image use for proximity/distance filtering (Anderson ¶ 0141 lines 51-55). This modification would have been made with a reasonable expectation of success, as motivated by the use of a known technique (combining fused data into a bounding box for a plant) to improve similar devices (the harvestable material identification devices of Anderson and Ellaboudy) in the same way (combination of overlapping bounding boxes in the plant identification analysis) (MPEP 2143(I)(C)).
Claim(s) 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Anderson in view of Cleodolphi et al (US 20230135915, hereinafter “Cleodolphi”).
Regarding Claim 18, Anderson teaches the elements of Claim 1 as described above and further teaches:
wherein the at least one processor is configured to adjust one or more of the at least one working unit or the travel speed of the self-propelled agricultural work machine based on the proportion of the harvested material […] (Anderson ¶ 0030 as above)
Anderson does not teach:
[…] so that the adjustment results in a predetermined harvested material flow being adjusted by the self-propelled agricultural work machine.
Within the same field of endeavor as Anderson, Cleodolphi teaches:
[…] so that the adjustment results in a predetermined harvested material flow being adjusted by the self-propelled agricultural work machine. (Cleodolphi ¶ 0033 lines 1-10 “As indicated above, it is generally desirable to monitor the mass flow rate of harvested materials (e.g., sugarcane) through an agricultural harvester to allow the operator to gather data associated with the crop yield and evaluate the performance of the harvester. In addition, the mass flow rate through the harvester may also be used to automate certain functions or control actions associated with the harvester, such as to automatically adjust one or more operational settings of one or more harvester components to improve the efficiency and/or performance thereof,” teaching monitoring-based adjustment of the harvester to improve mass flow rate performance and efficiency)
Anderson and Cleodolphi are considered analogous because both relate to harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the adjustment of vehicle parameters based on detected yield of Anderson with the simple addition of the monitoring-based adjustment to improve mass flow rate performance of Cleodolphi. This modification would have been made with a reasonable expectation of success, as motivated by improving the efficiency and performance of harvesting (Cleodolphi ¶ 0033).
Regarding Claim 19, Anderson teaches the elements of Claim 1 as described above and further teaches:
wherein the at least one processor is configured to adjust the one or more of the at least one working unit or the travel speed of the self-propelled agricultural work machine based on the proportion of the harvested material […] (Anderson ¶ 0030 as above)
Anderson does not teach:
[…] so that the adjustment results in reduction in a loss of the harvested material from the self-propelled agricultural work machine.
Within the same field of endeavor as Anderson, Cleodolphi teaches:
[…] so that the adjustment results in reduction in a loss of the harvested material from the self-propelled agricultural work machine. (Cleodolphi ¶ 0033 lines 1-10 “As indicated above, it is generally desirable to monitor the mass flow rate of harvested materials (e.g., sugarcane) through an agricultural harvester to allow the operator to gather data associated with the crop yield and evaluate the performance of the harvester. In addition, the mass flow rate through the harvester may also be used to automate certain functions or control actions associated with the harvester, such as to automatically adjust one or more operational settings of one or more harvester components to improve the efficiency and/or performance thereof,” teaching monitoring-based adjustment of the harvester to improve efficiency, that is, a reduction in loss)
Anderson and Cleodolphi are considered analogous because both relate to harvester machines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the adjustment of vehicle parameters based on detected yield of Anderson with the simple addition of the monitoring-based adjustment to improve mass flow rate efficiency (a reduction in loss) of Cleodolphi. This modification would have been made with a reasonable expectation of success, as motivated by improving the efficiency and performance of harvesting (Cleodolphi ¶ 0033).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Hiramatsu et al (US 20170135277) teaches control of harvesting machine operation based on GPS (satellite) data and weather data.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZACHARY E GLADE whose telephone number is (703)756-1502. The examiner can normally be reached 4-5-9 7:30-16:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kito Robinson can be reached at (571) 270-3921. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ZACHARY E. F. GLADE/ Examiner, Art Unit 3664
/KITO R ROBINSON/ Supervisory Patent Examiner, Art Unit 3664