Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Drawings
The drawings are objected to because Figures 9, 10, 12, 30, 31, 35, and 37-38 appear to be black and white photocopies of color photographs, and thus no structure can be determined from these figures. See 37 CFR 1.84(b)(1). Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application.
Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Specification
The specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy was filed on January 22, 2024.
Status of Claims
This Non-Final Office action is in response to the applicant’s filing of December 21, 2023.
Claims 1-13 are pending and examined below.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-13 are rejected under 35 U.S.C. 103 as being unpatentable over Weidenbach (US 2022/0187832 A1) in view of Wilson (US 2015/0324648 A1).
Regarding claim 1, Weidenbach teaches a row detection system comprising: (See Weidenbach paragraph 0036; “…The camera system detects a “sea of green” and has difficulty differentiating rows the vehicle should steer along. In the canopied corn scenario, the radar sensor readily identifies and differentiates rows (because it is mounted below the canopy and laterally directed at the crop and associated stalks) thus its confidence is high. The machine controller then chooses to use the radar sensor in this scenario because of the relatively higher confidence.”); a camera mounted to an agricultural machine to image a ground surface that is traveled by the agricultural machine to acquire time-series images including at least a portion of the ground surface; (See Weidenbach paragraph 0037; “The agricultural machine 200 includes one or more vision sensor assemblies 202 including, for example, a digital video camera or LIDAR device. The one or more vision sensor assemblies 202 are, in one example, mounted at an elevated position relative to the field, crop canopy, or ground engagement units on the machine and are configured to capture images of a field 220 including crop rows and intervening furrows. The images are analyzed by the machine controller to determine one or more of cross track error or heading error…”).
Weidenbach does not explicitly teach, but Wilson teaches, a processor configured or programmed to perform image processing for the time-series images; (See Wilson paragraphs 0028 and 0033; “…Accordingly, the secondary processor can apply the filter to remove the unwanted elements. Such a packaging of the image analysis can provide the effect of the crop row characteristic circuit set 110 and filter analysis while allowing the secondary processor access to the raw image for other analysis. In an example, the interface can be arranged to apply the filter to the image and create a modified image with the filtered elements removed. Such a packaging allows for a smaller image that still retains the pertinent information the secondary processor can use. In an example, the packaged crop row elements can be streamed to the secondary processor. Thus, the secondary processor can apply the crop row characteristic circuit set 110 and filter analysis in real-time…Once the model is created, the filter function 410 can be modified into a final filter by being plotted in three dimensions, such as via a graphics processor, mapped the crop row via transformation (e.g., translation, skew, rotation, etc.), and rendered in two dimensions to create a mask. In an example the variable mask features described above (e.g., more levels to the mask than simply covering or not covering an element) can be represented in the two dimensional image as alpha channels.”); and from the time-series images, select a search region in which to detect at least one of crop rows and ridges, the search region having a size and shape including at least a portion of one or more wheels of the agricultural machine; (See Wilson paragraph 0024; “… In an example, edges of the one or more shapes can correspond to a crop row characteristic (e.g., crop width) distance from the row center. In an example, the distance can be halved, or otherwise fractionalized to correspond to the anchor point—e.g., if the anchor point is the crop row center, and the characteristic is the crop width, then the edge is set at one-half the crop width distance from the row center. In an example, the filter includes a parameter that corresponds to a feature of the image, varies in response to variations in the feature, and modifies the edges of the one or more geometric shapes. Thus, as discussed above, the camera’s 125 perspective can modify the representation of the crop rows in an image, and these modifications can be accounted for by the filter. In an example, the parameter is a multiplier, applied to a constant crop width, that changes in response to a pixel row in a raster representation of the image. Thus, as one moves pixel rows towards the bottom of the image, the width filter can increase to account for the larger representation of the crops closer to the camera 125. In an example, the parameter is a crop row characteristic measured perpendicular to a respective crop row center. Such perpendicular characteristics can include crop width and height. For example, for a crop planted some distance apart within a row, a variable width, instead of a constant width, may be observed. Such a perpendicularly measured characteristic, varying along the crop row center, can capture this scenario.”).
Both Weidenbach and Wilson are in the same field of row detection systems and methods for agricultural machines. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Weidenbach’s row detection system and method with Wilson’s processing of time-series images and selection of a search region having a size and shape including at least a portion of one or more wheels of the agricultural machine. No new functionality would arise from the combination, and the combination would improve the usability of Weidenbach because a search region sized and shaped to include at least a portion of the one or more wheels provides better visual data for detecting the row. Further, one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Regarding claim 2, Weidenbach in view of Wilson teaches the row detection system of claim 1. Weidenbach further teaches wherein the processor is configured or programmed to: detect at least one portion of the one or more wheels from the time-series images; and select as the search region a region that includes at least the at least one detected portion of the one or more wheels; (See Weidenbach paragraphs 0041 and 0046, figures 4 and 5B; “FIG. 4 illustrates an example of determining confidence in crop row detection using characteristic measurements (e.g., images) of a field generated by a vision sensor.…The present subject matter provides for using one or more sensors (such as radar and/or image sensors) to correct for sidehill drift, such as when the front wheels may appear to be accurately positioned between crop rows while the rear wheels drift/slide over adjacent crop rows and in some examples even drive over adjacent crop rows into proximate furrows.”).
Regarding claim 3, Weidenbach in view of Wilson teaches the row detection system of claim 1. Weidenbach further teaches wherein the search region has a shape that includes, among the at least one of crop rows and ridges existing in the time-series images, a crop row or ridge that is located on a left side of the one or more wheels over to a crop row or ridge that is located on a right side of the one or more wheels; (See Weidenbach paragraph 0052; “FIG. 8 illustrates an example of an agricultural machine 800 having sensors 802, 804, 812, 814, 816 that are alternatively or cooperatively usable to obtain guidance parameters for automated control of the machine. According to various embodiments, the present subject matter provides for switching between sensors in various conditions. Sensor solution or measurement quality can be affected by the conditions in the surrounding environment (e.g., obstructions 830 in the environment) that reduce the confidence in measurements generated by one sensor (e.g., sensor A, 802) more than the confidence in measurement obtained by another sensor (e.g., sensor B, 804). A machine controller can use determined or obtained confidence in each sensor to select a measurement generated by the less obstructed (higher confidence) sensor for calculating cross track error or cross heading error, in various embodiments of the present subject matter. Sensors A and B are in one example the same types of sensors (e.g., both vision sensors, both radar sensors, or the same type of other sensor, such as ultrasound, LIDAR or the like). In another example, the sensors A and B are different types of sensors (e.g., sensor A is a vision sensor and sensor B is a radar sensor, or other different types of sensors). The sensors can be coupled to the machine at the same or different locations or elevations, in various embodiments. As discussed herein, if the image-based sensor solution confidence is low (e.g., with sensor A, 802), the present subject matter can switch to using the radar sensor measurements (e.g., with sensor B, 804), in an embodiment. In one instance, when bad crop row measurements lead to low confidence, a radar sensor has high confidence and the control module selects the radar sensor. In another instance, when crops are short and hard to detect with radar, this leads to low radar confidence; if the image sensor has high confidence, the control module selects the image sensor.”).
Regarding claim 4, Weidenbach in view of Wilson teaches the row detection system of claim 1. Weidenbach further teaches wherein the processor is configured or programmed to estimate a positional relationship between the detected at least one of crop rows and ridges and the one or more wheels based on an image of at least the portion of the one or more wheels included in the search region; (See Weidenbach paragraph 0045; “…The depicted embodiment uses cross track error measurement from the center of a wheel of an agricultural machine to a center of a crop row furrow, and a cross track error variance can be determined. The cross track error variance can include a range of values of XTK, for example a curve or plot such as a bell curve with a peak representing the XTK and a width affecting confidence. The cross track error variance is then used to determine a variance confidence penalty. The variance confidence penalty is in one example subtracted from the combined sensor solution confidence to provide a raw confidence measurement, which is optionally run through a low pass filter to obtain a final solution quality for the sensor, in various embodiments.”).
Regarding claim 5, Weidenbach in view of Wilson teaches the row detection system of claim 4. Weidenbach does not teach, but Wilson teaches, wherein the processor is configured or programmed to estimate a positional relationship between the detected at least one of crop rows and ridges and the agricultural machine based on the positional relationship; (See Wilson paragraph 0029; “…the secondary processor, or the crop row characteristic circuit set 110 is arranged to calculate crop density based on the crop elements. In an example, the secondary processor, or the crop row characteristic circuit set 110, can be arranged to store a portion of the image corresponding to crop elements in response to determining that the crop density is beyond a threshold. In an example, the secondary processor, or the crop row characteristic circuit set 110, can be arranged to estimate a crop germination metric from the crop density and a planting metric (e.g., when the crop was planted, fertilizer applied, soil type, etc.). In an example, the secondary processor, or the crop row characteristic circuit set 110, can be arranged to estimate crop height based on crop elements, a planting metric, and crop type.”).
Both Weidenbach and Wilson are in the same field of row detection systems and methods for agricultural machines. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Weidenbach’s row detection system and method with Wilson’s estimation of a positional relationship between the detected at least one of crop rows and ridges and the agricultural machine. No new functionality would arise from the combination, and the combination would improve the usability of Weidenbach because estimating such a positional relationship provides better data for detecting the row. Further, one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Regarding claim 6, Weidenbach in view of Wilson teaches the row detection system of claim 1. Weidenbach further teaches wherein the processor is configured or programmed to detect at least the portion of the one or more wheels included in the search region, and based on an image of at least the detected portion of the one or more wheels, estimate a position of the one or more wheels relative to the agricultural machine; (See Weidenbach paragraph 0045; “…The depicted embodiment uses cross track error measurement from the center of a wheel of an agricultural machine to a center of a crop row furrow, and a cross track error variance can be determined. The cross track error variance can include a range of values of XTK, for example a curve or plot such as a bell curve with a peak representing the XTK and a width affecting confidence. The cross track error variance is then used to determine a variance confidence penalty. The variance confidence penalty is in one example subtracted from the combined sensor solution confidence to provide a raw confidence measurement, which is optionally run through a low pass filter to obtain a final solution quality for the sensor, in various embodiments.”).
Regarding claim 7, Weidenbach in view of Wilson teaches the row detection system of claim 1. Weidenbach does not teach, but Wilson teaches, wherein the processor is configured or programmed to: acquire time-series color images from the camera; generate from at least the search region of the time-series color images a plan view image of at least the search region of the ground surface in which a color of a crop row is enhanced; (See Wilson paragraph 0019; “The image capture controller 105 can be arranged to receive an image, e.g., from a camera 125. In an example, the image capture controller 105 can be a sensor (e.g., the camera 125, such that the image capture controller 105 and the camera 125 are the same unit) affixed to a vehicle (e.g., the AEQ 120). In an example, the image capture controller 105 can retrieve the image from another component (e.g., a sensor 125 affixed to the vehicle 120, a database, video stream, etc.). In an example, the image can include any one or more of color information, depth information (e.g., as measured by time-of-flight from an emitter (e.g., infrared emitter), depth pattern, etc.), or luminance information. In an example, the image controller 105 is arranged to capture the image during operation of the AEQ 120 in the environment that the AEQ 120 is operating. Thus, the captured images are real-time with the operation of the AEQ 120 and include the agricultural environment as the subject.”); classify the plan view image into first pixels of which an index value for the color is equal to or greater than a threshold and second pixels of which the index value is below the threshold; and determine positions of edge lines of the crop row based on the index values of the first pixels; (See Wilson paragraph 0020; “The crop row characteristic circuit set 110 can be arranged to gather a plurality of crop row characteristics. In an example, the crop row characteristic circuit set 110 can be arranged to obtain at least one row center. In the example of the crop row center, the crop row characteristic circuit set 110 can be arranged to calculate the center from the image. In an example, such calculation can include applying a feature threshold to identify elements of interest in the image. In an example, the feature threshold can be a color threshold. For example, if the crop color is green, the furrow color is brown, the image can be filtered and non-green information (e.g., pixels) removed. Once the elements of interest are identified via feature thresholding, the crop row characteristic circuit set 110 can be arranged to perform a Hough transform on the elements of interest to identify a line in the image that corresponds to the row center. In an example, the row center is a crop row or adjacent to a crop row.”).
Both Weidenbach and Wilson are in the same field of row detection systems and methods for agricultural machines. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Weidenbach’s row detection system and method with Wilson’s generation of a plan view image of the search region of the ground surface in which a color of a crop row is enhanced and classified based on the index values of the pixels. No new functionality would arise from the combination, and the combination would improve the usability of Weidenbach because enhancing the crop row color and classifying pixels by index value provides better visual data for detecting the row. Further, one of ordinary skill in the art would have recognized that the results of the combination were predictable.
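As an explanatory aid only (not part of the cited references), the pixel classification described in the quoted Wilson passages is commonly implemented with an excess-green color index of the form 2G - R - B. The following minimal Python sketch illustrates that general technique; the index choice, the fixed threshold, and all function and variable names are illustrative assumptions rather than features of Weidenbach or Wilson.

    import numpy as np

    def detect_row_edges(plan_view_rgb, threshold=20):
        """Classify plan-view pixels by a color index and locate the
        left/right edge columns of a crop row.

        plan_view_rgb: H x W x 3 uint8 top-down (plan view) image.
        threshold: index value separating "first" (crop) pixels from
                   "second" (background) pixels; illustrative constant.
        """
        rgb = plan_view_rgb.astype(np.int32)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        index = 2 * g - r - b               # excess-green index per pixel
        first_pixels = index >= threshold   # crop-colored pixels
        # Sum the index values of first pixels down each image column; a
        # crop row running along the travel direction appears as a band
        # of high column scores in the plan view.
        scores = np.where(first_pixels, index, 0).sum(axis=0)
        cols = np.flatnonzero(scores > 0.5 * scores.max())
        if cols.size == 0:
            return None                     # no crop row detected
        return cols[0], cols[-1]            # left and right edge columns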
Regarding claim 8, Weidenbach in view of Wilson teaches the row detection system of claim 1. Weidenbach further teaches wherein the processor is configured or programmed to generate and output a target path based on positions of the crop rows or ridges; (See Weidenbach paragraph 0040; “FIGS. 3A and 3B illustrate examples of determining cross track error and heading error. In FIG. 3A, an agricultural machine 300 includes a controller used to determine heading error as an angle 334 of the difference between machine orientation 332 (machine heading and position) and path guidance 330 of the machine 300…”).
Regarding claim 9, Weidenbach in view of Wilson teaches the row detection system of claim 1. Weidenbach further teaches wherein the processor is configured or programmed to: from a plurality of images among the time-series images that have been acquired at different points in time, determine a first amount of movement of each of a plurality of feature points in an image plane through feature point matching; through perspective projection of each of the plurality of feature points from the image plane onto a reference plane corresponding to the ground surface, determine a second amount of movement of each of a plurality of projection points in the reference plane based on the first amount of movement; and based on the second amount of movement, estimate heights of the plurality of feature points from the reference plane to detect a ridge on the ground surface; (See Weidenbach paragraphs 0037 and 0042; “The agricultural machine 200 includes one or more vision sensor assemblies 202 including, for example, a digital video camera or LIDAR device. The one or more vision sensor assemblies 202 are, in one example, mounted at an elevated position relative to the field, crop canopy, or ground engagement units on the machine and are configured to capture images of a field 220 including crop rows and intervening furrows. The images are analyzed by the machine controller to determine one or more of cross track error or heading error. The agricultural machine 200 further includes one or more sensor assemblies 204 including, for example, a GPS antenna, real time kinematics (RTK) system or receiver/transmitter or other sensor device that may sense characteristics associated with one or more of cross track error, heading error or the like… In various embodiments, a signal energy 470 associated with a calculated row position is determined based on, for example, how clearly a crop or other row indicator can be differentiated from a soil or other furrow indicator (e.g., differentiation between brown and green pixels in image). In various embodiments, to make a confidence determination, the controller (such as machine controller 105 in FIG. 1) looks for brown/green differentiation and consistency with a line. The present subject matter determines a higher confidence for both, and lower confidence for one or zero of these determinations, in various embodiments. A line is fit to the identified row, a transformation is taken from the image to world space (x, y coordinates and determination of XTE and Heading Error) and then XTE and Heading Error are determined from a vehicle line (direction vehicle is heading from the rear axle) relative to the identified line (e.g., the crop row), in various embodiments.”).
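For explanatory purposes only, the projection step quoted below from Weidenbach paragraph 0044 (augmenting an image point to (x, y, 1), multiplying by a 3×3 matrix determined by the height and pitch of the camera, and dividing by Z) is a standard planar homography. The following minimal Python sketch shows how matched feature points might be projected onto the ground plane to obtain the claimed second amounts of movement; the function names are illustrative assumptions, and the homography matrix is taken as given.

    import numpy as np

    def to_ground_plane(pts_img, H):
        """Project N x 2 image points onto the ground plane using a
        3x3 homography H (determined by camera height and pitch)."""
        pts_img = np.asarray(pts_img, dtype=float)
        pts_h = np.hstack([pts_img, np.ones((len(pts_img), 1))])  # (x, y, 1)
        XYZ = pts_h @ H.T                                         # 3x3 multiply
        return XYZ[:, :2] / XYZ[:, 2:3]                           # (u, v) = (X/Z, Y/Z)

    def second_movements(pts_t0, pts_t1, H):
        """Second amount of movement for each matched feature point:
        the displacement of its ground-plane projection between two
        frames, where the image-plane displacement between pts_t0 and
        pts_t1 is the first amount of movement."""
        g0 = to_ground_plane(pts_t0, H)
        g1 = to_ground_plane(pts_t1, H)
        return np.linalg.norm(g1 - g0, axis=1)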
Regarding claim 10, Weidenbach in view of Wilson teaches the row detection system of claim 9. Weidenbach further teaches wherein, given a height Hc of a center point of the perspective projection from the reference plane; heights dH of the plurality of feature points from the reference plane; a second amount of movement L of a feature point with a dH of zero on the reference plane; and a second amount of movement L+dL of a feature point with a dH greater than zero; the processor is configured or programmed to determine the height of each of the plurality of feature points by calculating Hc × (1.0-L/(L+dL)); (See Weidenbach paragraphs 0044 and 0064; “FIGS. 5A-5B illustrate a homography transformation for converting from an image space 502 to a world space 512. FIG. 5B is an example showing distortion correction applied. The homography transformation uses camera distortion correction, in various embodiments. Without the distortion correction, the transformation shown in FIG. 5A includes taking a position (x, y) in an acquired crop row image, augmenting it to (x, y, 1) and multiplying by a 3×3 matrix that is determined based on the height and pitch of the camera, in an embodiment. This calculation provides a new location (X, Y, Z) that, when divided by Z, results with (u, v, 1) where u=X/Z and v=Y/Z, such that u and v are then the coordinates of the same point projected onto a different plane. In one embodiment, the present subject matter projects from the image plane to the ground plane as illustrated in FIG. 5B… the method further includes configuring the first sensor to couple to the agricultural machine at an elevated location relative to the path reference for detecting the first orientation from the elevated location directed toward the path reference; and configuring the second sensor to couple to the agricultural machine at a lateral location relative to the path reference for detecting the second orientation from the lateral location directed across the path reference. In some examples, obtaining the first confidence comprises decreasing the first confidence relative to the second confidence responsive to a detected increase in a height of crops in the guidance path. In some examples, obtaining the first confidence comprises decreasing the first confidence relative to the second confidence responsive to a detected increase in a size or density of a canopy of crops in the path reference. In various examples, obtaining the first confidence comprises increasing the first confidence relative to the second confidence responsive to a detected curvature in the path reference.”).
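As an explanatory note (not part of the cited disclosures), the claimed relation is the standard motion-parallax identity and follows from similar triangles: when a camera whose projection center sits at height Hc above the reference plane translates so that the ground-plane projection of a feature point at height zero moves by L, the projection of a feature point at height dH moves by L+dL, with (L+dL)/L = Hc/(Hc - dH). Solving for dH yields dH = Hc × (1.0-L/(L+dL)), so a feature point with no excess movement (dL = 0) is assigned a height of zero, consistent with the claim language.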
Regarding claim 11, Weidenbach in view of Wilson teaches an agricultural machine comprising the row detection system of claim 1. Weidenbach further teaches a wheel; and an automatic steering controller configured or programmed to control a steering angle of the wheel based on positions of the crop rows or ridges as determined by the row detection system; (See Weidenbach paragraphs 0038 and 0071; “…the agricultural machine 200 includes one or more ground engagement units 205 (e.g., wheels, axles or tracks)…a row steering system of an agricultural machine, the row steering system comprising: a first sensor assembly configured to detect a first orientation of the agricultural machine relative to a path reference in a field using a first sensor configured to measure a first characteristic; a second sensor assembly configured to detect a second orientation of the agricultural machine relative to the crop rows in the field using a second sensor configured to measure a second characteristic different than the first characteristic; and a control module including: a first evaluation module to obtain a first confidence in the detected first orientation; a second evaluation module to obtain a second confidence in the detected second orientation; and a selector module to selectively provide one or more of the detected first orientation or the detected second orientation to a machine controller of the agricultural machine based on the first and second confidences.”).
Regarding claim 12, Weidenbach in view of Wilson teaches the agricultural machine of claim 11. Weidenbach further teaches wherein, based on the time-series images, the processor of the row detection system is configured or programmed to monitor a positional relationship between the crop rows or ridges and the wheel, and supply a positional error signal to the automatic steering controller; (See Weidenbach paragraphs 0037 and 0047; “The agricultural machine 200 includes one or more vision sensor assemblies 202 including, for example, a digital video camera or LIDAR device. The one or more vision sensor assemblies 202 are, in one example, mounted at an elevated position relative to the field, crop canopy, or ground engagement units on the machine and are configured to capture images of a field 220 including crop rows and intervening furrows. The images are analyzed by the machine controller to determine one or more of cross track error or heading error. The agricultural machine 200 further includes one or more sensor assemblies 204 including, for example, a GPS antenna, real time kinematics (RTK) system or receiver/transmitter or other sensor device that may sense characteristics associated with one or more of cross track error, heading error or the like.…using a vision sensor, or radar with an additional sensor in the rear, the present subject matter can determine the actual heading of the agricultural machine. For example, one or more sensors can detect the slope of the hill and the machine controller can use this measurement to determine, for example, that actual confidence in the GPS should be low (and the GPS confidence value is accordingly decreased) and use data from the vision or radar sensor along with the additional sensor to determine the actual heading of the vehicle. In FIG. 7, using only GPS with the GPS antenna mounted at the front axle of the vehicle, the machine controller may incorrectly calculate that the rear axle is in the same row of crops as the front axle, when it is actually significantly downhill of the front axle. In an extreme case (depending on the position of the GPS antenna and the severity of the hill), this would mean that one or both sets of wheels could be running over crops when the system is not aware of the positional error.”).
With respect to independent claim 13, please see the rejection of claim 1 above, which is commensurate in scope with claim 13, with claim 1 being drawn to a system and claim 13 being drawn to a corresponding computer-implemented method.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LIDIA KWIATKOWSKA whose telephone number is (571)272-5161. The examiner can normally be reached Monday-Friday 8:00-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott A. Browne can be reached at (571) 270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/L.K./Examiner, Art Unit 3666
/SCOTT A BROWNE/Supervisory Patent Examiner, Art Unit 3666