DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This Office action is in response to application number 18/398,321, filed on 12/28/2023, in which
Claims 1-20 are presented for examination.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 1/25/2024 has been received and considered by the examiner.
Drawings
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they do not include the following reference sign(s) mentioned in the description:
FIG. 4, pg. 16, para 0046: "instructions 116".
Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Objections
Claims 3 and 13 are objected to because of the following informalities:
Claims 3 and 13 recite:
"identifying neighboring points of the plurality of points within a certain distance for each of the plurality of points, a number of the neighboring points for each of the points indicative of dust being less than a threshold number, a number of the neighboring points for each of the remaining points being above the threshold number."
It is unclear if the claim language is listing identifying steps, and should instead recite something similar to:
"filter the data by identifying neighboring points [...], a number of neighboring points [...], and/or a number of neighboring points [...]" or
if the claim language is further describing the filtering and neighboring points and should instead recite something similar to:
"filter the data by identifying neighboring points [...], wherein a number of neighboring points less than a threshold number are the points indicative of dust and/or a number of neighboring points above the threshold number are the remaining points."
Based on the specification [pg. 20, para 0055 and pg. 27, para 0073], for examination purposes, the claim language will be interpreted as the latter, further defining the filtering and neighboring points.
Appropriate correction is required.
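For purposes of illustration only, the interpretation adopted above (a neighbor count below a threshold marks a point as indicative of dust, and a count above the threshold marks a remaining point) can be sketched as follows. This is a hypothetical implementation; the function name, radius, and threshold values are chosen for the example and are not drawn from the claims or the specification:

```python
import math

def filter_dust(points, radius=0.05, threshold=3):
    """Split a point cloud into remaining (ground) points and dust points.

    Per the interpreted claim language: for each point, count the neighboring
    points within `radius`; a point whose neighbor count is less than
    `threshold` is treated as indicative of dust, and a point whose count is
    above `threshold` is kept among the remaining points.
    """
    remaining, dust = [], []
    for i, p in enumerate(points):
        neighbors = sum(
            1 for j, q in enumerate(points)
            if i != j and math.dist(p, q) <= radius
        )
        (remaining if neighbors > threshold else dust).append(p)
    return remaining, dust
```

Under this reading, an isolated airborne return with few nearby points is removed, while densely sampled ground points are retained.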
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 4-6, 8-10, 11, 14-16, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Posselius et al., PG Pub US-2018/0220577-A1 (herein “Posselius”) in view of Darr et al., PG Pub US-2015/0124054-A1 (herein “Darr”) and Qadir et al., PG Pub US-2024/0126269-A1 (herein “Qadir”).
Regarding Claim 1, Posselius discloses: An agricultural system for monitoring surface roughness within a field during an agricultural operation. See [Posselius, pg. 2, para 0019], which explains that “[…], the present subject matter is directed to a system and method for automatically monitoring the soil surface roughness of a field during the performance of a ground-engaging operation.”
Posselius further discloses: the agricultural system comprising: an agricultural implement having at least one ground engaging tool, the at least one ground engaging tool being configured to engage a field to perform an agricultural operation within the field as the agricultural implement moves across the field. See [Posselius, pg. 3, para 0023], which explains that the work vehicle tows an implement which is engaged with the field as the implement is towed across the field and can include shanks, disk blades, leveling blades, basket assemblies, tines or spikes, “Additionally, as shown in FIGS. 1 and 2, the implement 12 may generally include a carriage frame assembly 30 configured to be towed by the work vehicle via a pull hitch or tow bar 32 in a travel direction of the vehicle (e.g., as indicated by arrow 34). As is generally understood, the carriage frame assembly 30 may be configured to support a plurality of ground-engaging tools, such as a plurality of shanks, disk blades, leveling blades, basket assemblies, tines, spikes, and/or the like. In several embodiments, the various ground-engaging tools may be configured to perform a tillage operation or any other suitable ground-engaging operation across the field along which the implement 12 is being towed.”
Posselius further discloses: a sensor having a field of view directed toward a portion of the field worked by the agricultural implement during the agricultural operation, the sensor being configured to generate data indicative of the portion of the field, […]. See [Posselius, pg. 3, para 0030], which further explains that the implement includes one or more soil roughness sensors with a field of view toward the portion of the field being worked for detecting surface roughness of the field, “Additionally, in accordance with aspects of the present subject matter, the work vehicle 10 and/or the implement 12 may include one or more non-contact soil roughness sensors 104 coupled thereto and/or supported thereon for monitoring the surface roughness of the field as a ground-engaging operation (e.g., a tillage operation, a planting operation, fertilizing operation, and/or the like) is being performed thereon via the implement 12. Specifically, in several embodiments, the soil roughness sensor(s) 104 may be provided in operative association with the work vehicle 10 and/or the implement 12 such that the sensor(s) 104 has a field of view or sensor detection range directed towards a portion(s) of the field adjacent to the work vehicle 10 and/or the implement 12. As such, the soil roughness sensor(s) 104 may be used to detect the surface roughness of the adjacent portions of the field as the tractor 10 and/or implement 12 passes by such portions of the field during the performance of the ground-engaging operation.”
Posselius further discloses: a computing system configured to: receive the data generated by the sensor; filter the data to remove points indicative of dust from remaining points of the plurality of points; […]. See [Posselius, pg. 5, para 0040], which explains that the computing device, contains a processor and memory such as a computer readable non-volatile medium, “In general, the controller 102 may correspond to any suitable processor-based device(s), such as a computing device or any combination of computing devices. Thus, as shown in FIG. 4, the controller 102 may generally include one or more processor(s) 110 and associated memory devices 112 configured to perform a variety of computer-implemented functions […]. […]. Additionally, the memory 112 may generally comprise memory element(s) including, but not limited to, computer readable medium [...], computer readable non-volatile medium […], […]. Such memory 112 may generally be configured to store information accessible to the processor(s) 110, including data 114 that can be retrieved, manipulated, created and/or stored by the processor(s) 110 and instructions 116 that can be executed by the processor(s) 110.” Also see [Posselius, pg. 6, para 0046], which explains that modules analyze the sensor data to estimate surface roughness by correcting, adjusting, and analyzing the data, including filtering to remove outliers, “In general, the data analysis module 126 may be configured to analyze the initial or raw sensor data captured by the soil roughness sensor(s) 104 to allow the controller 102 to estimate the surface roughness of one or more sections of the field. 
For instance, the data analysis module 126 may be configured to execute one or more suitable data processing techniques or algorithms that allows the controller 102 to accurately and efficiently analyze the sensor data, such as by applying corrections or adjustments to the data based on the sensor type, sensor resolution, and/or other parameters associated with the soil roughness sensor(s) 104, by filtering the data to remove outliers, by implementing sub-routines or intermediate calculations required to estimate the surface roughness of the soil, and/or by performing any other desired data processing-related techniques or algorithms.” Finally see [Posselius, pgs. 6-7, para 0049], which further explains that filtering out or removing outliers includes removing non-ground related datapoints such as dust, “Additionally, as shown in FIG. 5, the data analysis module 126 may also be configured to filter out or remove outliers from the data (e.g., at box 204). Data outliers may, for example, correspond to non-ground-related points captured by the sensor(s) 104, such as dust, unwanted stubble, and/or the like. In one embodiment, the data analysis module 126 may be configured to implement a machine learning classification algorithm to remove any outliers from the data, such as by implementing decision trees, support vector machines, clustering and/or the like. In this regard, the actual geometry of the surface roughness data, itself, may produce features that can be identified as outliers using any suitable data processing technique. It should be appreciated that similar to the sensor calibration, the specific algorithm or technique used to remove the outliers from the data may be dependent on the type of sensor(s) 104 being used. For instance, a LIDAR scanner may produce intensity or reflectivity measurements in connection with the point cloud that may need to be removed as outliers.”
Posselius further discloses: […] determine a surface roughness of the field based at least in part on the high point and the low point within each of the plurality of subsections. See [Posselius, pgs. 3-4, para 0031], which explains that the sensor data, such as from a LIDAR scanner, collects data associated with the surface roughness of the surface or soil across a field or a portion of the field, “In general, the non-contact soil roughness sensor(s) 104 may correspond to any suitable sensing device(s) configured to detect or capture data associated with the surface roughness of the soil. For instance, in several embodiments, the soil roughness sensor(s) 104 may correspond to a Light Detection and Ranging […] device(s), such as a LIDAR scanner(s). In such embodiments, the soil roughness sensor(s) 104 may be configured to output light pulses from a light source […] and detect the reflection of each pulse off of the soil surface. Based on the time of flight of the light pulses, the specific location (e.g., 3-D coordinates) of the soil surface relative to the sensor(s) 104 may be calculated. By scanning the pulsed light over a given swath width, the surface roughness of the soil may be detected across a given section of the field. Thus, by continuously scanning the pulsed light along the soil surface as the work vehicle 10 and the implement 12 are moved across the field, a point cloud may be generated that includes surface roughness data for all or a portion of the field.” Also see [Posselius, pg. 8, para 0062], which explains that after the data is plotted and fit, the roughness can be determined using the height of the data corresponding to the ground surface, where a section of the field can be determined as rough or smooth, “In each data plot shown in FIGS. 
6 and 7, the surface roughness data has been preprocessed (e.g., by applying a suitable sensor calibration and removing outliers) and subsequently plotted for a section or strip of the field represented generally by a 2-D plane. As particularly shown in FIG. 6, a first set of roughness data has been plotted that provides an estimated first soil surface for the field (e.g., as indicated by solid line 300), in addition, a best-fit line has been fitted to the data to establish a baseline ground surface for the data (e.g., as indicated by dashed line 302). Similarly, as particularly shown in FIG. 7, a second set of roughness data has been plotted that provides an estimated second soil surface line (e.g., as indicated by solid line 304), with a best-fit line having been fitted to the data to establish a baseline ground surface for the data (e.g., as indicated by dashed line 306). Based on the baseline ground surface 302, 306 determined for each data set, a surface roughness value(s) for the field may be estimated by calculating the standard deviation of the heights or vertical distances 308, 310 defined between each data point along each soil surface line 300, 304 and the corresponding baseline ground surface 302, 306. As shown in FIG. 6, given the large variance in the plotted data relative to the baseline ground surface 302, it may be determined that the section of the field associated with the first set of data was relatively rough when the data was captured. Similarly, as shown in FIG. 7, given the significantly smaller variance in the plotted data relative to the baseline ground surface 306, it may be determined that the section of the field associated with the second set of data was relatively smooth when the data was captured.”
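For illustration, the roughness estimation that Posselius describes in para 0062 (fit a best-fit baseline to a 2-D strip of height data, then take the standard deviation of the vertical distances between each data point and that baseline) can be sketched as follows. This is a simplified sketch, not Posselius's actual implementation; the function name and the use of an ordinary least-squares line are assumptions:

```python
from statistics import pstdev

def surface_roughness(profile):
    """Roughness of a 2-D strip of (x, height) samples, per Posselius para
    0062: fit a least-squares baseline ground line to the data, then return
    the standard deviation of each point's vertical distance to the baseline.
    """
    n = len(profile)
    mx = sum(x for x, _ in profile) / n
    mz = sum(z for _, z in profile) / n
    # Ordinary least-squares slope and intercept of the baseline ground line.
    slope = (sum((x - mx) * (z - mz) for x, z in profile)
             / sum((x - mx) ** 2 for x, _ in profile))
    intercept = mz - slope * mx
    # Vertical distances between each soil-surface point and the baseline.
    residuals = [z - (slope * x + intercept) for x, z in profile]
    return pstdev(residuals)
```

A large result corresponds to the "relatively rough" case of FIG. 6 (large variance about the baseline), and a small result to the "relatively smooth" case of FIG. 7.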
Posselius does not disclose: […] the data including a plurality of points; […]; rotate the data after filtering such that a surface of the field determined from the remaining points is substantially horizontal; sub-divide the data into a plurality of subsections after rotating; determine both a high point associated with a given high height-percentile and a low point associated with a given low height-percentile of the remaining points within each of the plurality of subsections, the given high height-percentile being higher than the given low height-percentile; […].
However, Qadir teaches: rotate the data after filtering such that a surface of the field determined from the remaining points is substantially horizontal […]. See [Qadir, pg. 1, para 0003], which explains that the method includes using 3D point cloud data representative of a plane, where an upper plane represents the height data and the lower plane represents the ground, where a processor transforms the data so that the planes are in the same coordinate system, “[…] collecting stereo vision data to determine a crop three-dimensional representation or upper point cloud of an upper portion (e.g., 3D coordinates of points lying on a top, upper surface, or upper plane) of a crop canopy in a front region in front of the vehicle. An elevation estimator is configured to estimate observed ground elevation data based on current observed position data of a location-determining receiver, […]. […]. Further, the elevation estimator is configured to determine an average or weighted average of the observed ground elevation and the stored ground elevation for the aligned prior position data and current observed position data. A lower point cloud estimator is configured to estimate a ground three-dimensional representation or lower point cloud (e.g., ground plane or lower surface lower point cloud of virtual points) of the ground based on the determined average. An electronic data processor is configured to transform or align the collected stereo vision data to a common coordinate system of the estimated ground position data.” Also see [Qadir, pg. 2, para 0029], which further explains that the module calibrates the image data by transforming or aligning the stereo images, including using the same rotational angle, “The stereo-calibration-and-processing module 304 is configured to calibrate the orientation (e.g., angular rotation in two or more dimensions) and alignment […] of a first image […] and second image […] of an image pair of one or more collected stereo images. 
For example, the first image and the second image are aligned to have the same rotational angle in two or more dimensions. In one embodiment, the stereo-calibration-and-processing module 304 is configured to account for distortion (e.g., relative scaling or magnification, and skew) associated with a pair of images […] that are used in the stereo camera or imaging device 301 because there may be variation […] in the optical system and optical path of the first image and the second image of the imaging device 301. Further, the first image and the second image are aligned (e.g., vertically or longitudinally) to have a common reference frame (e.g., common image plane) of the imaging device 301,” and [Qadir, pg. 3, para 0029], which further explains that the transformation module can apply calibration to the 3D point cloud data, “The camera to world transformation module 310 is configured to transform one or more point clouds (e.g., preliminary three-dimensional point clouds), such as a lower point cloud (representative of the ground) and an upper point cloud (representative of the crop canopy) in the same region of interest, of the common reference frame of the imaging device 301 to (e.g., refined three-dimensional point clouds) in a real world reference frame.”
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Qadir to transform the data using an angle based on the ground plane and height plane. Doing so allows the datapoints to be analyzed in a common coordinate system [Qadir, pg. 1, para 0003], which is important because the agricultural vehicle may be operating on a slope or at an angle, which can provide incorrect measurements [Qadir, pg. 1, para 0002].
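For illustration, the claimed rotation (rotating the filtered data so that the determined field surface is substantially horizontal) can be sketched in a simplified 2-D analogue: fit a ground line, compute its angle to the horizontal, and rotate the points by the negative of that angle. This is a hypothetical sketch, not Qadir's implementation, and the function name and least-squares fit are assumptions:

```python
import math

def level_profile(profile):
    """Rotate a 2-D strip of (x, height) points so that the fitted ground
    line becomes horizontal (a 2-D analogue of the claimed rotation)."""
    n = len(profile)
    mx = sum(x for x, _ in profile) / n
    mz = sum(z for _, z in profile) / n
    slope = (sum((x - mx) * (z - mz) for x, z in profile)
             / sum((x - mx) ** 2 for x, _ in profile))
    # Angle between the fitted ground line and the horizontal plane.
    angle = math.atan(slope)
    c, s = math.cos(-angle), math.sin(-angle)
    # Rotate every point by -angle so the ground line lies flat.
    return [(x * c - z * s, x * s + z * c) for x, z in profile]
```

After leveling, height comparisons within and across subsections are made in a common, horizontal reference frame, consistent with the rationale above.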
However, Darr teaches: […] the data including a plurality of points. See [Darr, pgs. 2-3, para 0038], which explains that the 3D sensor receives data in the form of pixels for matching and comparison and generation of a 3D profile, “As depicted in FIG. 8, in examples in which the 3D sensor 202 includes cameras, the computing device can generate a 3D profile of the scene based on captured images from the cameras. For example, the computing device can apply triangulation to the captured images. Triangulation can include matching a pixel from a first image taken by a first camera to an associated pixel in a second image taken by a second camera. The difference between the pixel location in the first image and the corresponding pixel location in the second image can be the disparity. The computing device can generate an image 804 depicting the disparities between pixels in the captured images. Based on each pixel location in the first image, the associated disparity, and a distance in real space (e.g., in meters) between the first camera and the second camera, the computing device can determine a 3D location in real space associated with each pixel. The computing device can use the 3D locations in real space to generate a 3D profile 806 corresponding to the camera data.”
Darr further teaches: sub-divide the data into a plurality of subsections after rotating. See [Darr, pg. 3, para 0040], which explains that the computing device can segment the corrected data, “For example, the computing device can use the characteristic associated with the material and/or the characteristic associated with the elevator to divide a captured image 800 into different segmented regions 802. In some examples, the computing device can use the 3D profile 806 to improve the division of the captured images 800 into different segmented regions 802. As shown in FIG. 9, upon dividing the image into segmented regions 802, the computing device can generate corrected segmented regions 904. The corrected segmented regions 904 can include versions of each of the segmented regions 802 that have been corrected for illumination, scale, and perspective to improve the invariance of these factors. The computing device can include a feature extraction module 906 configured to extract features 908 from each of the corrected segmented regions 904. In some examples, the feature extraction module 906 can apply one or more filters to the corrected segmented region, change the dimensions of a corrected segmented region, or account for non-linearities in the corrected segmented region.”
Darr further teaches: determine both a high point associated with a given high height-percentile and a low point associated with a given low height-percentile of the remaining points within each of the plurality of subsections, the given high height-percentile being higher than the given low height-percentile. See [Darr, pg. 6, para 0072], which explains that the system determines a volume using the known area of the cell and the measured height, “If the yield measurement system is not in a calibration mode, the process 700 can continue to block 712. In block 712, the processing device 602 determines the volume of the material based on the filtered 3D map. For example, the processing device 602 can subtract a calibration value associated with the plane of the elevator back plate from the robust maximum height (e.g., determined in block 704) of the cell. This can produce the height of the material in the cell. Because each cell can have a known area (e.g., 1.25 cm.times.1.25 cm=1.5625 cm.sup.2 area), the processing device 602 can multiply the height of the material in the cell by the area of the cell to determine the volume of the material in the cell. The processing device 602 can repeat this process for all of the cells to determine the total instantaneous volume of the material on the elevator.” Also see [Darr, pg. 9, para 0102], which further explains that the 3D map includes an upper and lower mapping associated with a high percentile of 90% with respect to ground and a low percentile of 10% with respect to ground, “In block 1606, the processing device 602 determines an upper 3D map and a lower 3D map. The lower 3D map can include a 3D map associated with features in the plane at or near ground level. The upper 3D map can include a 3D map associated with features in the plane along or near the top of the material (e.g., the top of the stubble).
The processing device 602 can determine the upper 3D map and the lower 3D map based on the 3D map (e.g., generated in block 1604). The processing device 602 can determine the upper 3D map using a high percentile (e.g., 90%) of the height of the 3D data points with respect to ground. The processing 602 can determine the lower 3D map using a low percentile (e.g., 10%) of the height of the 3D data points with respect to ground.”
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Darr to include dividing the data points and determining high and low points among the data points using a height percentile. Doing so allows for additional calibration factors [Darr, pg. 9, para 0107] that can be used for determining implement controls [Darr, pg. 9, para 0109]. This includes segmenting the map using height boundaries to account for features at the top of the field, or material, and at the bottom of the material at or near ground level [Darr, pg. 9, paras 0100-0102], which allows for extracting features and further filtering the data [Darr, pg. 9, para 0109], such as airborne material [Darr, pg. 8, para 0098]. Finally, this includes calibrations such as multi-dimensional spatial filtering that can support improving mapping and smoothing mapping among map zones [Darr, pg. 7, para 0081].
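For illustration, the combined teaching (subdivide the data into subsections, then take a high and a low point of each subsection's heights at given percentiles) can be sketched as follows. This is a hypothetical sketch; the function name, cell size, default percentiles, and the simple sort-and-index percentile are assumptions rather than Darr's or the application's implementation:

```python
def subsection_extremes(points, cell=1.0, hi_pct=0.95, lo_pct=0.05):
    """Bin (x, y, z) points into equal square subsections in the x-y plane,
    then report a high point (hi_pct height-percentile) and a low point
    (lo_pct height-percentile) of the heights within each subsection."""
    cells = {}
    for x, y, z in points:
        # Equal-sized square subsections, keyed by grid indices.
        cells.setdefault((int(x // cell), int(y // cell)), []).append(z)
    extremes = {}
    for key, zs in cells.items():
        zs.sort()
        # Simple nearest-rank percentiles of the sorted heights.
        hi = zs[min(int(hi_pct * len(zs)), len(zs) - 1)]
        lo = zs[int(lo_pct * len(zs))]
        extremes[key] = (hi, lo)
    return extremes
```

Using percentiles rather than the absolute maximum and minimum makes each subsection's high/low pair robust to residual outliers, consistent with Darr's "robust maximum height" and 90%/10% mappings.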
Regarding Claim 4, Posselius as modified discloses the limitations of Claim 1.
Posselius further discloses: […] fitting […] to the remaining points; […]. See [Posselius, pg. 8, para 0062], which explains that after the surface roughness data has been preprocessed and plotted as a 2D plane, the data is fit to a line based on the ground surface, “In each data plot shown in FIGS. 6 and 7, the surface roughness data has been preprocessed (e.g., by applying a suitable sensor calibration and removing outliers) and subsequently plotted for a section or strip of the field represented generally by a 2-D plane. As particularly shown in FIG. 6, a first set of roughness data has been plotted that provides an estimated first soil surface for the field (e.g., as indicated by solid line 300), in addition, a best-fit line has been fitted to the data to establish a baseline ground surface for the data (e.g., as indicated by dashed line 302). Similarly, as particularly shown in FIG. 7, a second set of roughness data has been plotted that provides an estimated second soil surface line (e.g., as indicated by solid line 304), with a best-fit line having been fitted to the data to establish a baseline ground surface for the data (e.g., as indicated by dashed line 306). Based on the baseline ground surface 302, 306 determined for each data set, a surface roughness value(s) for the field may be estimated by calculating the standard deviation of the heights or vertical distances 308, 310 defined between each data point along each soil surface line 300, 304 and the corresponding baseline ground surface 302, 306.”
However, Qadir teaches: […fitting] a plane […]; determining an angle between the plane and a horizontal plane; and rotating the plane based at least in part on the angle. See again [Qadir, pg. 1, para 0003], which explains that the method includes using 3D point cloud data representative of a plane, where an upper plane represents the height data and the lower plane represents the ground, where a processor transforms the data so that the planes are in the same coordinate system. Also see again [Qadir, pg. 2, para 0029], which further explains that the module calibrates the image data by transforming or aligning the stereo images, including using the same rotational angle and [Qadir, pg. 3, para 0029], which further explains that the transformation module can apply calibration to the 3D point cloud data.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Qadir to transform the data using an angle based on the ground plane and height plane. Doing so allows the datapoints to be analyzed in a common coordinate system [Qadir, pg. 1, para 0003], which is important because the agricultural vehicle may be operating on a slope or at an angle, which can provide incorrect measurements [Qadir, pg. 1, para 0002].
Regarding Claim 5, Posselius as modified discloses the limitations of Claim 1.
Posselius further discloses: […] wherein the computing system is configured to sub-divide the data along both a direction of travel of the agricultural implement and along a lateral direction of the agricultural implement into the plurality of subsections. See again [Posselius, pg. 3, para 0030], which further explains that the implement includes one or more soil roughness sensors with a field of view toward the portion of the field being worked for detecting surface roughness of the field and [Posselius, pgs. 3-4, para 0031], which explains that the surface roughness can be detected for sections or portions of a field or the whole field, “By scanning the pulsed light over a given swath width, the surface roughness of the soil may be detected across a given section of the field. Thus, by continuously scanning the pulsed light along the soil surface as the work vehicle 10 and the implement 12 are moved across the field, a point cloud may be generated that includes surface roughness data for all or a portion of the field.” Also see [Posselius, pg. 4, para 0033], which further explains that the data is collected in a plane or line along the direction of travel of the work vehicle and perpendicular to the work vehicle and used to determine soil roughness, “In several embodiments, two or more soil roughness sensors 104 may be provided in operative association with the work vehicle 10 and/or the implement 12. For instance, as shown in FIGS. 1 and 2, in one embodiment, a first soil roughness sensor 104A may be provided at a forward end 70 (FIG. 3) of the work vehicle 10 to allow the sensor 104A to capture data associated with the soil roughness of an adjacent first section 106 of the field disposed in front of the work vehicle 10. 
For instance, for each detection event, the first soil roughness sensor 104A may be configured to capture soil roughness data along a plane or reference line that extends generally perpendicular to the travel direction 34 of the work vehicle 10 directly in front of the vehicle 10. Similarly, as shown in FIGS. 1 and 2, a second soil roughness sensor 104B may be provided at or adjacent to an aft end 76 (FIG. 3) of the implement 12 to allow the sensor 104B to capture data associated with the soil roughness of an adjacent second section 108 of the field disposed behind the implement 12. For instance, for each detection event, the second soil roughness sensor 104B may be configured to capture soil roughness data along a plane or reference line that extends generally perpendicular to the travel direction 34 of the work vehicle 10 at a location directly behind the implement 12.”
Regarding Claim 6, Posselius as modified discloses the limitations of Claim 1.
Posselius does not disclose: […] wherein the plurality of subsections are equally sized.
However, Darr teaches: […] wherein the plurality of subsections are equally sized. See [Darr, pg. 6, para 0067], which describes dividing the 3D profile into a grid of equally sized square cells, “In some examples, upon generating the 3D profile, the processing device 602 can change the coordinate system of the 3D profile to a frame aligned with the elevator plane, such that the y-axis follows along the paddle direction, the z-axis points from the elevator to the 3D sensor 202 (e.g., a camera in the 3D sensor), and X is orthogonal to the y-axis and the z-axis. The origin of the frame can be centered in the detectable area of the 3D sensor 202 (e.g., within the view of a camera of the 3D sensor 202). The processing device 602 can divide the 3D profile into a grid of squares (e.g., 1.25 cm by 1.25 cm in size) associated with the plane of the elevator. The processing device 602 can discard the points that fall outside of the grid of squares. For each square in the grid of squares, the processing device 602 can determine the robust maximum height (e.g., using a median filter) relative to the plane of the elevator. Based on the grid of squares and the robust maximum heights, the processing device 602 can generate the 3D map. Each cell in the 3D map can have an associated robust maximum height.”
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Darr to break down the data into squares of equal size. Doing so allows the system to generate data regardless of the calibration status since the area is a known, consistent value [Darr, pg. 6, para 0072]. Additionally, segmenting the data evenly can contribute to calibrations for multi-dimensional spatial filtering that can support improved mapping and smoothing among map zones [Darr, pg. 7, para 0081].
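For illustration only, and not part of the record, the equal-square gridding with a per-cell robust maximum that Darr describes can be sketched in Python; the cell size default and the use of a median over the top few returns as the “robust maximum” are assumptions drawn loosely from the quoted passage:

```python
import numpy as np

def cell_heights(points, cell_size=0.0125):
    """points: (N, 3) array of x, y, z coordinates.

    Returns a dict mapping an (ix, iy) equal-square cell index to a
    robust maximum height for that cell (here, the median of the top
    five z-values, standing in for Darr's median-filter step)."""
    ix = np.floor(points[:, 0] / cell_size).astype(int)
    iy = np.floor(points[:, 1] / cell_size).astype(int)
    cells = {}
    for key in set(zip(ix.tolist(), iy.tolist())):
        mask = (ix == key[0]) & (iy == key[1])
        z = np.sort(points[mask, 2])
        cells[key] = float(np.median(z[-5:]))  # robust maximum per cell
    return cells
```

Because each occupied cell reports one robust height rather than a raw maximum, a single spurious return (e.g., an airborne particle) has limited influence on the cell value.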
Regarding Claim 8, Posselius as modified discloses the limitations of Claim 1.
Posselius does not disclose: […] wherein the high height-percentile is from about a 95th height-percentile to about a 100th height-percentile, wherein the low height-percentile is from about a 0th height-percentile to about a 5th height-percentile.
However, Darr teaches: […] wherein the high height-percentile is from about a 95th height-percentile to about a 100th height-percentile, wherein the low height-percentile is from about a 0th height-percentile to about a 5th height-percentile. See again [Darr, pg. 9, para 0102], which further explains that the 3D map includes an upper and lower mapping associated with a high percentile of 90% and a low percentile of 10%, both with respect to ground.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Darr to include a high height percentile of 90% and a low height percentile of 10%. Doing so allows the 3D map to be segmented using height boundaries to account of features at the top of the field, or material, and at the bottom of the material at or near ground level [Darr, pg. 9, paras 0100-0102], which allows for extracting features and further filtering the data [Darr, pg.9 , para 0109], such as airborne material [Darr, pg. 8, para 0098].
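As an illustrative sketch only, the claimed high and low height-percentiles per subsection reduce to percentile queries over each subsection's heights; the 95/5 defaults below track the claim language, while Darr's example uses 90/10:

```python
import numpy as np

def high_low_points(z, high_pct=95.0, low_pct=5.0):
    """z: array of heights for the remaining points in one subsection.

    Returns (high, low), the heights at the given high and low
    height-percentiles. The percentile values are parameters, so the
    claimed ~95th-100th / ~0th-5th ranges and Darr's 90/10 example
    are both reachable."""
    return float(np.percentile(z, high_pct)), float(np.percentile(z, low_pct))
```

The difference `high - low` within a subsection is then a per-subsection measure of vertical spread, which is what the roughness determination in the claims operates on.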
Regarding Claim 9, Posselius as modified discloses the limitations of Claim 1.
Posselius further discloses: […] wherein the sensor is a Light Detection and Ranging (LIDAR) sensor. See [Posselius, pgs. 3-4, para 0031], which explains that the soil roughness sensor can include a LIDAR device or scanner, “In general, the non-contact soil roughness sensor(s) 104 may correspond to any suitable sensing device(s) configured to detect or capture data associated with the surface roughness of the soil. For instance, in several embodiments, the soil roughness sensor(s) 104 may correspond to a Light Detection and Ranging (“LIDAR”) device(s), such as a LIDAR scanner(s).”
Regarding Claim 10, Posselius as modified discloses the limitations of Claim 1.
Posselius further discloses: […] wherein the at least one ground engaging tool comprises at least one of a shank, a disk blade, leveling blades, or a basket assembly. See again [Posselius, pg. 3, para 0023], which explains that the work vehicle tows an implement which is engaged with the field as the implement is towed across the field and can include shanks, disk blades, leveling blades, basket assemblies, tines or spikes.
Regarding Claim 11, Posselius discloses: An agricultural method for monitoring surface roughness within a field during an agricultural operation with an agricultural implement having at least one ground engaging tool, the at least one ground engaging tool being configured to engage the field to perform the agricultural operation as the agricultural implement moves across the field, […]. See again [Posselius, pg. 3, para 0023], which explains that the work vehicle tows an implement which is engaged with the field as the implement is towed across the field and can include shanks, disk blades, leveling blades, basket assemblies, tines or spikes.
Posselius further discloses: […] the agricultural method comprising: receiving, with a computing system, data generated by a sensor having a field of view directed toward a portion of the field worked by the agricultural implement during the agricultural operation, the data being indicative of the portion of the field, […]; filtering, with the computing system, the data to remove points indicative of dust from remaining points of the plurality of points; […]; determining, with the computing system, a surface roughness of the field based at least in part on the high point and the low point within each of the plurality of subsections; and performing, with the computing system, a control action associated with the agricultural implement based at least in part on the surface roughness of the field. See again [Posselius, pg. 3, para 0030], which further explains that the implement includes one or more soil roughness sensors with a field of view toward the portion of the field being worked for detecting surface roughness of the field. Also see again [Posselius, pg. 5, para 0040], which explains that the computing device contains a processor and memory, such as a computer-readable non-volatile medium. Also see again [Posselius, pg. 6, para 0046], which explains that modules analyze the sensor data to estimate surface roughness by correcting, adjusting, and analyzing the data, including filtering to remove outliers. Also see again [Posselius, pgs. 6-7, para 0049], which further explains that filtering out or removing outliers includes removing non-ground related datapoints such as dust. Finally see again [Posselius, pgs. 3-4, para 0031], which explains that the sensor data, such as from a LIDAR scanner, collects data associated with the surface roughness of the surface or soil across a field or a portion of the field and [Posselius, pg. 8, para 0062], which explains that after the data is plotted and fit, the roughness can be determined using the height of the data corresponding to the ground surface, where a section of the field can be determined as rough or smooth. Also see [Posselius, pg. 2, para 0019], which explains that in response to the surface roughness measurements the ground-engaging implement can be controlled, “Thereafter, if it is determined that the effectiveness of the implement is deficient (e.g., due to surface roughness differential differing from a given target value or falling outside a given target range), the controller may be configured to automatically adjust the operation of the work vehicle and/or the implement in a manner designed to modify the effectiveness of the implement in decreasing or increasing the surface roughness of the soil, as desired. For example, the controller may be configured to automatically adjust the ground speed of the implement and/or adjust a ground-engaging parameter(s) associated with one or more ground-engaging tools of the implement (e.g., a penetration depth and/or a down pressure for one or more of the ground-engaging tools).”
Posselius does not disclose: […] the data including a plurality of points; […]; rotating, with the computing system, the data after filtering such that a surface of the field determined from the remaining points is substantially horizontal; sub-dividing, with the computing system, the data into a plurality of subsections after rotating; determining, with the computing system, both a high point associated with a given high height-percentile and a low point associated with a given low height-percentile of the remaining points within each of the plurality of subsections, the given high height-percentile being higher than the given low height-percentile; […].
However, Qadir teaches: […] rotating, with the computing system, the data after filtering such that a surface of the field determined from the remaining points is substantially horizontal […]. See again [Qadir, pg. 1, para 0003], which explains that the method includes using 3D point cloud data representative of a plane, where an upper plane represents the height data and the lower plane represents the ground, where a processor transforms the data so that the planes are in the same coordinate system. Also see again [Qadir, pg. 2, para 0029], which further explains that the module calibrates the image data by transforming or aligning the stereo images, including using the same rotational angle and [Qadir, pg. 3, para 0029], which further explains that the transformation module can apply calibration to the 3D point cloud data.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Qadir to transform the data using an angle based on the ground plane and height plane. Doing so allows the datapoints to be analyzed in a common coordinate system [Qadir, pg. 1, para 0003], which is important because the agricultural vehicle may be operating on a slope or at an angle, which can provide incorrect measurements [Qadir, pg. 1, para 0002].
However, Darr teaches: […] the data including a plurality of points; […]; […]; sub-dividing, with the computing system, the data into a plurality of subsections after rotating; determining, with the computing system, both a high point associated with a given high height-percentile and a low point associated with a given low height-percentile of the remaining points within each of the plurality of subsections, the given high height-percentile being higher than the given low height-percentile; […]. See again [Darr, pgs. 2-3, para 0038], which explains that the 3D sensor receives data in the form of pixels for matching and comparison and generation of a 3D profile. Also see again [Darr, pg. 3, para 0040], which explains that the computing device can segment the corrected data. Finally see again [Darr, pg. 6, para 0072], which explains that the system determines a volume using the known area of the cell and the measured height and [Darr, pg. 9, para 0102], which further explains that the 3D map includes an upper and lower mapping associated with a high percentile of 90% and a low percentile of 10%, both with respect to ground.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Darr to include dividing the data points and determining high and low points among the data points using a height percentile. Doing so allows for additional calibration factors [Darr, pg. 9, para 0107] that can be used for determining implement controls [Darr, pg. 9, para 0109]. This includes segmenting the map using height boundaries to account for features at the top of the field, or material, and at the bottom of the material at or near ground level [Darr, pg. 9, paras 0100-0102], which allows for extracting features and further filtering the data [Darr, pg. 9, para 0109], such as airborne material [Darr, pg. 8, para 0098]. Finally, this includes calibrations such as multi-dimensional spatial filtering that can support improved mapping and smoothing among map zones [Darr, pg. 7, para 0081].
Regarding Claim 14, Posselius as modified discloses the limitations of Claim 11.
Posselius further discloses: […] fitting […] to the remaining points […]. See again [Posselius, pg. 8, para 0062], which explains that after the surface roughness data has been preprocessed and plotted as a 2D plane, the data is fit to a line based on the ground surface.
Posselius does not disclose: […] […fitting] a plane […]; determining an angle between the plane and a horizontal plane; and rotating the plane based at least in part on the angle.
However, Qadir teaches: […] […fitting] a plane […]; determining an angle between the plane and a horizontal plane; and rotating the plane based at least in part on the angle. See again [Qadir, pg. 1, para 0003], which explains that the method includes using 3D point cloud data representative of a plane, where an upper plane represents the height data and the lower plane represents the ground, where a processor transforms the data so that the planes are in the same coordinate system. Also see again [Qadir, pg. 2, para 0029], which further explains that the module calibrates the image data by transforming or aligning the stereo images, including using the same rotational angle and [Qadir, pg. 3, para 0029], which further explains that the transformation module can apply calibration to the 3D point cloud data.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Qadir to transform the data using an angle based on the ground plane and height plane. Doing so allows the datapoints to be analyzed in a common coordinate system [Qadir, pg. 1, para 0003], which is important because the agricultural vehicle may be operating on a slope or at an angle, which can provide incorrect measurements [Qadir, pg. 1, para 0002].
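The fit/angle/rotate sequence recited in Claim 14 can be sketched as follows; this is an illustrative least-squares plane fit followed by a Rodrigues rotation, not Qadir's actual implementation:

```python
import numpy as np

def level_point_cloud(points):
    """Fit a plane z = a*x + b*y + c to the remaining points by least
    squares, then rotate the cloud so the fitted ground plane becomes
    horizontal. points: (N, 3) array of x, y, z."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    (a, b, c), *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    normal = np.array([-a, -b, 1.0])
    normal /= np.linalg.norm(normal)       # unit normal of fitted plane
    z_axis = np.array([0.0, 0.0, 1.0])
    v = np.cross(normal, z_axis)           # rotation axis
    s = np.linalg.norm(v)
    cos_t = float(normal @ z_axis)         # cosine of the tilt angle
    if s < 1e-12:                          # plane already horizontal
        return points.copy()
    vx = np.array([[0, -v[2], v[1]],
                   [v[2], 0, -v[0]],
                   [-v[1], v[0], 0]])
    # Rodrigues rotation taking the plane normal onto the vertical axis
    R = np.eye(3) + vx + vx @ vx * ((1 - cos_t) / s**2)
    return points @ R.T
```

After leveling, height differences within the cloud reflect soil relief rather than vehicle or slope tilt, which is the stated motivation for the Qadir combination.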
Regarding Claim 15, Posselius as modified discloses the limitations of Claim 11.
Posselius further discloses: […] wherein sub-dividing the data comprises sub-dividing the data along both a direction of travel of the agricultural implement and along a lateral direction of the agricultural implement into the plurality of subsections. See again [Posselius, pg. 3, para 0030], which further explains that the implement includes one or more soil roughness sensors with a field of view toward the portion of the field being worked for detecting surface roughness of the field and [Posselius, pgs. 3-4, para 0031], which explains that the surface roughness can be detected for sections or portions of a field or the whole field. Also see again [Posselius, pg. 4, para 0033], which further explains that the data is collected in a plane or line along the direction of travel of the work vehicle and perpendicular to the work vehicle and used to determine soil roughness.
Regarding Claim 16, Posselius as modified discloses the limitations of Claim 11.
Posselius does not disclose: […] wherein the plurality of subsections are equally sized.
However, Darr teaches: […] wherein the plurality of subsections are equally sized. See again [Darr, pg. 6, para 0067], which describes the 3D coordinate system being broken down into a grid of equally sized square cells.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Darr to break down the data into squares of equal size. Doing so allows the system to generate data regardless of the calibration status since the area is a known, consistent value [Darr, pg. 6, para 0072]. Additionally, segmenting the data evenly can contribute to calibrations for multi-dimensional spatial filtering that can support improved mapping and smoothing among map zones [Darr, pg. 7, para 0081].
Regarding Claim 18, Posselius as modified discloses the limitations of Claim 11.
Posselius does not disclose: […] wherein the high height-percentile is from about a 95th height-percentile to about a 100th height-percentile, wherein the low height-percentile is from about a 0th height-percentile to about a 5th height-percentile.
However, Darr teaches: […] wherein the high height-percentile is from about a 95th height-percentile to about a 100th height-percentile, wherein the low height-percentile is from about a 0th height-percentile to about a 5th height-percentile. See again [Darr, pg. 9, para 0102], which further explains that the 3D map includes an upper and lower mapping associated with a high percentile of 90% and a low percentile of 10%, both with respect to ground.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Darr to include a high height percentile of 90% and a low height percentile of 10%. Doing so allows the 3D map to be segmented using height boundaries to account for features at the top of the field, or material, and at the bottom of the material at or near ground level [Darr, pg. 9, paras 0100-0102], which allows for extracting features and further filtering the data [Darr, pg. 9, para 0109], such as airborne material [Darr, pg. 8, para 0098].
Regarding Claim 19, Posselius as modified discloses the limitations of Claim 11.
Posselius further discloses: […] wherein performing the control action comprises controlling an operation of the agricultural implement based at least in part on the surface roughness of the field. See again [Posselius, pg. 2, para 0019], which explains that in response to the surface roughness measurements the ground-engaging implement can be controlled.
Regarding Claim 20, Posselius as modified discloses the limitations of Claim 11.
Posselius does not disclose: […] wherein performing the control action comprises controlling an operation of a user interface associated with the agricultural implement to indicate the surface roughness of the field.
However, Darr teaches: […] wherein performing the control action comprises controlling an operation of a user interface associated with the agricultural implement to indicate the surface roughness of the field. See [Darr, pg. 5, para 0058], which explains that the computing device and 3D sensor can communicate with an input/output device, “The computing device 612 can include an input/output interface 610. The I/O interface 610 can be used to facilitate a connection to hardware used to input data […] or output data […]. For example, the I/O interface 610 can be in wired or wireless communication with a 3D sensor 202 […], a lighting system 204 […], and other sensors 618 […]. In some examples, the computing device 612 can be in wired or wireless communication with the 3D sensor 202, paddle sensors 206, lighting system 204, and other sensors 618 via the network interface 620.” Also see [Darr, pg. 7, para 0083], which explains that the data can be presented to the operator through the user interface, including a map, “In some examples, the yield data and machinery mass productivity can be presented to the operator through a user interface (e.g., a dynamic user interface). For example, a yield map and machinery mass productivity measurements can be presented through a user interface that is output on a display that is in communication with the computing device 612.” Finally see [Darr, pg. 9, para 0111], which explains that the map generated by the implement can be displayed for the user, “FIG. 19 is an example of a geographic map 1900 output by a base cutter height control system according to one example. In some examples, the geographic map 1900 can be presented through a user interface that is output on a display.”
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Darr to include displaying the data to the user. Doing so allows the user to visualize the data and provide operational feedback [Darr, pg. 9, para 0111] to support optimization to improve equipment logistics and productivity [Darr, pg. 7, para 0083].
Claims 2-3 and 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Posselius in view of Darr and Qadir, further in view of He et al., KR-20220130050-A (herein “He”).
Regarding Claim 2, Posselius as modified discloses the limitations of Claim 1.
Posselius does not disclose: […] wherein the computing system is configured to filter the data by applying a height filter to the data, the height filter having a height threshold, the points indicative of dust being above the height threshold and the remaining points being below the height threshold.
However, He teaches: […] wherein the computing system is configured to filter the data by applying a height filter to the data, the height filter having a height threshold, the points indicative of dust being above the height threshold and the remaining points being below the height threshold.
See [He, pg. 3, paras 0031-0032], which explains that the work area, divided into grids, can include an average altitude and topographic altitude, where the map can include LIDAR data of dust and other noise, where the ground elevation is tracked, “[0031] Secondly, the work area is divided into multiple grids, and by using the point cloud data collected within a certain time passed by the rider, the average altitude of the point cloud within each grid is calculated, and the topographic altitude of the grid is obtained, and the grid altitude map is drawn. create However, a process shop floor usually has a lot of dust (ground dust, smog, etc.), and the dust is scanned by the lidar, causing many noise points to be included in the point cloud data. […]. [0032] Third, multi-frame environment depth images are collected using a depth camera, and based on each frame depth image, the ground elevation value of each grid is updated in real time using the Kalman filter algorithm, and the grid elevation map is updated.” Also see [He, pgs. 7-8, para 0073], which further explains that a threshold above the tracked ground elevation can be used to determine noise such as dust, where the dust floats above the ground and appears high above ground level, “According to some embodiments, in response to determining that the difference between the elevation coordinate of the input point and the elevation value of the grid is greater than the first threshold, the input point is determined as the noise point. The first threshold can be set to, for example, 0.5 m, and understandably, since the dust floats above the ground, the altitude of the noise point is usually higher than the ground level (ie the altitude value of the grid). By comparing the elevation coordinates of the input points with the elevation values of the grid, noise points can be quickly identified.” Finally see [He, pg. 9, para 0091], which further explains that the coordinate points of each datapoint can be compared to each other to identify abrupt changes in elevation, which can be identified as dust, “In each curve diagram, the abscissa is the number of times (corresponding to time) input points (P(x, y, z)) to the grid, and the ordinate is the altitude (unit is m). The solid line means the elevation coordinate (z coordinate) of the input point P, and the dotted line means the height of the grid. As shown in Fig. 4, the method of the embodiment of the present disclosure can effectively filter out the abruptly changed dust noise points (refer to the curvature gird_1, grid_2 and grid_3), and at the same time the normal elevation of the ground level (for example, adding raw materials) ) in a timely manner (see curvature gird_4).”
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with He to remove data indicative of dust using a filter. Doing so allows improving data accuracy where noise, such as dust, can result in low data accuracy [He, pg. 3, paras 0031-0032], where typically, dust floats at a higher altitude than true point measurements, resulting in higher altitudes [He, pgs. 7-8, para 0073] and abrupt changes in point coordinates [He, pg. 9, para 0091].
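He's first-threshold idea can be sketched as a simple height filter, for illustration only; the 0.5 m default is He's example value, and the tracked ground elevation is assumed to be known (He obtains it per grid cell via Kalman filtering):

```python
import numpy as np

def height_filter(points, ground_z, threshold=0.5):
    """points: (N, 3) array of x, y, z; ground_z: tracked ground
    elevation for this region.

    Splits the cloud into (remaining, dust): points more than
    `threshold` above the ground elevation are treated as dust,
    since dust floats above the ground surface."""
    dust = points[:, 2] - ground_z > threshold
    return points[~dust], points[dust]
```

The remaining points feed the subsequent rotation and subsection steps; the dust points are discarded before any roughness statistic is computed.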
Regarding Claim 3, Posselius as modified discloses the limitations of Claim 1.
Posselius does not disclose: […] wherein the computing system is configured to filter the data by identifying neighboring points of the plurality of points within a certain distance for each of the plurality of points, a number of the neighboring points for each of the points indicative of dust being less than a threshold number, a number of the neighboring points for each of the remaining points being above the threshold number.
However, He teaches: […] wherein the computing system is configured to filter the data by identifying neighboring points of the plurality of points within a certain distance for each of the plurality of points, a number of the neighboring points for each of the points indicative of dust being less than a threshold number, a number of the neighboring points for each of the remaining points being above the threshold number. See again [He, pg. 3, paras 0031-0032], which explains that the work area, divided into grids, can include an average altitude and topographic altitude, where the map can include LIDAR data of dust and other noise, where the ground elevation is tracked. Also see again [He, pg. 9, para 0091], which further explains that the coordinate points of each datapoint can be compared to each other to identify abrupt changes in elevation, which can be identified as dust.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with He to remove data indicative of dust using a filter. Doing so allows improving data accuracy where noise, such as dust, can result in low data accuracy [He, pg. 3, paras 0031-0032], where typically, dust floats at a higher altitude than true point measurements, resulting in higher altitudes [He, pgs. 7-8, para 0073] and abrupt changes in point coordinates [He, pg. 9, para 0091].
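The neighbor-count filtering recited in Claims 3 and 13 amounts to a radius outlier removal over the point cloud; the brute-force sketch below is illustrative only and is not drawn from He, which instead relies on abrupt elevation changes:

```python
import numpy as np

def radius_outlier_filter(points, radius=0.1, min_neighbors=3):
    """points: (N, 3) array of x, y, z.

    Counts, for each point, the neighboring points within `radius`.
    Points with fewer than `min_neighbors` neighbors (sparse returns
    such as airborne dust) are split off from the remaining points.
    Brute-force O(N^2); a KD-tree would be used in practice."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    counts = (d < radius).sum(axis=1) - 1  # exclude the point itself
    keep = counts >= min_neighbors
    return points[keep], points[~keep]
```

Ground returns sit on a dense surface and thus have many close neighbors, while dust returns are scattered, which is the intuition behind the claimed below-threshold/above-threshold split.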
Regarding Claim 12, Posselius as modified discloses the limitations of Claim 11.
Posselius does not disclose: […] wherein filtering the data comprises applying a height filter to the data, the height filter having a height threshold, the points indicative of dust being above the height threshold and the remaining points being below the height threshold.
However, He teaches: […] wherein filtering the data comprises applying a height filter to the data, the height filter having a height threshold, the points indicative of dust being above the height threshold and the remaining points being below the height threshold. See again [He, pg. 3, paras 0031-0032], which explains that the work area, divided into grids, can include an average altitude and topographic altitude, where the map can include LIDAR data of dust and other noise, where the ground elevation is tracked. Also see again [He, pgs. 7-8, para 0073], which further explains that a threshold above the tracked ground elevation can be used to determine noise such as dust, where the dust floats above the ground and appears high above ground level. Finally see again [He, pg. 9, para 0091], which further explains that the coordinate points of each datapoint can be compared to each other to identify abrupt changes in elevation, which can be identified as dust.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with He to remove data indicative of dust using a filter. Doing so allows improving data accuracy where noise, such as dust, can result in low data accuracy [He, pg. 3, paras 0031-0032], where typically, dust floats at a higher altitude than true point measurements, resulting in higher altitudes [He, pgs. 7-8, para 0073] and abrupt changes in point coordinates [He, pg. 9, para 0091].
Regarding Claim 13, Posselius as modified discloses the limitations of Claim 11.
Posselius does not disclose: […] filtering the data comprises identifying neighboring points of the plurality of points within a certain distance for each of the plurality of points, a number of the neighboring points for each of the points indicative of dust being less than a threshold number, a number of the neighboring points for each of the remaining points being above the threshold number.
However, He teaches: […] filtering the data comprises identifying neighboring points of the plurality of points within a certain distance for each of the plurality of points, a number of the neighboring points for each of the points indicative of dust being less than a threshold number, a number of the neighboring points for each of the remaining points being above the threshold number. See again [He, pg. 3, paras 0031-0032], which explains that the work area, divided into grids, can include an average altitude and topographic altitude, where the map can include LIDAR data of dust and other noise, where the ground elevation is tracked. Also see again [He, pg. 9, para 0091], which further explains that the coordinate points of each datapoint can be compared to each other to identify abrupt changes in elevation, which can be identified as dust.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with He to remove data indicative of dust using a filter. Doing so allows improving data accuracy where noise, such as dust, can result in low data accuracy [He, pg. 3, paras 0031-0032], where typically, dust floats at a higher altitude than true point measurements, resulting in higher altitudes [He, pgs. 7-8, para 0073] and abrupt changes in point coordinates [He, pg. 9, para 0091].
Claims 7 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Posselius in view of Darr and Qadir, further in view of Morisue et al., JP-2006196879-A (herein “Morisue”).
Regarding Claim 7, Posselius as modified discloses the limitations of Claim 1.
Posselius further discloses: […] determining a total number of rough subsections of the plurality of subsections having a difference between a height of the high point and a height of the low point greater than a threshold difference; determining a total number of smooth subsections of the plurality of subsections having a difference between the height of the high point and the height of the low point less than the threshold difference. See again [Posselius, pgs. 3-4, para 0031], which explains that the sensor data, such as from a LIDAR scanner, collects data associated with the surface roughness of the surface or soil across a field or a portion of the field. Also see again [Posselius, pg. 8, para 0062], which explains that after the data is plotted and fit, the roughness can be determined using the height of the data corresponding to the ground surface, where a section of the field can be determined as rough or smooth.
Posselius does not disclose: […] determining a ratio between the total number of rough subsections and a sum of the total number of rough subsections and the total number of smooth subsections, the ratio being the surface roughness of the field.
However, Morisue teaches: […] determining a ratio between the total number of rough subsections and a sum of the total number of rough subsections and the total number of smooth subsections, the ratio being the surface roughness of the field. See [Morisue, pg. 6, lines 20-24], which explains that surface roughness is determined using a ratio of the rough surface and the total surface area, “The surface roughness may be measured with an atomic force microscope (AFM). Moreover, it is preferable that the surface area ratio of the rough surface by the uneven | corrugated shape with respect to a rough surface area | region (area which does not include the surface area increase by uneven | corrugated shape) is 1.5 or more.” Also see [Morisue, pg. 42, lines 11-28], which further explains that the surface roughness is calculated using height differences and a surface area, “The measurement was performed using an atomic force microscope (AFM) and the measurement range was 5 μm × 5 μm. The measurement results are shown in Table 2 and FIG. FIG. 32 is a three-dimensional extension of the average surface roughness of the surface of each sample (centerline average roughness defined in JIS B0601 is applicable to the surface: see FIG. 32A). ), Maximum height difference of surface irregularities (see FIG. 32B), mean square surface roughness (square root of the mean square of surface irregularities: see FIG. 32C), surface area ratio (arbitrary region) The ratio of the entire surface area including the surface area of the irregular shape in an arbitrary region to the area of the area (see FIG. 32D)) and the hot water immersion time of the sample are shown. […]. It can be confirmed that as the warm water immersion time is increased, the surface of the sample is roughened, and the surface is in a more rough state. This can be confirmed by the change in the surface area ratio shown in FIG. 32 (D), and the surface area ratio increases as the hot water immersion time increases. This is probably because the surface roughness increased and the surface area increased due to the uneven shape.”
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Morisue to use a ratio of the rough and smooth areas. Doing so contributes to a low-cost, high-efficiency solution that improves performance [Morisue, pg. 1, lines 31-34] by calculating and achieving a desired surface roughness [Morisue, pg. 4, lines 4-22].
Examiner’s Note: Examiner would like to note that the “total number of rough subsections” and the “sum of the total number of rough and smooth subsections” are being mapped to the “rough surface area” and the “total surface area,” respectively, as recited in Morisue. Furthermore, Examiner would like to note that although Morisue is directed towards semiconductor manufacturing, both inventors faced the same problem of understanding, or determining, surface roughness in order to control parameters for improved performance.
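For clarity of the mapping above, the claimed computation reduces to the ratio N_rough / (N_rough + N_smooth), where a subsection is counted as rough when the difference between the heights of its high point and low point exceeds a threshold. The following is a minimal illustrative sketch of that arithmetic only; all names, heights, and the threshold value are hypothetical and are not taken from Posselius or Morisue:

```python
def surface_roughness(subsections, threshold):
    """Count subsections whose high/low height spread exceeds the
    threshold (rough) versus those that do not (smooth), then return
    the claimed ratio: rough / (rough + smooth)."""
    rough = sum(1 for high, low in subsections if (high - low) > threshold)
    smooth = len(subsections) - rough
    return rough / (rough + smooth)

# Hypothetical (high, low) heights per subsection, in meters
sections = [(0.30, 0.05), (0.12, 0.10), (0.40, 0.02), (0.11, 0.09)]
ratio = surface_roughness(sections, threshold=0.10)  # 2 of 4 rough -> 0.5
```

Note that because every subsection is classified as either rough or smooth, the denominator equals the total number of subsections, which parallels the “total surface area” in Morisue.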
Regarding Claim 17, Posselius as modified discloses the limitations of Claim 11.
Posselius discloses: […] determining, with the computing system, a total number of rough subsections of the plurality of subsections having a difference between a height of the high point and a height of the low point greater than a threshold difference; determining, with the computing system, a total number of smooth subsections of the plurality of subsections having a difference between the height of the high point and the height of the low point less than the threshold difference; […]. See again [Posselius, pgs. 3-4, para 0031], which explains that the sensor data, such as from a LIDAR scanner, collects data associated with the surface roughness of the surface or soil across a field or a portion of the field. Also see again [Posselius, pg. 8, para 0062], which explains that after the data is plotted and fit, the roughness can be determined using the height of the data corresponding to the ground surface, where a section of the field can be determined as rough or smooth.
Posselius does not disclose: […] determining, with the computing system, a ratio between the total number of rough subsections and a sum of the total number of rough subsections and the total number of smooth subsections.
However, Morisue teaches: […] determining, with the computing system, a ratio between the total number of rough subsections and a sum of the total number of rough subsections and the total number of smooth subsections. See again [Morisue, pg. 6, lines 20-24], which explains that surface roughness is determined using a ratio of the rough surface and the total surface area and [Morisue, pg. 42, lines 11-28], which further explains that the surface roughness is calculated using height differences and a surface area.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to modify Posselius with Morisue to use a ratio of the rough and smooth areas. Doing so contributes to a low-cost, high-efficiency solution that improves performance [Morisue, pg. 1, lines 31-34] by calculating and achieving a desired surface roughness [Morisue, pg. 4, lines 4-22].
Examiner’s Note: Examiner would like to note that the “total number of rough subsections” and the “sum of the total number of rough and smooth subsections” are being mapped to the “rough surface area” and the “total surface area,” respectively, as recited in Morisue. Furthermore, Examiner would like to note that although Morisue is directed towards semiconductor manufacturing, both inventors faced the same problem of understanding, or determining, surface roughness in order to control parameters for improved performance.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIN MARIE HARTMANN whose telephone number is (571)272-5309. The examiner can normally be reached M-F 7-5.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kito Robinson can be reached at (571) 270-3921. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/E.M.H./Examiner, Art Unit 3664
/KITO R ROBINSON/Supervisory Patent Examiner, Art Unit 3664