DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-12 and 14-16 are rejected under 35 U.S.C. 103 as being unpatentable over Jiang (US 2021/03050576) in view of Newman (US 2024/0221312).
As to claim 1 Jiang discloses a computer implemented method for monitoring the operational use of an agricultural machine, the method comprising:
obtaining raw image data from a stereo camera system of or otherwise associated with the agricultural machine (Paragraph 72 “As discussed above, the processing component 1310 may convert the raw images 1304 and 1306 into depth maps 1308 and confidence maps 1314.”);
processing the raw image data through application of an image rectification to generate a rectified image data set (Paragraph 72 “Raw images 1304 from the first camera may be rectified by a rectification engine 1416. Raw images 1306 from the second camera may be rectified by a rectification engine 1418.”);
applying a stereo-matching algorithm on the rectified image data set to generate a disparity map (Paragraph 75 “The stereo correspondence engine 1420 may determine a disparity between matching pixels in the images 1430, 1432 (e.g., the images 1430, 1432 may be left and right images), may compute a disparity map, may compute and output the depth map 1308 from the disparity map, and may compute and output the confidence map 1314.”); and
controlling operation of one or more operable components of or otherwise associated with the agricultural machine in dependence on the generated disparity map (Paragraph 64 “FIG. 11 shows two stereo vision systems mounted on a combine harvester 1100, according to some embodiments of the present technology. A forward-facing stereo vision system may include a forward camera 1102 and a forward camera 1104, which may be used, e.g., to obtain information for measuring an amount of crop ahead to be encountered the harvester 1100, to enable an informed control a throttle of the harvester 1110 and therefore maximize a feed rate.”).
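For illustration only (not part of the record of either reference), the claimed sequence of rectified images, stereo matching along scanlines, and disparity-map generation can be sketched as a minimal sum-of-absolute-differences block matcher. The synthetic image pair, window size, and disparity range below are illustrative assumptions, not drawn from Jiang or Newman; once rectified, corresponding pixels lie on the same scanline, so the search is a one-dimensional horizontal scan.

```python
import numpy as np

def disparity_map(left, right, max_disp=16, win=5):
    """Naive SAD block matching over a rectified stereo pair.

    Assumes epipolar lines are horizontal, so the corresponding
    pixel for left[y, x] is right[y, x - d] for some disparity d.
    """
    h, w = left.shape
    half = win // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_cost, best_d = np.inf, 0
            for d in range(max_disp):
                cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
                cost = np.abs(patch.astype(int) - cand.astype(int)).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Synthetic rectified pair: the right view is the left view
# shifted by a known disparity of 7 pixels.
rng = np.random.default_rng(0)
left = rng.integers(0, 255, (40, 60), dtype=np.uint8)
true_d = 7
right = np.roll(left, -true_d, axis=1)

d = disparity_map(left, right)
print(int(np.median(d[10:30, 30:50])))  # → 7 over the interior region
```

A depth map of the kind Jiang's paragraph 75 describes would then follow from the standard relation Z = f·B/d (focal length times baseline over disparity) applied per pixel.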
Jiang does not explicitly disclose wherein the image rectification comprises an epipolar rectification.
Newman teaches wherein the image rectification comprises an epipolar rectification (Paragraph 113 “Epipolar lines for the left image, for example, 612 and for the right image, for example, 613 may depend on the relative positions of the two viewpoints from which the camera or cameras were positioned in the scene at the precise time(s) of capture. The epipolar lines for images 610 and 611 may be determined from the intrinsic camera parameters and extrinsic camera parameters determined during a process of camera calibration and known movement (if any) of the cameras. Camera calibration techniques, both for multiple cameras at fixed positions on a platform as well as for cameras that move over time, are known in the art. An image warp may then be used to transform image data so that epipolar lines become horizontal, and the image data along horizontal scanlines may be, therefore, in a more useful arrangement for subsequent processing (i.e., horizontal shifts and offsets may be how pixel data is processed in subsequent processing stages). FIG. 6 C shows the result of epipolar warping on the image of FIG. 6 A. And, FIG. 6 D shows the result of epipolar warping on the image of FIG. 6 B. For example, in FIG. 6 C image 620 shown, with epipolar lines including 622 now horizontal and image 621 (FIG. 6 D) also with epipolar lines such as 623. Further, warped images 620 and 621 may be generated so that epipolar lines 622 and 623, where from the same, or substantially the same, epipolar plane, may be stored in same, or substantially same, row position in their respective images.”). It would have been obvious to one of ordinary skill in the art to modify Jiang to include the teachings of Newman, using epipolar rectification for the purpose of speeding up the search for corresponding points in the images.
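For context on the teaching relied upon (again, an explanatory sketch only, not part of either reference), epipolar rectification re-orients a calibrated camera pair so that the new x-axis lies along the stereo baseline; after the warp, corresponding points share a scanline. The minimal example below assumes identity intrinsics and a Fusiello-style rectifying rotation; the camera centre and test points are arbitrary assumed values.

```python
import numpy as np

def rectifying_rotation(C2):
    """Rotation whose x-axis is the stereo baseline.

    Re-orienting both cameras by this rotation makes corresponding
    image points share the same y-coordinate, i.e. epipolar lines
    become horizontal scanlines.
    """
    r1 = C2 / np.linalg.norm(C2)                # new x-axis: along the baseline
    r2 = np.cross([0.0, 0.0, 1.0], r1)
    r2 /= np.linalg.norm(r2)                    # new y-axis: perpendicular to baseline and old z
    r3 = np.cross(r1, r2)                       # new z-axis completes the right-handed frame
    return np.stack([r1, r2, r3])

def project(R, C, X):
    """Pinhole projection with identity intrinsics: x = R (X - C)."""
    p = R @ (X - C)
    return p[:2] / p[2]

C2 = np.array([0.3, 0.05, 0.02])                # assumed right-camera centre (baseline)
R_rect = rectifying_rotation(C2)

rng = np.random.default_rng(1)
pts = rng.uniform([-1, -1, 4], [1, 1, 8], (5, 3))  # scene points in front of the rig
for X in pts:
    y_left = project(R_rect, np.zeros(3), X)[1]
    y_right = project(R_rect, C2, X)[1]
    assert abs(y_left - y_right) < 1e-12        # same scanline after rectification
```

This is why rectification speeds up stereo correspondence: the 2-D search for a matching pixel collapses to a 1-D search along one image row.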
As to claim 2 Newman teaches a method wherein the epipolar rectification comprises an equidistant epipolar rectification (Paragraph 113).
As to claim 3 Newman teaches a method wherein the epipolar rectification is formulated in dependence on one or more characteristics of the stereo camera system (Paragraph 79).
As to claim 4 Jiang teaches a method wherein the one or more characteristics comprise an orientation or relative arrangement of the stereo camera system (Paragraph 73).
As to claim 5 Newman teaches a method wherein the epipolar rectification comprises a remapping of the raw image data which aligns epipolar lines in the raw image data in dependence on an axis relative to the orientation and/or relative arrangement of the stereo camera system (Paragraph 113).
As to claim 6 Newman teaches a method comprising analyzing the generated disparity map and/or a generated 3D representation therefrom to identify one or more operational parameters associated with the agricultural machine (Paragraph 64).
As to claim 7 Jiang discloses a method comprising determining a relative position of one or more operable components of the machine and/or implements operably coupled thereto (Fig. 11, Paragraph 64).
As to claim 8 Jiang discloses a method comprising comparing the determined position with an expected position from respective operational settings for the machine (Paragraph 66).
As to claim 9 Jiang discloses a method comprising determining a relative position of an unloading auger of the machine or a coupled implement with respect to the machine, an associated machine or vehicle, or an environment (Fig. 11, Paragraph 64).
As to claim 10 Jiang discloses a method comprising utilizing the generated disparity map to identify a relative position of one or more objects, vehicles, machines or the like in the operating environment of the agricultural machine (Paragraph 75).
As to claim 11 Jiang discloses a method comprising identifying, from the generated disparity map, the relative position of a cooperative machine within the working environment of the agricultural machine (Paragraph 64).
As to claim 12 Jiang discloses a method comprising determining the relative position of a collection vehicle into which the agricultural machine unloads crop material, in use (Paragraph 64).
As to claim 14 Jiang discloses a control system comprising one or more controllers configured to perform the method of claim 1 (Paragraph 67).
As to claim 15 Jiang discloses a stereo camera system comprising a pair of cameras; and wherein the system further comprises or is controllable under operation of the control system of claim 14 (Paragraph 64).
As to claim 16 Jiang discloses an agricultural machine comprising the stereo camera system of claim 15 (Paragraph 64).
Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Jiang (US 2021/03050576) in view of Newman (US 2024/0221312) as applied to claim 1 above, and further in view of Pastucha (US 2024/0224873).
As to claim 13 Pastucha teaches a method comprising at least partly automating an unloading operation of crop material from the agricultural machine to the collection vehicle in dependence on the determined position, including controlling operable parameters of an unloading auger of the machine, including one or more of an operational speed of components thereof, and/or an operating position with respect to the collection vehicle (Paragraph 54). It would have been obvious to one of ordinary skill in the art to modify Jiang to include the teachings of automating the unloading of the crop material from the harvesting machine for the purpose of improving efficiency in unloading the crop material from the agricultural machine.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IMRAN K MUSTAFA whose telephone number is (571)270-1471. The examiner can normally be reached Mon-Fri 9-5.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James J Lee can be reached at 571-270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
IMRAN K. MUSTAFA
Primary Examiner
Art Unit 3668
/IMRAN K MUSTAFA/ Primary Examiner, Art Unit 3668
11/10/2025