Prosecution Insights
Last updated: April 19, 2026
Application No. 18/382,118

ALIGNING A GRAIN CART TO ANOTHER VEHICLE

Non-Final OA — §103
Filed: Oct 20, 2023
Examiner: TAYLOR JR, ANTHONY D
Art Unit: 3747
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: MACDON INDUSTRIES LTD.
OA Round: 3 (Non-Final)
Grant Probability: 74% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 8m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 74% — above average (218 granted / 295 resolved; +3.9% vs TC avg)
Interview Lift: +83.4% — strong (resolved cases with vs. without interview)
Typical Timeline: 2y 8m avg prosecution; 24 applications currently pending
Career History: 319 total applications across all art units
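The headline figures above reduce to simple arithmetic on the counts shown. A minimal sketch, assuming the "+3.9% vs TC avg" badge is a percentage-point difference (the page does not state this explicitly):

```python
# Career allow rate from the counts shown above.
granted = 218
resolved = 295

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 73.9%, displayed as 74%

# Assumption: the "+3.9% vs TC avg" delta is a percentage-point
# difference, which would imply a Tech Center baseline of roughly:
implied_tc_avg = allow_rate - 0.039
print(f"Implied TC 3700 average: {implied_tc_avg:.1%}")  # ~70.0%
```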

Statute-Specific Performance

§101: 1.0% (-39.0% vs TC avg)
§103: 46.0% (+6.0% vs TC avg)
§102: 17.6% (-22.4% vs TC avg)
§112: 34.9% (-5.1% vs TC avg)
Baseline = Tech Center average estimate • Based on career data from 295 resolved cases
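The per-statute deltas can be inverted to recover the baseline they were measured against. A sketch, assuming each "vs TC avg" figure is a simple percentage-point difference:

```python
# Implied Tech Center baseline per statute, assuming each "vs TC avg"
# delta is a percentage-point difference from the examiner's rate.
rates = {
    "§101": (1.0, -39.0),
    "§103": (46.0, +6.0),
    "§102": (17.6, -22.4),
    "§112": (34.9, -5.1),
}

for statute, (examiner_pct, delta_pct) in rates.items():
    implied_avg = examiner_pct - delta_pct
    print(f"{statute}: examiner {examiner_pct:.1f}% vs implied TC avg {implied_avg:.1f}%")
```

With the figures shown, every row's implied baseline works out to 40.0%, which suggests the chart compares each rate against a single Tech Center average rather than per-statute averages.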

Office Action

§103
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 08/19/2025 has been entered.

Response to Arguments

Applicant's arguments with respect to claims 1-12 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the arguments [e.g., in view of applicant's amendments, an updated search was performed that revealed new relevant prior art that is now being relied upon for fairly rendering the claimed invention(s) obvious]; [e.g., with respect to the amended claimed invention(s), the newly cited prior art reference US 20220011444 A1 (Eichhorn) teaches a comparable apparatus, system and method for grain cart-vehicle alignment and control (see title, abstract) using the same (or a substantially similar) combined/correlated monitoring device/camera and ranging device/LiDAR configuration, and includes additional context that at least suggests and/or encompasses the embodiment(s) of the claimed invention(s) in terms of the invention(s) per Eichhorn being configurable and/or functionally capable of steering the grain cart relative to the vehicle in the same (or a substantially similar) manner (e.g., based on multiple sensor fusion systems including cameras and LiDAR disposed along the length of the grain cart, wherein the data from said cameras and LiDAR (or sensor fusion systems) is combined/correlated for the purpose(s) of steering the grain cart so as to be parallel to the vehicle and/or autonomously maintaining a proper separation distance between the grain cart and the vehicle)]; [e.g., the claimed invention(s) at best appear(s) to be with respect to one or more embodiments that are at least suggested and/or encompassed by the teachings/context per Eichhorn (e.g., especially when one of ordinary skill in the art opts to combine the various implementations/embodiments illustrated per at least Fig. 12A-12C and described per paragraph [0112] of Eichhorn to accordingly align and/or maintain alignment between the grain cart and the vehicle such that a parallel spacing and proper separation distance is achieved and maintained)]; [e.g., per Fig. 12A-12C of Eichhorn, compare the combinable configuration of the ranging device/LiDAR sensors 52 located along the length of the grain cart 18, of which are also described as being part of a sensor fusion system including one or more cameras, to the combination of cameras and LiDAR per applicant's Fig. 2 (e.g., observe at least the camera 28 and LiDAR 30 and camera 34 and LiDAR 36, such that said cameras and LiDAR constitute respective sensor fusion systems)]. See detailed rejection below.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

This application includes one or more claim limitations that do not use the word "means", but are nonetheless being interpreted under 35 U.S.C. 112(f), because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: "a monitoring device configured to obtain an …" (see claims 1 and 4); and "a ranging device configured to identify …" (see claims 1 and 4).

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f), it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. A review of the specification provides the following as the corresponding structure for performing the claimed function, and equivalents thereof: the monitoring device appears to be with respect to a camera (see at least paragraph [0064]); and the ranging device appears to be with respect to a LiDAR sensor (see at least paragraph [0064]).

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f), applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-12 are rejected under 35 U.S.C. 103 as being obvious over US 20220011444 A1 (Eichhorn).

Regarding claim 1, Eichhorn (Figures 1-26) [emphasis on Fig. 5-8] teaches a system (10) for controlling a grain cart (18) relative to a vehicle (14, 2) (see Fig. 5-8 in conjunction with paragraphs [0013], [0073]-[0078]), wherein the grain cart includes at least one edge extending between a front edge and a rear edge (see Fig. 5-8) [e.g., observe the various corresponding edges of the grain cart 18 per Fig. 5-8] and the vehicle includes a side edge (2B) extending between a front end and a rear end (see Fig. 5-8 in conjunction with paragraphs [0100]-[0104]), the system comprising:

a monitoring device [e.g., one or more stereo cameras of the sensors 52 in the sensor fusion system 50] configured to obtain an image of the vehicle (see Fig. 5-8 in conjunction with paragraphs [0099]-[0107]) [e.g., "examples of such sensors 52 include LiDAR, structured light sensors, stereo cameras, and time of flight sensors such as flash LiDAR"]; [e.g., a stereo camera constitutes a monitoring device configured to obtain an image of the vehicle];

a controller (22) configured to use the image from the monitoring device to identify a location of the vehicle (see Fig. 5-8 in conjunction with paragraphs [0099]-[0110]) [e.g., "sensor 52 is mounted on the grain cart 18 to determine the location and orientation of the grain truck 14"]; [e.g., observe the ECU 22 with respect to the sensor 52 per Fig. 8]; and

a ranging device [e.g., one or more LiDAR (or multi-dimensional or scanning) sensors of the sensors 52 in the sensor fusion system 50] configured to identify a position and orientation of the side edge of the vehicle relative to the grain cart (see Fig. 5-8 in conjunction with paragraphs [0019], [0099]-[0110], [0130]) [e.g., "one or more multi-dimensional sensors disposed on the grain cart configured to measure an orientation of the one or more grain trucks and relative positions of the one or more grain trucks and grain cart"]; [e.g., "The multi-dimensional distance sensors 52 are mounted to the grain cart 18 such that the sensor field of view E contains some or part of the grain truck 14 and/or trailer 2 as the grain cart 18 approaches the truck 14"]; [e.g., "multi-dimensional sensors 52 detect the grain truck 14 and trailer 2 as a single or multi-dimensional series of points relative to the sensor 52 (on the grain cart 18 tractor or wagon)"],

wherein the controller is further configured to use the position and orientation of the side edge from the ranging device to determine a front distance between the front edge of the grain cart and the side edge (see Fig. 5-8 in conjunction with paragraphs [0099]-[0110]) [e.g., observe at least the broken line extending between the sensor(s) 52 and the side edge 2B per Fig. 6]; [e.g., "the LiDAR sensor 52 is positioned on the cart 18 so that as the cart 18 approaches the truck 14 for unloading, the LiDAR sensor 52 can accurately measure the distance to the side 2B of the truck 14/trailer 2"]; [e.g., "The multi-dimensional distance sensors 52 are mounted to the grain cart 18 such that the sensor field of view E contains some or part of the grain truck 14 and/or trailer 2 as the grain cart 18 approaches the truck 14"]; [e.g., further observe the arrows indicative of communication between the sensor(s) 52 and the ECU 22 per Fig. 8].

Eichhorn fails to explicitly or expressly teach wherein the position and orientation of the side edge of the vehicle relative to the grain cart identified by the ranging device is necessarily based on the location identified by the controller, and wherein the controller is further configured to use the position and orientation of the side edge from the ranging device to determine a rear distance between the rear edge of the grain cart and the side edge, determine a maximum distance between the front distance and the rear distance, determine whether the maximum distance is greater than a maximum threshold, and if the controller determines that the maximum distance is greater than the maximum threshold, the controller is further configured to steer the grain cart to reduce the maximum distance [e.g., in other words, Eichhorn fails to explicitly or expressly teach wherein multiple sensor fusion systems are utilized along the length of the grain cart and such that the data from said sensor fusion systems is combined and/or correlated to achieve the proper, desired, and/or optimal alignment of the grain cart relative to the vehicle].

However, the claim limitation(s) "based on the location identified by the controller" is/are an implicit and/or a straightforward obvious possibility in consideration of the context concerning sensor fusion and/or the sensor fusion system(s) 50 (see Fig. 5-8 in conjunction with paragraphs [0099]-[0110]), such that the disclosed sensor fusion entails and/or encompasses combining and/or correlating data from the one or more LiDAR sensors and the one or more stereo cameras to create a more accurate, complete, and/or reliable understanding of the environment (implicit in view of well-known and/or basic engineering logic/principles concerning sensor fusion in autonomous vehicle systems) [e.g., when utilizing one or more sensor fusion systems that comprise one or more cameras and LiDAR, the possibility that the LiDAR data is correlated with the camera data is an implied and/or straightforward obvious possibility].

Furthermore, when discussing the implementations illustrated per Fig. 12-21, Eichhorn provides that the sensors 52 (of which again, may include both cameras and LiDAR sensors in a sensor fusion system 50) may be provided on a front portion of the grain cart [e.g., see Fig. 12C] and a rear portion relative to the front portion of the grain cart [e.g., see Fig. 12A-12B] with respect to ensuring proper and/or optimal alignment of the grain cart relative to the vehicle, and subsequently provides that the implementations per the respective figures may be utilized individually or in combination (see Fig. 12-21 in conjunction with paragraphs [0009], [0078], [0093]-[0094], [0099]-[0112]) [e.g., acknowledging that multiple distances constitute measurements, note the recitation "in certain implementations, the disclosed systems, methods and devices sense configurations, dimensions, and/or measurements of the grain truck(s), certain non-limiting examples being the length, height, and/or width of the truck grain box, such as, for example, on approach. This sensing of at least one configuration, dimension, or measurement is useful to help determine the best location to position the grain cart to load into a specified location of the grain truck, such as the center of the receiving grain truck box, and to accurately position and move the grain cart along the length of the receiving grain truck, as would be appreciated"]; [e.g., note the recitation "In addition to left/right steering control, the described guidance system 10 may also include speed control, gear control, direction control, that is forward/reverse, and other automatic steering controls as would be appreciated. With speed control, the speed of the grain cart 18 tractor or other towing vehicle could be controlled so that distribution of the grain in the grain truck 14 follows an optimal, or user-defined pattern. Direction control would allow the guidance system 10 to move the grain cart 18 in reverse, allowing distribution of the grain into the grain truck 14", such that the claimed functional capability of the grain cart to reduce one or more of a distance and an angle based on a threshold distance and/or angle is described and/or at least suggested (e.g., to obtain and maintain an alignment and/or a proper separation distance defined between the vehicle and the grain cart, one or more distances and/or one or more angles must be capable of being adjusted via the steering of the grain cart relative to the vehicle), and such that the subject matter not explicitly or expressly taught by Eichhorn merely pertains to an obvious and/or non-inventive controlling of the grain cart relative to the vehicle to maintain an optimal, desired, and/or user-defined separation distance]; [e.g., further note the recitation "The guidance line B is then automatically transferred to the grain cart 18 guidance system. In various implementations, the guidance line B can include more than just the parallel path next to the truck 14/trailer 2. For example, the guidance line B can also include a planned path from the current location of the grain cart 18 to the optimal aligned position. This cloud-based approach may also be used to guide an autonomous (i.e. remote or computer-operated) grain carts 18", such that the subject matter not explicitly or expressly taught by Eichhorn merely pertains to an obvious and/or non-inventive alternative planned path from the current location of the grain cart to a desired and/or optimal aligned position]; [e.g., "Line or plane detection algorithms can detect the long side 2B of the grain truck 14, that is, the side 2B of the grain truck 14 that the grain cart 18 should drive parallel to at the proper separation distance to achieve onloading (or grain transfer)"]; [e.g., "As illustrated in FIGS. 12-21, it is appreciated that there can be additional challenges that may be encountered during execution of the steps/processes/methods described above. It is understood that several example implementations are discussed, and that each may be utilized individually or in combination by system 10 and grain truck 14/grain cart 18 component configurations discussed above to plot guidance lines B for approach, as would be readily understood"]; [e.g., the guidance lines will necessarily involve and/or encompass autonomously steering the grain cart 18 based on distance(s) and/or angle(s) between the grain cart and the vehicle and adjusting said distance(s) and/or angle(s) between the grain cart and the vehicle as required in order to achieve and maintain the parallel alignment and/or the proper separation distance between the grain cart and the vehicle].
As such, it would have been obvious to one of ordinary skill in the art and/or merely involve routine skill in the art to accordingly utilize and/or implement additional sensors 52 (or an additional sensor fusion system 50) on a rear portion relative to the front portion of the grain cart such that the grain cart may be accordingly and/or autonomously steered to an optimal aligned position as a function of correlated distances and/or associated angles as a modification (or an alternative) in the system(s) per Eichhorn [e.g., similar to the configuration of the grain cart 10 per applicant’s Fig. 2 of which includes camera 28 and LiDAR 30 on a front portion and camera 34 and LiDAR 36 on a rear portion relative to the front portion of the grain cart, using both of the implementations of the sensors 52 per Fig. 12A-12C of Eichhorn]; [e.g., such that multiple sensor fusion systems are utilized and such that the data from said sensor fusion systems is combined and/or correlated to achieve the proper, desired, and/or optimal alignment of the grain cart relative to the vehicle and/or maintain the parallel alignment and/or proper separation distance between the vehicle and the grain cart], as suggested by Eichhorn, in order to further improve the accuracy of the position and orientation determination(s) for achieving the proper, desired, and/or optimal alignment of the grain cart relative to the vehicle (implicit in view of well-known and/or basic engineering logic/principles concerning the use of multiple sensor fusion systems as opposed to a single sensor fusion system in autonomous vehicle systems) [e.g., one of ordinary skill in the art readily understands that providing additional sensor fusion systems comprising one or more cameras and one or more LiDAR sensors will result in improved object detection accuracy, mapping, etc. compared to just providing a single sensor fusion system]. 
Regarding claims 2-12, Eichhorn teaches the invention as claimed and as discussed above. Noting that the discussion above regarding the independent claim 1 is similarly applicable to the subject matter of the independent claims 4, 7 and 10 with respect to the claimed vehicle, grain cart, and respective monitoring and ranging devices [e.g., the subject matter of claims 4, 7 and 10 pertaining to the monitoring device and ranging device with respect to the vehicle and the grain cart is commensurate with the subject matter of claim 1 and/or similarly addressed via the discussion above regarding claim 1], Eichhorn similarly fails to explicitly or expressly teach wherein the controller is configured to steer the grain cart relative to the vehicle based on minimum and maximum distances and/or a maximum angle between the grain cart and the vehicle as provided per claims 2-12.

However, the aforementioned discussion/rationale(s) as discussed above regarding claim 1 is/are similarly applicable to the system(s)/method(s) and/or the subject matter of claims 2-12 pertaining to the control means and/or method steps via which the grain cart is steered and/or controlled (which for the sake of brevity, have been summarized below) [e.g., the steering of the grain cart based on a minimum distance, minimum threshold, and/or various angle(s) between the grain cart and the vehicle as claimed merely relates to utilizing one or more additional cameras and LiDAR along the length of the grain cart, and combining/correlating the data from respective sensor fusion systems, so as to enable the grain cart to be accordingly steered in such a way that maintains a proper separation distance from the vehicle and such that a parallel alignment between the vehicle and the grain cart is maintained]; [e.g., similar to the configuration of the grain cart 10 per applicant's Fig. 2 of which includes camera 28 and LiDAR 30 on a front portion and camera 34 and LiDAR 36 on a rear portion relative to the front portion of the grain cart, using both of the implementations of the sensors 52 per Fig. 12A-12C of Eichhorn in combination, is suggested via at least paragraph [0112] of Eichhorn]; [e.g., "As illustrated in FIGS. 12-21, it is appreciated that there can be additional challenges that may be encountered during execution of the steps/processes/methods described above. It is understood that several example implementations are discussed, and that each may be utilized individually or in combination by system 10 and grain truck 14/grain cart 18 component configurations discussed above to plot guidance lines B for approach, as would be readily understood"]; [e.g., the claimed invention(s) at best appear(s) to be with respect to one or more embodiments that are at least suggested and/or encompassed by the teachings/context per Eichhorn (e.g., especially when one of ordinary skill in the art opts to combine the various implementations/embodiments illustrated per at least Fig. 12A-12C and described per paragraph [0112] of Eichhorn to accordingly align and/or maintain alignment between the grain cart and the vehicle such that a parallel spacing and proper separation distance is achieved and maintained)]; [e.g., per Fig. 12A-12C of Eichhorn, compare the combinable configuration of the ranging device/LiDAR sensors 52 located along the length of the grain cart 18, of which are also described as being part of a sensor fusion system including one or more cameras, to the combination of cameras and LiDAR per applicant's Fig. 2 (e.g., observe at least the camera 28 and LiDAR 30 and camera 34 and LiDAR 36, such that said cameras and LiDAR constitute respective sensor fusion systems)]; [e.g., the claimed invention(s) is/are an obvious and/or non-inventive application and/or combination that may be derived from the collective teachings/context per Eichhorn, such that multiple sensor fusion systems are utilized and such that the data from said sensor fusion systems is combined and/or correlated to achieve the proper, desired, and/or optimal alignment of the grain cart relative to the vehicle and/or maintain the parallel alignment and/or proper separation distance between the vehicle and the grain cart]. Also refer to discussion/rationale(s) as discussed above regarding claim 1.

Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANTHONY D TAYLOR JR whose telephone number is (469)295-9192. The examiner can normally be reached Mon-Fri 9a-5p (central time).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Logan Kraft, can be reached at 571-270-5065. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ANTHONY DONALD TAYLOR JR./
Examiner, Art Unit 3747

/KURT PHILIP LIETHEN/
Primary Examiner, Art Unit 3747

Prosecution Timeline

Oct 20, 2023 — Application Filed
Feb 08, 2025 — Non-Final Rejection — §103
May 01, 2025 — Response Filed
May 10, 2025 — Final Rejection — §103
Aug 21, 2025 — Response after Non-Final Action
Aug 21, 2025 — Request for Continued Examination
Dec 13, 2025 — Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12583492 — TRANSPORT CART SYSTEM
2y 5m to grant; granted Mar 24, 2026
Patent 12565260 — DRIVER EVASIVE STEERING INTENT DETECTION IN VEHICLES
2y 5m to grant; granted Mar 03, 2026
Patent 12554268 — ROBOT CONTROL METHOD AND APPARATUS, ROBOT, COMPUTER-READABLE STORAGE MEDIUM, AND COMPUTER PROGRAM PRODUCT
2y 5m to grant; granted Feb 17, 2026
Patent 12540608 — FUEL PUMP AND DAMPER CUP THEREOF
2y 5m to grant; granted Feb 03, 2026
Patent 12539842 — Parking Assist System and Parking Assist Method
2y 5m to grant; granted Feb 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 74%
With Interview: 99% (+83.4% lift)
Median Time to Grant: 2y 8m
PTA Risk: High
Based on 295 resolved cases by this examiner. Grant probability derived from career allow rate.
