DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-10 of US Application No. 18/656,510 are currently pending and have been examined. Applicant amended claims 1-10.
Response to Arguments/Amendments
The previous rejections of claims 2-8 under 35 U.S.C. 112(b) as indefinite are withdrawn in consideration of amended claims 2-8.
The previous rejection of claim 9 under 35 U.S.C. 101 as directed to non-statutory subject matter is withdrawn in consideration of amended claim 9.
The previous rejections of claims 1, 4, and 9-10 under 35 U.S.C. 102 are withdrawn. However, new rejections under § 102 are set forth below.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 2, 4-6, and 8-10 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Nakabayashi et al. (US 12,324,370 B2, “Nakabayashi”).
Regarding claims 1, 9, and 10, Nakabayashi discloses a harvester and teaches:
enabling, by a processor, a detection function capable of detecting the one or more detection targets based on a first value of a first parameter associated with the target route during travel of the work-vehicle (detection modules 41-44 have operational states corresponding to a detection state for effecting detection, a stopped state not effecting any detection, a wide range detection state for effecting detection over a detection range wider than a predetermined range, a narrow range detection state for effecting detection over a detection range equal to or narrower than the predetermined range, a high sensitivity state in which the detection sensitivity is set higher than a predetermined value, and a low sensitivity state in which the detection sensitivity is set equal to or lower than the predetermined value – see at least 12:12-28; front detection module 41 may be enabled/disabled based on the state of the crop, i.e., reaped/un-reaped – see at least Fig. 5 and 12:42-67; e.g., the first parameter is the state of the crop in front of the vehicle);
disabling, by the processor, the detection function based on a second value of the first parameter associated with the target route during travel by the work-vehicle (detection modules 41-44 have operational states corresponding to a detection state for effecting detection, a stopped state not effecting any detection, a wide range detection state for effecting detection over a detection range wider than a predetermined range, a narrow range detection state for effecting detection over a detection range equal to or narrower than the predetermined range, a high sensitivity state in which the detection sensitivity is set higher than a predetermined value, and a low sensitivity state in which the detection sensitivity is set equal to or lower than the predetermined value – see at least 12:12-28; front detection module 41 may be enabled/disabled based on the state of the crop, i.e., reaped/un-reaped – see at least Fig. 5 and 12:42-67); and
initiating predetermined processing based on a current state of the detection function (the detection modules detect objects based on their operational modes – see at least 2:30-39).
Regarding claim 2, Nakabayashi further teaches:
enabling the detection function when the first value is indicative of a work-completed work route where a predetermined work on a work object has been performed (detection modules directed toward reaped, i.e., worked, land may be enabled – see at least Fig. 5 and 12:42-67), and disabling the detection function when the second value is indicative of an unworked work route (detection modules directed toward un-reaped, i.e., un-worked, land may be disabled – see at least Fig. 5 and 12:42-67).
Regarding claim 4, Nakabayashi further teaches:
wherein
the target route includes a current work route on which a predetermined work is performed for a work object (combine 10 effects an automated traveling along a traveling route – see at least Fig. 2 and 9:58-64),
the detection unit comprises a plurality of detection sensors, including a front detection sensor for detecting the one or more detection targets in front of the work vehicle, and at least one side detection sensor for detecting the one or more detection targets on at least one corresponding side of the work route (detection unit 40 includes front side detection module 41, rear side detection module 42, left side detection module 43, and right side detection module 44 – see at least Fig. 4 and 11:55-12:4), and
enabling the detection function comprises selectively enabling one or more of: the front detection sensor, or the at least one side detection sensor based on corresponding first values of one or more second parameters associated with the target route (detection modules 41-44 have operational states corresponding to a detection state for effecting detection, a stopped state not effecting any detection, a wide range detection state for effecting detection over a detection range wider than a predetermined range, a narrow range detection state for effecting detection over a detection range equal to or narrower than the predetermined range, a high sensitivity state in which the detection sensitivity is set higher than a predetermined value, and a low sensitivity state in which the detection sensitivity is set equal to or lower than the predetermined value – see at least 12:12-28; left/right detection modules 43, 44 may be enabled/disabled based on the state of the crop, i.e., reaped/un-reaped – see at least Fig. 5 and 12:42-67; e.g., the second parameter is the state of the crop in the row next to the vehicle); and
disabling the detection function comprises selectively disabling one or more of: the front detection sensor, or the at least one side detection sensor based on corresponding second values of one or more second parameters associated with the target route (detection modules 41-44 have operational states corresponding to a detection state for effecting detection, a stopped state not effecting any detection, a wide range detection state for effecting detection over a detection range wider than a predetermined range, a narrow range detection state for effecting detection over a detection range equal to or narrower than the predetermined range, a high sensitivity state in which the detection sensitivity is set higher than a predetermined value, and a low sensitivity state in which the detection sensitivity is set equal to or lower than the predetermined value – see at least 12:12-28; left/right detection modules 43, 44 may be enabled/disabled based on the state of the crop, i.e., reaped/un-reaped – see at least Fig. 5 and 12:42-67; e.g., the second parameter is the state of the crop in the row next to the vehicle).
Regarding claim 5, Nakabayashi further teaches:
wherein selectively disabling the detection function comprises
selectively disabling the at least one side detection sensor, when the corresponding second values of the second parameter indicate presence of an unworked work route adjacent to the current work route on which the work vehicle travels (left/right detection modules 43, 44 may be disabled based on the state of the crop, i.e., reaped/un-reaped – see at least Fig. 5 and 12:42-67),
and selectively enabling the detection function comprises selectively enabling the at least one side detection sensor when the corresponding first values of the second parameter indicate presence of a work-completed work route adjacent to the current work route on which the work vehicle travels (left/right detection modules 43, 44 may be enabled based on the state of the crop, i.e., reaped/un-reaped – see at least Fig. 5 and 12:42-67).
Regarding claim 6, Nakabayashi further teaches:
wherein selectively disabling the detection function comprises
selectively disabling at least one first detection sensor of the plurality of detection sensors, wherein the at least one disabled first detection sensor corresponds to a first predetermined range that includes the work object when the work vehicle travels on the unworked work route (operational state may include a wide/narrow range – see at least 12:13-28, 13:34-54), and
selectively enabling the detection function comprises selectively enabling at least one second detection sensor of the plurality of detection sensors, wherein the at least one enabled second detection sensor corresponds to a second predetermined range that does not include the work object (operational state may include a wide/narrow range – see at least 12:13-28, 13:34-54).
Regarding claim 8, Nakabayashi further teaches:
executing the predetermined processing for the detection target when the detection function is enabled and the detection target is detected (objects are detected when detection modules are enabled based on the state of the crop – see at least Fig. 5 and 12:42-67; the map indicates where crops are reaped/not yet reaped, i.e., reaped/unreaped targets are detected by the vehicle's position in the map – see at least 2:20-29).
Allowable Subject Matter
Claims 3 and 7 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Matsuzaki (US 2024/0345253 A1).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AARON L TROOST whose telephone number is (571)270-5779. The examiner can normally be reached Mon-Fri 7:30am-4pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Antonucci can be reached at 313-446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AARON L TROOST/Primary Examiner, Art Unit 3666