DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
2. This Office action is in response to the amendments and arguments filed on 08/21/2025 in application number 18/062,133, filed on 12/06/2022.
Claim 1 has been amended.
No claims have been added.
Claim 2 has been cancelled.
Claims 1 and 3-5 are currently pending and have been examined.
Priority
3. Acknowledgment is made of applicant's claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy of priority Application No. JP2022-021600, filed on 02/15/2022, has been filed.
Information Disclosure Statement
4. The information disclosure statements (IDS) submitted on 12/06/2022, 06/26/2024, and 06/05/2025 have been received and considered.
Response to Amendment
5. Applicant's amendments to the claims have not overcome the rejection previously set forth in the Non-Final Office Action mailed 06/30/2025. Applicant's arguments, see pages 5-6, filed on 08/21/2025, with respect to the rejection of claims 1-5 under 35 U.S.C. 103 are not persuasive, and the rejection stands. The same grounds of rejection are maintained under 35 U.S.C. 103, as necessitated by amendment, over Wykman (US 20180213731 A1) in view of Haneda (US 20190320580 A1) and further in view of Ellaboudy (US 20210000006 A1). Specifically, Haneda does teach […] a detection unit that detects a portion covered with plants on a preset traveling road before mowing begins (Haneda Paragraph 0070: "For example, the receiving section 310 receives image data") (Haneda Paragraph 0070: "The above-mentioned image data may be image data of a work target (for example, plants such as lawn grasses or weeds) of the lawn mower 230.") (Haneda Paragraph 0180: "For example, if the image acquired from the receiving section 310 is an image of lawn grasses present in the forward direction in terms of a course of the lawn mower 230, the lawn recognizing section 430 determines that the image is an image before lawn mowing."). Ellaboudy teaches […] wherein the detection unit is configured to: acquire RGB image data including color information and ortho image data, the ortho image data being data corresponding to an imaging area of the RGB image data and including point cloud data (Ellaboudy Paragraph 0054: "For example, the one or more distance sensors may include a lidar sensor, a radar sensor, a sonar sensor, and/or a structured light sensor. For example, sensor data captured using the one or more distance sensors 146 may include a three-dimensional point cloud") (Ellaboudy Paragraph 0132: "For example, the image data may include RGB images and/or normalized difference vegetation index data.") (Ellaboudy Paragraph 0196: "FIG. 16 depicts the hardware and software components of the tractor perception system 1600. The perception system hardware is comprised of a 3-D lidar sensor 1630 and Color Camera sensor 1640 that are mounted to the front of the tractor."). Ellaboudy further teaches identify the plants based on the color information in the RGB image data; and identify height of the plants based on height information of the point cloud data in the ortho image data to detect the portion covered with the plants (Ellaboudy Paragraph 0118: "For example, filtering 710 the point cloud data may include cropping the point cloud data to the zone of interest. For example, the zone of interest may limited to a range of heights") (Ellaboudy Paragraph 0132: "For example, the image data may include RGB images and/or normalized difference vegetation index data.") (Ellaboudy Paragraph 0133: "The process 900 includes detecting 920 the one or more plants based on the image data. For example, computer vision processing (e.g., using a convolutional neural network) may be implemented to detect and/or classify the one or more plants. In some implementations, point cloud data from a distance sensor (e.g., a lidar sensor) may also be used to help detect and/or classify the one or more plants.") (Ellaboudy Paragraph 0158: "For example, the vehicle control systems described herein may be used with vehicles to do jobs that interact with land and crops such as spraying, mowing,"). (Note: A lidar sensor inherently detects the height of an object, and the image data provides the color of the plants.)
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
6. Claims 1 and 3-5 are rejected under 35 U.S.C. 103 as being unpatentable over Wykman et al. (US 20180213731 A1) (hereinafter Wykman) in view of Haneda et al. (US 20190320580 A1) (hereinafter Haneda), and further in view of Ellaboudy et al. (US 20210000006 A1) (hereinafter Ellaboudy).
Regarding claim 1, Wykman discloses A mower comprising: a mower body including a
cutting blade portion for cutting grass; (Wykman Paragraph 0066: “Accordingly, movement
of the robotic mower 15 over the parcel 200 may be controlled by the control circuitry 12 in
a manner that enables the robotic mower 15 to systematically traverse the parcel while
operating a cutting blade to cut the grass on the parcel 200.”) a first position estimation unit
that estimates a self position of the mower body using a satellite positioning system; (Wykman
Paragraph 0101: “As such, the GPS receiver 852 may employ satellite based positioning in
conjunction with GPS, GLONASS, Galileo, GNSS, and/or the like to enhance accuracy of
the GPS receiver 852.”) a second position estimation unit that estimates the self position of the
mower body using a sensor unit provided on the mower body; (Wykman Paragraph 0064:
“The robotic mower 15 may be controlled, at least in part, via control circuitry 12 located
onboard. The control circuitry 12 may include, among other things, a positioning module 780 and a sensor network 30, which will be described in greater detail below.”) (Wykman Paragraph 0065: “If a sensor network 30 is employed, the sensor network 30 may include sensors related to positional determination (e.g., a GPS receiver, an accelerometer, a camera, an inertial measurement unit and/or the like).”) […] and an automatic travel control unit that causes the mower body to travel on the traveling road with reference to the self position estimated by the first position estimation unit, (Wykman Paragraph 0094: “As indicated above, the robotic mower 15 may also be configured to utilize the sensor network 30 and modules described above to engage in other functions indicative of intelligent vehicle autonomy.”) (Wykman Paragraph 0101: “Accordingly, inertial navigation systems may require a periodic position correction, which may be accomplished by getting a position fix from another more accurate method or by fixing a position of the robotic mower 15 relative to a known location. For example, navigation conducted via the IMU 850 may be used for robotic mower 15 operation for a period of time, and then a correction may be inserted when a GPS fix is obtained on robotic mower position”) and that causes the mower body to travel by switching so as to refer to the self position estimated by the second position estimation unit in the portion covered with the plants detected by the detection unit. (Wykman Paragraph 0079: “Alternatively or additionally, the applications may include applications for controlling the robotic mower 15 relative to various operations including determining lawn conditions via water stress, nutrient stress, weed infestations, and/or pest infestations (e.g., using one or more sensors of the vegetation analyzer 770).”)
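(Note: For illustration only, the following is a minimal sketch of the switching behavior recited in claim 1, referring to the self position from the first position estimation unit (satellite positioning) in open portions and from the second position estimation unit (onboard sensors) in plant-covered portions. All names, types, and the switching condition below are hypothetical and are not drawn from Wykman or the application.)

```python
# Minimal sketch of the claimed switching logic; hypothetical names only,
# not the applicant's or Wykman's actual implementation.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float


def select_self_position(gnss_pose: Pose,
                         sensor_pose: Pose,
                         in_plant_covered_portion: bool) -> Pose:
    """Return the self-position estimate the travel controller refers to."""
    # In open portions, satellite positioning is the reference; in portions
    # detected as covered with plants (where satellite reception may
    # degrade), switch to the onboard-sensor estimate (e.g., lidar/IMU).
    return sensor_pose if in_plant_covered_portion else gnss_pose


if __name__ == "__main__":
    gnss = Pose(10.0, 5.0)
    odom = Pose(10.2, 5.1)
    print(select_self_position(gnss, odom, in_plant_covered_portion=False))  # satellite
    print(select_self_position(gnss, odom, in_plant_covered_portion=True))   # onboard sensors
```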
Wykman does not teach […] a detection unit that detects a portion covered with plants on a preset traveling road before mowing begins; wherein the detection unit is configured to: acquire RGB image data including color information and ortho image data, the ortho image data being data corresponding to an imaging area of the RGB image data and including point cloud data; identify the plants based on the color information in the RGB image data; and identify height of the plants based on height information of the point cloud data in the ortho image data to detect the portion covered with the plants;
However, Haneda does teach […] a detection unit that detects a portion covered with plants on a preset traveling road before mowing begins; (Haneda Paragraph 0070: "For example, the receiving section 310 receives image data") (Haneda Paragraph 0070: "The above-mentioned image data may be image data of a work target (for example, plants such as lawn grasses or weeds) of the lawn mower 230.") (Haneda Paragraph 0180: "For example, if the image acquired from the receiving section 310 is an image of lawn grasses present in the forward direction in terms of a course of the lawn mower 230, the lawn recognizing section 430 determines that the image is an image before lawn mowing.")
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Wykman to include […] a detection unit that detects a portion covered with plants on a preset traveling road before mowing begins, as taught by Haneda. This would have been for the benefit of providing a work machine 100 that includes an autonomous travelling section 110, a cutting section 120, an image-capturing section 130 and a control device 140, the control device 140 including, for example, a judging section 142 and a control section 144. [Haneda Paragraph 0020]
Haneda does not teach […] wherein the detection unit is configured to: acquire RGB image data including color information and ortho image data, the ortho image data being data corresponding to an imaging area of the RGB image data and including point cloud data; identify the plants based on the color information in the RGB image data; and identify height of the plants based on height information of the point cloud data in the ortho image data to detect the portion covered with the plants;
However, Ellaboudy does teach […] wherein the detection unit is configured to: acquire RGB image data including color information and ortho image data, the ortho image data being data corresponding to an imaging area of the RGB image data and including point cloud data; (Ellaboudy Paragraph 0054: “For example, the one or more distance sensors may include a lidar sensor, a radar sensor, a sonar sensor, and/or a structured light sensor. For example, sensor data captured using the one or more distance sensors 146 may include a three-dimensional point cloud”) (Ellaboudy Paragraph 0132: “For example, the image data may include RGB images and/or normalized difference vegetation index data.”) (Ellaboudy Paragraph 0196: “FIG. 16 depicts the hardware and software components of the tractor perception system 1600. The perception system hardware is comprised of a 3-D lidar sensor 1630 and Color Camera sensor 1640 that are mounted to the front of the tractor.”) identify the plants based on the color information in the RGB image data; and identify height of the plants based on height information of the point cloud data in the ortho image data to detect the portion covered with the plants; (Ellaboudy Paragraph 0118: “For example, filtering 710 the point cloud data may include cropping the point cloud data to the zone of interest. For example, the zone of interest may limited to a range of heights”) (Ellaboudy Paragraph 0132: “For example, the image data may include RGB images and/or normalized difference vegetation index data.”) (Ellaboudy Paragraph 0133: “The process 900 includes detecting 920 the one or more plants based on the image data. For example, computer vision processing (e.g., using a convolutional neural network) may be implemented to detect and/or classify the one or more plants. In some implementations, point cloud data from a distance sensor (e.g., a lidar sensor) may also be used to help detect and/or classify the one or more plants.”) (Ellaboudy Paragraph 0158: “For example, the vehicle control systems described herein may be used with vehicles to do jobs that interact with land and crops such as spraying, mowing,”) (Note: Lidar sensor automatically detects the height of an object.) (Note: Image data gets the color of the plants)
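(Note: For illustration only, the following minimal sketch models the detection steps mapped above: identifying plants from the color information in RGB image data and identifying plant height by cropping point-cloud heights to a range of interest (cf. Ellaboudy Paragraphs 0118 and 0132-0133). All function names, thresholds, and data layouts below are hypothetical and are not drawn from Ellaboudy.)

```python
# Minimal sketch of color- plus height-based plant detection; hypothetical
# names and thresholds only, not code from the cited references.
import numpy as np


def detect_plant_covered_portion(rgb: np.ndarray,
                                 heights: np.ndarray,
                                 min_h: float = 0.05,
                                 max_h: float = 1.0,
                                 green_margin: int = 20,
                                 cover_ratio: float = 0.3) -> bool:
    """rgb: (H, W, 3) uint8 image; heights: (H, W) per-cell heights in
    meters, rasterized from the ortho point cloud so each cell corresponds
    to the imaging area of the RGB image."""
    r, g, b = (rgb[..., i].astype(int) for i in range(3))
    # Identify plants based on color information: green channel dominates.
    plant_color = (g - np.maximum(r, b)) > green_margin
    # Identify plant height: crop the point-cloud heights to a zone of
    # interest limited to a range of heights.
    plant_height = (heights > min_h) & (heights < max_h)
    covered = plant_color & plant_height
    # The portion counts as plant-covered if enough cells match both cues.
    return covered.mean() > cover_ratio


if __name__ == "__main__":
    rgb = np.zeros((4, 4, 3), dtype=np.uint8)
    rgb[..., 1] = 200                 # predominantly green scene
    heights = np.full((4, 4), 0.3)    # 30 cm vegetation
    print(detect_plant_covered_portion(rgb, heights))  # True
```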
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Wykman in view of Haneda to include […] wherein the detection unit is configured to: acquire RGB image data including color information and ortho image data, the ortho image data being data corresponding to an imaging area of the RGB image data and including point cloud data; identify the plants based on the color information in the RGB image data; and identify height of the plants based on height information of the point cloud data in the ortho image data to detect the portion covered with the plants, as taught by Ellaboudy. This would have been for the benefit of providing a distance sensor connected to a vehicle, wherein the distance sensor is configured to output range data reflecting distances of objects with respect to the vehicle; actuators configured to control motion of the vehicle; and a processing apparatus configured to: access range data captured using the distance sensor; detect a crop row based on the range data to obtain position data for the crop row; determine, based on the position data for the crop row, a yaw and a lateral position of the vehicle with respect to a lane bounded by the crop row; and, based on the yaw and the lateral position, control, using one or more of the actuators, the vehicle to move along a length of the lane bounded by the crop row. [Ellaboudy Paragraph 0005]
Regarding claim 3, Wykman in view of Haneda and further in view of Ellaboudy teaches claim 1; accordingly, the rejection of claim 1 is incorporated above.
Wykman does not disclose The mower according to claim 1, wherein the detection unit compares first image data with second image data to detect the portion covered with the plants, the first image data being an image including the traveling road and acquired at a first time, and the second image data being data acquired at a second time different from the first time.
However, Haneda does teach The mower according to claim 1, wherein the detection unit compares first image data with second image data to detect the portion covered with the plants, the first image data being an image including the traveling road and acquired at a first time, and the second image data being data acquired at a second time different from the first time. (Haneda Paragraph 0027: "the judging section 142 judges the state of the cutting section 120 based on an image captured by an image-capturing section 130") (Haneda Paragraph 0030: "Examples of the factors to consider for judging the state of the cutting section 120 may include (i) the type of lawn grasses, (ii) the number or density of lawn grasses, (iii) the appearance of cut portions of lawn grasses,") (Haneda Paragraph 0034: "For example, first, a database about features of images of a cut work target is created for each state of the cutting section 120.") (Haneda Paragraph 0036: "In one embodiment, the judging section 142 compares a feature of the image captured by the image-capturing section 130 and features of images that are associated with respective states of the cutting section 120 in a database, and decides the state of the cutting section 120 matching the image captured by the image-capturing section 130.") (Haneda Paragraph 0057: "The monitoring camera 220 may capture an image of lawn grasses present around the lawn mower 230 while it is working. The monitoring camera 220 may capture an image of lawn grasses present in the forward direction in terms of a course of the lawn mower 230. The monitoring camera 220 may capture an image of lawn grasses present in a region that the lawn mower 230 passed through. Lawn grasses may be one example of a work target of the lawn mower 230. Lawn grasses may be one example of an object of image data.") (Note: An image captured while the lawn mower is working is acquired at a different time than an image of a region the lawn mower has already passed through.)
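(Note: For illustration only, the following minimal sketch models the claim 3 comparison of first image data and second image data acquired at different times to detect the portion covered with the plants. All names and the color heuristic below are hypothetical and are not drawn from Haneda.)

```python
# Minimal sketch of the two-time image comparison; hypothetical names only.
import numpy as np


def changed_plant_coverage(first_img: np.ndarray,
                           second_img: np.ndarray,
                           green_margin: int = 20) -> np.ndarray:
    """Both images: (H, W, 3) uint8, registered to the same road area.
    Returns a boolean mask of cells that appear plant-covered in the first
    image (e.g., at mowing time) but not in the second image (acquired at
    a time with fewer plants)."""
    def plant_mask(img: np.ndarray) -> np.ndarray:
        r, g, b = (img[..., i].astype(int) for i in range(3))
        return (g - np.maximum(r, b)) > green_margin

    # Cells that are plant-colored at the first time but not the second
    # indicate the portion newly covered with plants.
    return plant_mask(first_img) & ~plant_mask(second_img)
```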
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Wykman to include The mower according to claim 1, wherein the detection unit compares first image data with second image data to detect the portion covered with the plants, the first image data being an image including the traveling road and acquired at a first time, and the second image data being data acquired at a second time different from the first time, as taught by Haneda. This would have been for the benefit of providing a work machine 100 that includes an autonomous travelling section 110, a cutting section 120, an image-capturing section 130 and a control device 140, the control device 140 including, for example, a judging section 142 and a control section 144. [Haneda Paragraph 0020]
Regarding claim 4, Wykman in view of Haneda and further in view of Ellaboudy teaches claim 3; accordingly, the rejection of claim 3 is incorporated above.
Wykman does not disclose The mower according to claim 3, wherein: the first time is a mowing time; and the second time is a time in which there are less plants than the mowing time.
However, Haneda does disclose The mower according to claim 3, wherein: the first time is a mowing time; and the second time is a time in which there are less plants than the mowing time. (Haneda Paragraph 0057: "The monitoring camera 220 may capture an image of lawn grasses present around the lawn mower 230 while it is working. The monitoring camera 220 may capture an image of lawn grasses present in the forward direction in terms of a course of the lawn mower 230. The monitoring camera 220 may capture an image of lawn grasses present in a region that the lawn mower 230 passed through. Lawn grasses may be one example of a work target of the lawn mower 230. Lawn grasses may be one example of an object of image data.") (Note: An image captured while the lawn mower is working is acquired at a different time than an image of a region the lawn mower has already passed through, where fewer uncut plants remain.)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Wykman to include The mower according to claim 3, wherein: the first time is a mowing time; and the second time is a time in which there are less plants than the mowing time, as taught by Haneda. This would have been for the benefit of providing a work machine 100 that includes an autonomous travelling section 110, a cutting section 120, an image-capturing section 130 and a control device 140, the control device 140 including, for example, a judging section 142 and a control section 144. [Haneda Paragraph 0020]
Regarding claim 5, Wykman discloses The mower according to claim 1, wherein the sensor unit includes at least one of a plurality of sensors including a distance measuring sensor, a camera, a gyro sensor, a magnetic sensor, an acceleration sensor, and a radar sensor. (Wykman Paragraph 0065: "If a sensor network 30 is employed, the sensor network 30 may include sensors related to positional determination (e.g., a GPS receiver, an accelerometer, a camera, an inertial measurement unit and/or the like).") (Wykman Paragraph 0073: "a sensor network 30 disposed at the robotic mower 15")
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KEVIN J HARVEY whose telephone number is 571-272-5327. The examiner can normally be reached 7:16AM-4:46PM M, 6:30AM-5:00PM T-W, 7:30AM-5:00PM Th, 8:00AM-4:00PM F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kito Robinson can be reached at 571-270-3921. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/K.J.H./Junior Patent Examiner, Art Unit 3664
/KITO R ROBINSON/Supervisory Patent Examiner, Art Unit 3664