DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This action is in response to amendments and remarks filed on 09/30/2025. The examiner notes the following adjustments to the claims by the applicant:
Claims 1, 5, 9, 10, and 14-18 are amended; and
Claims 6-8 are cancelled.
Therefore, Claims 1-5 and 9-20 are pending, of which Claims 1, 10, and 18 are independent claims.
In light of the instant amendments and arguments:
Further examination resulted in a new rejection of Claims 1-5 and 9-20 under 35 U.S.C. § 103, as detailed below.
THIS ACTION IS MADE FINAL, as necessitated by amendment.
Claim Objections
Amended Claim 10 is objected to because of the following informality:
The phrase “classified s a dull cut end” is grammatically incorrect, and should be “classified as a dull cut end”.
Appropriate correction is required.
Response to Arguments
Applicant presents the following arguments regarding the previous office action:
To overcome the 35 U.S.C. § 102 and § 103 rejections, the applicant has amended each independent claim to include the additional underlined limitations: "identify a plurality of cut ends of the cut crop in the image; classify each of the plurality of cut ends of the cut crop as one of a sharp cut end and a dull cut end; calculate a frequency of dull cut ends of the cut crop as a ratio of a total number of the plurality of cut ends classified as a dull cut end relative to a sum of the total number of the plurality of cut ends classified as a dull cut end and a total number of the plurality of cut ends classified as a sharp cut end; determine a cut quality based on the frequency of dull cut ends of the cut crop";
“Applicant respectfully submits that none of the prior art references of record teach or disclose determining the cut quality of a blade based on the ratio of dull cut ends to the sum of the dull cut ends and the sharp cut ends. Referring to pages 16 and 17 of the Non-Final Office Action mailed on 7/29/25, the Examiner notes that, with respect to claims 7 and 8 as originally submitted, the Dugas reference discloses "the message may be triggered when the cut quality diminishes to a predetermined level, or to a predetermined level for a predetermined amount of time". The Dugas reference does not teach or otherwise disclose with any specificity what may be considered and/or define the predetermined level or the predetermined level for a predetermined level of time. Applicant respectfully submits that the Dugas reference does not teach or disclose determining the cut quality of a blade based on the ratio of dull cut ends to the sum of the dull cut ends and the sharp cut ends.”
Applicant's arguments appear to be directed to the instantly amended subject matter. Accordingly, they have been addressed in the rejections below.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 5, 10-11 and 14-19 are rejected under 35 U.S.C. § 103 as being unpatentable over the combination of Haneda et al. (US 11,457,559 B2, henceforth Haneda) and Dugas et al. (US 12,102,033 B2, henceforth Dugas).
Regarding Claim 1, Haneda explicitly discloses the limitations: a mower implement {620, Fig. 6} comprising: a frame {body and internal structure of 102, Fig. 1, and 602, Fig. 6} moveable across a ground surface in a direction of travel during operation {“the work machine 102 for example can travel by automatic operation by a computer mounted on the work machine 102. The work machine 102 may travel by remote manipulation by a user. The work machine 102 may be a travelling body to run, a travelling body to fly, or a travelling body to travel in or on water.”, Col. 2, Lns. 20-25}; a cutter coupled to the frame and including a blade operable to cut crop material as the frame moves across the ground surface {“the lawn mower 230 includes a work unit 620. The work unit 620 for example has a blade disk 622, a cutter blade 624, a motor for work 626 and a shaft 628. The lawn mower 230 may include a position adjusting section 630 that adjusts a position of the work unit 620. The work unit 620 may be one example of a cutting section.”, Col. 23, Lns. 13-18 and Fig. 6}; an image sensor coupled to the frame {image-capturing unit 660, Fig. 6} and positioned to capture an image of cut crop rearward of the cutter relative to the direction of travel during operation {“the image-capturing unit 660 captures an image of the space around the lawn mower 230. The image-capturing unit 660 may capture an image of lawn grasses to be a work target of the lawn mower 230. The image-capturing unit 660 may capture an image of lawn grasses cut by the lawn mower 230. The image-capturing unit 660 may acquire a still image of an object or acquire a moving image of an object. The image-capturing unit 660 may have a plurality of image sensors. The image-capturing unit 660 may be a 360-degree angle camera.”, Col. 23, Lns. 43-52}; a blade diagnostic controller {judgement processing section 440, including blade judging section 444 and interfacing with “learning model”, Fig. 4} including a processor and a memory having a diagnostic algorithm stored therein, wherein the processor is operable to execute the diagnostic algorithm {“information processing device may include: (i) a data processing device having processors….storage devices (including external storage devices) such as a memory or a HDD. In the above-mentioned information processing device, the above-mentioned data processing device or storage devices may store the above-mentioned software or program. Upon being executed by a processor, the above-mentioned software or program causes the above-mentioned information processing device to execute operations stipulated by the software or program”, Col. 5, Lns. 7-23; and control device in Fig. 1} to: capture an image of the cut crop rearward of the cutter with the image sensor {“The image-capturing unit 660 may capture an image of lawn grasses cut by the lawn mower 230.…The image-capturing unit 660 may be a 360-degree angle camera.”, Col. 23, Lns. 43-52} as the frame moves across the ground surface {image analyzing section 320, Figs. 3-4, captures images of the cut grass for analysis to determine the “cut state”: “The cut state of lawn grasses may be the state of cut surfaces.”, Col. 5, Lns. 7-18}; identify a plurality of cut ends of the cut crop in the image {“the lawn state judging section 442 judges the state of lawn grasses. For example, the lawn state judging section 442 judges the cut state of lawn grasses based on a feature of end portions of the lawn grasses.”, Col. 18, Lns. 15-18; “The cut state of lawn grasses may be the state of cut surfaces.”, Col. 15, Lns. 55-56}; determine a cut quality of the cut end of the cut crop {“The lawn recognizing section 430 (i) may recognize the form of lawn grasses based on a predetermined determination criterion or algorithm, or…utilizing a learning model obtained through machine learning.
The above-mentioned determination criterion may be a general criterion for extracting an outline of an object, or information indicating a condition about each among one or more factors to consider to be used for extracting the shapes or end portions of lawn grasses.”, Col. 14, Lns. 3-11}; correlate the cut quality to a blade sharpness index {“the lawn state judging section 442 transmits, to the blade state judging section 444, information indicating the cut state of the lawn grass. Then, the blade state judging section 444 judges the state of a blade based on the cut state of the lawn grass.”, Col. 27, Lns. 60-64; “the parameter generating section 446 may generate a control parameter utilizing a result of judgment about the state of a blade.”, Col. 18, Lns. 41-43}; and communicate an index signal to a communicator, wherein the index signal controls the communicator to generate a communication {“if a state parameter…satisfies a predetermined condition, the instruction generating section 330 generates an instruction for displaying, on a user interface of the lawn mower 230, a message corresponding to the above-mentioned condition. For example, if the state parameter indicates that maintenance of or a check on a blade is necessary, the instruction generating section 330 generates an instruction for displaying, on the user interface of the lawn mower 230, a message indicating that maintenance of or a check on the blade is recommended.”, Col. 10, Lns. 47-57; also, transmitting section 340, Fig. 3, and Col. 10, Lns. 3-18} indicating the blade sharpness index {state-of-a-blade: “the lawn state judging section 442 transmits, to the blade state judging section 444, a result of judgment about the cut state of lawn grasses. Thereby, the blade state judging section 444 can judge the state of a blade utilizing a result of judgment about the cut state of lawn grasses”, Col. 18, Lns. 31-34; also, “the control unit 680 may execute various types of judgment processes. 
The control unit 680 may execute at least one of judgment processes at the judgment processing section 440…the control unit 680 judges the state of the work unit 620 based on image data of an image captured by the image-capturing unit 660. The state of the work unit 620 may be the cutting performance of the cutter blade 624.”, Col. 24, Lns. 7-16}.
Haneda does not appear to explicitly recite the limitation: classify each of the plurality of cut ends of the cut crop as one of a sharp cut end and a dull cut end; calculate a frequency of dull cut ends of the cut crop as a ratio of a total number of the plurality of cut ends classified as a dull cut end relative to a sum of the total number of the plurality of cut ends classified as a dull cut end and a total number of the plurality of cut ends classified as a sharp cut end; determine a cut quality based on the frequency of dull cut ends of the cut crop.
However, Dugas explicitly recites the limitation: classify each of the plurality of cut ends of the cut crop as one of a sharp cut end and a dull cut end {“The control system is configured to receive the signal from the sensor and programmed to 1) classify a cut quality of the billet based on the signal, wherein classifying the cut quality includes assigning a cut quality indicator from a range of cut quality indicators to the billet, wherein the range of cut quality indicators includes at least one indicator of relatively high cut quality and at least one indicator of relatively low cut quality, and 2) index the cut quality indicator into the memory.”, Col. 1, Lns. 50-58}; calculate a frequency of dull cut ends of the cut crop as a ratio of a total number of the plurality of cut ends classified as a dull cut end relative to a sum of the total number of the plurality of cut ends classified as a dull cut end and a total number of the plurality of cut ends classified as a sharp cut end {calculating a ratio and/or a proportion using a pair of known quantities (such as the low and high cut qualities discussed in Col. 1, Lns. 50-58) is well known to one skilled in the art}; determine a cut quality based on the frequency of dull cut ends of the cut crop {determination of repetitive poor cut quality over a period of time: “The control system 100 is configured to communicate a message (illustrated schematically in FIG. 4 and described above), by way of the human machine interface 108, informative of blade wear based on the indexed cut qualities. For example, the message may be triggered when the cut quality diminishes to a predetermined level, or to a predetermined level for a predetermined amount of time, or in other suitable ways.”, Col. 7, Lns. 52-59}.
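For clarity of the record, the claimed frequency calculation reduces to simple arithmetic. The following is an illustrative sketch only; the function name and the example counts are hypothetical and are not drawn from Haneda or Dugas:

```python
def dull_cut_frequency(num_dull: int, num_sharp: int) -> float:
    """Frequency of dull cut ends as claimed: the ratio of the total
    number of dull cut ends to the sum of the total numbers of dull
    and sharp cut ends identified in the image."""
    total = num_dull + num_sharp
    if total == 0:
        raise ValueError("no cut ends identified in the image")
    return num_dull / total

# Hypothetical example: 30 dull and 70 sharp cut ends in one image.
assert dull_cut_frequency(30, 70) == 0.3
```

Under this formulation the frequency ranges from 0.0 (all cut ends classified as sharp) to 1.0 (all classified as dull), and the claimed cut quality determination is a function of this single value.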
Haneda and Dugas are analogous art because they both deal with the quality of cuts produced by a crop- or grass-cutting machine.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Haneda and Dugas before them, to modify the teachings of Haneda to include the teachings of Dugas to classify and index data related to cut quality {“The control system is configured to receive the signal from the sensor and programmed to 1) analyze the three-dimensional appearance of the at least a portion of the billet, 2) classify the three-dimensional appearance using an indicator of cut quality and 3) index the indicator of cut quality into the memory.”, Abstract}.
Regarding Claim 2, the combination of Haneda and Dugas discloses the limitations of Claim 1, as discussed supra. In addition, Haneda explicitly discloses the limitation: wherein the processor is operable to execute the diagnostic algorithm to automatically communicate a maintenance request signal to the communicator when the blade sharpness index is below a sharpness threshold, wherein the maintenance request signal controls the communicator to generate a communication requesting maintenance for the blade {“if a state parameter…satisfies a predetermined condition, the instruction generating section 330 generates an instruction for displaying, on a user interface of the lawn mower 230, a message corresponding to the above-mentioned condition. For example, if the state parameter indicates that maintenance of or a check on a blade is necessary, the instruction generating section 330 generates an instruction for displaying, on the user interface of the lawn mower 230, a message indicating that maintenance of or a check on the blade is recommended.”, Col. 10, Lns. 47-57}.
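The comparison recited in Claim 2 amounts to a threshold test on the blade sharpness index. A minimal illustrative sketch follows; the names and the threshold value are hypothetical:

```python
def maintenance_requested(blade_sharpness_index: float,
                          sharpness_threshold: float) -> bool:
    """Return True when a maintenance request signal should be
    communicated, i.e., when the index falls below the threshold."""
    return blade_sharpness_index < sharpness_threshold

# Hypothetical values: a worn blade (index 0.4) against a 0.6 threshold.
assert maintenance_requested(0.4, 0.6) is True
assert maintenance_requested(0.8, 0.6) is False
```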
Regarding Claim 5, the combination of Haneda and Dugas discloses the limitations of Claim 1, as discussed supra. In addition, Haneda explicitly discloses the limitation: wherein the processor is operable to execute the diagnostic algorithm to identify the plurality of cut ends in the image via pattern matching and recognition using a neural network {“A feature about the appearance of the work target 104 may be a feature about (i) at least one of the shape, hue and luster of the work target 104, (ii) a matter attaching to the work target 104, (iii) a leakage from the work target 104, and the like. A feature about the appearance of the work target 104 may be a feature about end portions of the above-mentioned work target. In a feature recognition process of the feature recognizing section 120, a known image recognition technology may be utilized…In an image recognition process, an image recognition technology utilizing machine learning may be utilized. In information processing of the feature recognizing section 120, machine learning may be utilized. The machine learning may be supervised learning, unsupervised learning, or reinforcement learning. In the learning process, learning techniques using a neural network technology...”, Col. 3, Lns. 21-44; see also “lawn state judging section 442” in Col. 17, Lns. 55-64}.
Haneda does not appear to explicitly recite the limitation: identify the cut end of the cut crop in the image via pattern matching and recognition using a convolutional neural network.
However, Dugas explicitly recites the limitation: identify the cut end of the cut crop in the image {“The control system 100 receives the signal from the sensor 70 indicative of the appearance of the billet B. The control system 100 may continuously or periodically analyze the appearance of the billets B downstream of the chopper 28, e.g., to measure parameters of a cut.”, Col. 6, Lns. 4-8} via pattern matching and recognition using a convolutional neural network {convolutional neural network used to identify features of the cut end of a crop: “For example, roundness, degree of crushing, number of cut surfaces, appearance of severed fibers, or a deviation from an optimal appearance are parameters in determining the level of damage. As a more specific example, severed fibers in the cut area A may have an increasingly jagged and/or loose appearance as the blade 30 wears. The control system 100 may measure lengths of the severed fibers based on the signal from the sensor 70 to assess the level of damage and therefore the cut quality. The lengths may be saved in the memory 106 but need not be saved in some implementations. Such an algorithm may be hard-coded or may employ a neural network trained with pre-classified images to recognize images having various lengths of severed fibers. For example, the neural network may include a convolutional neural network.”, Col. 12, Lns. 21-24}.
Regarding Claim 10, Haneda explicitly discloses the limitations: a mower implement {620, Fig. 6} comprising: a cutter including a blade operable to cut crop material {“the lawn mower 230 includes a work unit 620. The work unit 620 for example has a blade disk 622, a cutter blade 624, a motor for work 626 and a shaft 628. The lawn mower 230 may include a position adjusting section 630 that adjusts a position of the work unit 620. The work unit 620 may be one example of a cutting section.”, Col. 23, Lns. 13-18, and Fig. 6}; an image sensor {image-capturing unit 660, Fig. 6} positioned to capture an image of cut crop stubble rearward of the cutter relative to a direction of travel during operation {“the image-capturing unit 660 captures an image of the space around the lawn mower 230. The image-capturing unit 660 may capture an image of lawn grasses to be a work target of the lawn mower 230.”, Col. 23, Lns. 43-52}; a blade diagnostic controller {judgement processing section 440, including blade judging section 444 and interfacing with “learning model”, Fig. 4} including a processor and a memory having a diagnostic algorithm stored therein, wherein the processor is operable to execute the diagnostic algorithm {“information processing device may include: (i) a data processing device having processors….storage devices (including external storage devices) such as a memory or a HDD…the above-mentioned software or program causes the above-mentioned information processing device to execute operations stipulated by the software or program”, Col. 5, Lns. 7-23; and control device in Fig. 1} to: capture an image {“The image-capturing unit 660 may capture an image of lawn grasses cut by the lawn mower 230.…The image-capturing unit 660 may be a 360-degree angle camera.”, Col. 23, Lns. 43-52} of the cut crop stubble rearward of the cutter with the image sensor {cut state: “the lawn state judging section 442 transmits, to the blade state judging section 444, information indicating the cut state of the lawn grass. Then, the blade state judging section 444 judges the state of a blade based on the cut state of the lawn grass.”, Col. 27, Lns. 60-64; also Col. 16, Lns. 15-18}; identify a plurality of cut ends of the cut crop stubble in the image {“the lawn state judging section 442 recognizes a feature of end portions of lawn grasses based on the form of the lawn grasses, and judges the state of the lawn grasses based on the feature. The lawn state judging section 442 (i) may recognize a feature of lawn grasses or a feature of end portions of the lawn grasses based on a predetermined determination criterion, or (ii) may recognize a feature of lawn grasses or a feature of end portions of the lawn grasses”, Col. 16, Lns. 15-23}; determine a cut quality of the cut end of the cut crop stubble {“The lawn recognizing section 430 (i) may recognize the form of lawn grasses based on a predetermined determination criterion or algorithm, or…utilizing a learning model obtained through machine learning. The above-mentioned determination criterion may be a general criterion for extracting an outline of an object, or information indicating a condition about each among one or more factors to consider to be used for extracting the shapes or end portions of lawn grasses.”, Col. 14, Lns. 3-11}; correlate the cut quality to a blade sharpness index {“the lawn state judging section 442 transmits, to the blade state judging section 444, information indicating the cut state of the lawn grass. Then, the blade state judging section 444 judges the state of a blade based on the cut state of the lawn grass.”, Col. 27, Lns. 60-64; “the parameter generating section 446 may generate a control parameter utilizing a result of judgment about the state of a blade.”, Col. 18, Lns. 41-43}; and communicate an index signal to a communicator, wherein the index signal controls the communicator to generate a communication {“if a state parameter…satisfies a predetermined condition, the instruction generating section 330 generates an instruction for displaying, on a user interface of the lawn mower 230, a message corresponding to the above-mentioned condition. For example, if the state parameter indicates that maintenance of or a check on a blade is necessary, the instruction generating section 330 generates an instruction for displaying, on the user interface of the lawn mower 230”, Col. 10, Lns. 47-57; also, transmitting section 340, Fig. 3, and Col. 10, Lns. 3-18} indicating the blade sharpness index {state-of-a-blade: “the lawn state judging section 442 transmits, to the blade state judging section 444, a result of judgment about the cut state of lawn grasses. Thereby, the blade state judging section 444 can judge the state of a blade utilizing a result of judgment about the cut state of lawn grasses”, Col. 18, Lns. 31-34; also, “the control unit 680 may execute various types of judgment processes. The control unit 680 may execute at least one of judgment processes at the judgment processing section 440…the control unit 680 judges the state of the work unit 620 based on image data of an image captured by the image-capturing unit 660. The state of the work unit 620 may be the cutting performance of the cutter blade 624.”, Col. 24, Lns. 7-16}.
Haneda does not appear to explicitly recite the limitation: classify each of the plurality of cut ends of the cut crop stubble as one of a sharp cut end and a dull cut end; calculate a frequency of dull cut ends of the cut crop stubble as a ratio of a total number of the plurality of cut ends classified as a dull cut end relative to a sum of the total number of the plurality of cut ends classified as a dull cut end and a total number of the plurality of cut ends classified as a sharp cut end; determine a cut quality based on the frequency of dull cut ends of the cut crop stubble.
However, Dugas explicitly recites the limitation: classify each of the plurality of cut ends of the cut crop stubble as one of a sharp cut end and a dull cut end {“The control system is configured to receive the signal from the sensor and programmed to 1) classify a cut quality of the billet based on the signal, wherein classifying the cut quality includes assigning a cut quality indicator from a range of cut quality indicators to the billet, wherein the range of cut quality indicators includes at least one indicator of relatively high cut quality and at least one indicator of relatively low cut quality, and 2) index the cut quality indicator into the memory.”, Col. 1, Lns. 50-58}; calculate a frequency of dull cut ends of the cut crop stubble as a ratio of a total number of the plurality of cut ends classified as a dull cut end relative to a sum of the total number of the plurality of cut ends classified as a dull cut end and a total number of the plurality of cut ends classified as a sharp cut end {calculating a ratio and/or a proportion using a pair of known quantities (such as the low and high cut qualities discussed in Col. 1, Lns. 50-58) is well known to one skilled in the art}; determine a cut quality based on the frequency of dull cut ends of the cut crop stubble {determination of repetitive poor cut quality over a period of time: “The control system 100 is configured to communicate a message (illustrated schematically in FIG. 4 and described above), by way of the human machine interface 108, informative of blade wear based on the indexed cut qualities. For example, the message may be triggered when the cut quality diminishes to a predetermined level, or to a predetermined level for a predetermined amount of time, or in other suitable ways.”, Col. 7, Lns. 52-59}.
Regarding Claim 11, the combination of Haneda and Dugas discloses the limitations of Claim 10, as discussed supra. In addition, Haneda explicitly discloses the limitation: wherein the processor is operable to execute the diagnostic algorithm to automatically communicate a maintenance request signal to the communicator when the blade sharpness index is below a sharpness threshold, wherein the maintenance request signal controls the communicator to generate a communication requesting maintenance for the blade {“if a state parameter…satisfies a predetermined condition, the instruction generating section 330 generates an instruction for displaying, on a user interface of the lawn mower 230, a message corresponding to the above-mentioned condition. For example, if the state parameter indicates that maintenance of or a check on a blade is necessary, the instruction generating section 330 generates an instruction for displaying, on the user interface of the lawn mower 230, a message indicating that maintenance of or a check on the blade is recommended.”, Col. 10, Lns. 47-57}.
Regarding Claim 14, the combination of Haneda and Dugas discloses the limitations of Claim 11, as discussed supra. In addition, Haneda explicitly discloses the limitation: wherein the processor is operable to execute the diagnostic algorithm to identify the cut end of the cut crop stubble in the image via pattern matching and recognition using a neural network {“A feature about the appearance of the work target 104 may be a feature about (i) at least one of the shape, hue and luster of the work target 104, (ii) a matter attaching to the work target 104, (iii) a leakage from the work target 104, and the like. A feature about the appearance of the work target 104 may be a feature about end portions of the above-mentioned work target. In a feature recognition process of the feature recognizing section 120, a known image recognition technology may be utilized…In an image recognition process, an image recognition technology utilizing machine learning may be utilized. In information processing of the feature recognizing section 120, machine learning may be utilized. The machine learning may be supervised learning, unsupervised learning, or reinforcement learning. In the learning process, learning techniques using a neural network technology...”, Col. 3, Lns. 21-44}.
Haneda does not appear to explicitly recite the limitation: identify the cut end of the cut crop stubble in the image via pattern matching and recognition using a convolutional neural network.
However, Dugas explicitly recites the limitation: identify the cut end of the cut crop stubble in the image {“The control system 100 receives the signal from the sensor 70 indicative of the appearance of the billet B. The control system 100 may continuously or periodically analyze the appearance of the billets B downstream of the chopper 28, e.g., to measure parameters of a cut.”, Col. 6, Lns. 4-8} via pattern matching and recognition using a convolutional neural network {convolutional neural network used to identify features of the cut end of a crop: “For example, roundness, degree of crushing, number of cut surfaces, appearance of severed fibers, or a deviation from an optimal appearance are parameters in determining the level of damage. As a more specific example, severed fibers in the cut area A may have an increasingly jagged and/or loose appearance as the blade 30 wears. The control system 100 may measure lengths of the severed fibers based on the signal from the sensor 70 to assess the level of damage and therefore the cut quality. The lengths may be saved in the memory 106 but need not be saved in some implementations. Such an algorithm may be hard-coded or may employ a neural network trained with pre-classified images to recognize images having various lengths of severed fibers. For example, the neural network may include a convolutional neural network.”, Col. 12, Lns. 21-24}.
Regarding Claim 15, the combination of Haneda and Dugas discloses the limitations of Claim 14, as discussed supra. Haneda does not appear to explicitly recite the limitation: wherein the convolutional neural network is operable to classify each of the plurality of cut ends of the cut crop stubble as one of a sharp cut end and a dull cut end.
However, Dugas explicitly recites the limitation: wherein the convolutional neural network {Col. 12, Lns. 21-24} is operable to classify the cut end of the cut crop stubble as one of a sharp cut end and a dull cut end {“The control system is configured to receive the signal from the sensor and programmed to 1) classify a cut quality of the billet based on the signal, wherein classifying the cut quality includes assigning a cut quality indicator from a range of cut quality indicators to the billet, wherein the range of cut quality indicators includes at least one indicator of relatively high cut quality and at least one indicator of relatively low cut quality, and 2) index the cut quality indicator into the memory.”, Col. 1, Lns. 50-58}.
Regarding Claim 16, the combination of Haneda and Dugas discloses the limitations of Claim 15, as discussed supra. Haneda does not appear to explicitly recite the limitation: wherein the frequency of dull cut ends of the cut crop stubble is calculated over a period of time from a plurality of images.
However, Dugas explicitly recites the limitation: wherein the frequency of dull cut ends of the cut crop stubble is calculated over a period of time from a plurality of images {determination of repetitive poor cut quality over a period of time: “The control system 100 is configured to communicate a message (illustrated schematically in FIG. 4 and described above), by way of the human machine interface 108, informative of blade wear based on the indexed cut qualities. For example, the message may be triggered when the cut quality diminishes to a predetermined level, or to a predetermined level for a predetermined amount of time, or in other suitable ways.”, Col. 7, Lns. 52-59}.
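As recited in Claim 16, the frequency may be calculated over a period of time from a plurality of images; in the simplest reading, the per-image classification counts are pooled before the ratio is taken. The following is an illustrative sketch only, with hypothetical names and counts:

```python
def dull_cut_frequency_over_time(per_image_counts):
    """Pool (dull, sharp) cut-end counts from a plurality of images
    captured over a period of time, then compute the dull-cut frequency
    as dull / (dull + sharp) over the pooled totals."""
    total_dull = sum(dull for dull, _ in per_image_counts)
    total_sharp = sum(sharp for _, sharp in per_image_counts)
    total = total_dull + total_sharp
    if total == 0:
        raise ValueError("no cut ends identified across the images")
    return total_dull / total

# Hypothetical (dull, sharp) counts from three successive images.
assert dull_cut_frequency_over_time([(1, 9), (2, 8), (2, 3)]) == 0.2
```

Pooling the counts before dividing weights each image by the number of cut ends it contains, so images with more identified cut ends contribute proportionally more to the frequency.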
Regarding Claim 17, the combination of Haneda and Dugas discloses the limitations of Claim 16, as discussed supra. In addition, Haneda explicitly recites the limitations: wherein the processor is operable to execute the diagnostic algorithm {blade judging section 444, Fig. 4 of image analyzing section 320, Figs. 3-4 and judgement processing section 440, Fig. 4: “the image analyzing section 320 transmits at least one of the state parameter, the control parameter and the water-supply parameter to the instruction generating section 330.”, Col. 10, Lns. 5-8} to determine the cut quality based on a moisture content of the crop material {plant water content parameter: “various types of parameters may include (i) a parameter indicating the state of the lawn mower 230 (which may be sometimes referred to as a state parameter)…(iii) a parameter about whether water-supply to plants in the garden is necessary or not, or the level of water content in a medium of the plants (which may be sometimes referred to as a water-supply parameter), and the like. The state parameter may be a parameter indicating the state of a blade to cut lawn grasses.”, Col. 9, Lns. 33-42, and “The parameter about the level of water content in a medium may be the water content in the medium”, Col. 9, Lns. 54-55} and a speed of the blade {“the parameter generating section 446 receives a result of judgment about the state of the blade from the blade state judging section 444. The parameter generating section 446 generates a control parameter based on a result of judgment about the state of the blade. For example, if the cutting performance of the blade does not satisfy a predetermined condition, the parameter generating section 446 decides the control parameter such…(ii) a rotational speed of the blade becomes higher, as compared with a case where the cutting performance of the blade satisfies the predetermined condition.”, Col. 20, Ln. 62 - Col. 21, Ln. 6}, using a blade sharpness index model saved on the memory {“The lawn recognizing section 430 (i) may recognize the form of lawn grasses based on a predetermined determination criterion or algorithm, or…utilizing a learning model obtained through machine learning. The above-mentioned determination criterion may be a general criterion for extracting an outline of an object, or information indicating a condition about each among one or more factors to consider to be used for extracting the shapes or end portions of lawn grasses.”, Col. 14, Lns. 3-11}.
Regarding Claim 18, Haneda explicitly discloses the limitations: a method of monitoring a blade of a mower implement {602, Fig. 6}, the method comprising: capturing an image of cut crop stubble rearward of the cutter {“The image-capturing unit 660 may capture an image of lawn grasses cut by the lawn mower 230.…The image-capturing unit 660 may be a 360-degree angle camera.”, Col. 23, Lns. 43-52} with an image sensor mounted to the mower implement {imaging analyzing section 320, Figs. 3-4, captures images of the cut grass for analysis to determine the “cut state”: “The cut state of lawn grasses may be the state of cut surfaces.”, Col. 5, Lns. 7-18} as the mower implement moves across a ground surface {“the work machine 102 for example can travel by automatic operation by a computer mounted on the work machine 102. The work machine 102 may travel by remote manipulation by a user. The work machine 102 may be a travelling body to run, a travelling body to fly, or a travelling body to travel in or on water.”, Col. 2, Lns. 20-25}; identifying a plurality of cut ends of the cut crop stubble in the image {“the lawn state judging section 442 judges the state of lawn grasses. For example, the lawn state judging section 442 judges the cut state of lawn grasses based on a feature of end portions of the lawn grasses.”, Col. 18, Lns. 15-18; “The cut state of lawn grasses may be the state of cut surfaces.”, Col. 15, Lns. 55-56} with a blade diagnostic controller {judgement processing section 440, including blade judging section 444 and interfacing with “learning model”, Fig. 4} using a neural network; determining a cut quality of the cut end of the cut crop stubble {“The lawn recognizing section 430 (i) may recognize the form of lawn grasses based on a predetermined determination criterion or algorithm, or…utilizing a learning model obtained through machine learning.
The above-mentioned determination criterion may be a general criterion for extracting an outline of an object, or information indicating a condition about each among one or more factors to consider to be used for extracting the shapes or end portions of lawn grasses.”, Col. 14, Lns. 3-11} with the blade diagnostic controller using the neural network {cut quality determined using features identified with machine learning/neural network: “A feature about the appearance of the work target 104 may be a feature about (i) at least one of the shape, hue and luster of the work target 104, (ii) a matter attaching to the work target 104, (iii) a leakage from the work target 104, and the like. A feature about the appearance of the work target 104 may be a feature about end portions of the above-mentioned work target…In an image recognition process, an image recognition technology utilizing machine learning may be utilized. In information processing of the feature recognizing section 120, machine learning may be utilized. The machine learning may be supervised learning, unsupervised learning, or reinforcement learning. In the learning process, learning techniques using a neural network technology...”, Col. 3, Lns. 21-44}; correlating the cut quality to a blade sharpness index with the blade diagnostic controller {“the lawn state judging section 442 transmits, to the blade state judging section 444, information indicating the cut state of the lawn grass. Then, the blade state judging section 444 judges the state of a blade based on the cut state of the lawn grass.”, Col. 27, Lns. 60-64; “the parameter generating section 446 may generate a control parameter utilizing a result of judgment about the state of a blade.”, Col. 18, Lns. 
41-43}; and communicating an index signal to a communicator, wherein the index signal controls the communicator to generate a communication {“if a state parameter…satisfies a predetermined condition, the instruction generating section 330 generates an instruction for displaying, on a user interface of the lawn mower 230, a message corresponding to the above-mentioned condition. For example, if the state parameter indicates that maintenance of or a check on a blade is necessary, the instruction generating section 330 generates an instruction for displaying, on the user interface of the lawn mower 230”, Col. 10, Lns. 47-57; also, transmitting section 340, Fig. 3, and Col. 10, Lns. 3-18} indicating the blade sharpness index {state-of-a-blade: “the lawn state judging section 442 transmits, to the blade state judging section 444, a result of judgment about the cut state of lawn grasses. Thereby, the blade state judging section 444 can judge the state of a blade utilizing a result of judgment about the cut state of lawn grasses”, Col. 18, Lns. 31-34}.
Haneda does not appear to explicitly recite the limitations: identifying a cut end of the cut crop stubble in the image with a blade diagnostic controller using a convolutional neural network; determining a cut quality of the cut end of the cut crop stubble with the blade diagnostic controller using the convolutional neural network; classify each of the plurality of cut ends of the cut crop as one of a sharp cut end and a dull cut end; calculate a frequency of dull cut ends of the cut crop as a ratio of a total number of the plurality of cut ends classified as a dull cut end relative to a sum of the total number of the plurality of cut ends classified as a dull cut end and a total number of the plurality of cut ends classified as a sharp cut end; determine a cut quality based on the frequency of dull cut ends of the cut crop.
However, Dugas explicitly recites the limitations: identifying a cut end of the cut crop stubble {“The control system 100 receives the signal from the sensor 70 indicative of the appearance of the billet B. The control system 100 may continuously or periodically analyze the appearance of the billets B downstream of the chopper 28, e.g., to measure parameters of a cut.”, Col. 6, Lns. 4-8} in the image with a blade diagnostic controller using a convolutional neural network; determining a cut quality of the cut end of the cut crop stubble with the blade diagnostic controller using the convolutional neural network {convolutional neural network used to identify features of the cut end of a crop: “For example, roundness, degree of crushing, number of cut surfaces, appearance of severed fibers, or a deviation from an optimal appearance are parameters in determining the level of damage. As a more specific example, severed fibers in the cut area A may have an increasingly jagged and/or loose appearance as the blade 30 wears. The control system 100 may measure lengths of the severed fibers based on the signal from the sensor 70 to assess the level of damage and therefore the cut quality. The lengths may be saved in the memory 106 but need not be saved in some implementations. Such an algorithm may be hard-coded or may employ a neural network trained with pre-classified images to recognize images having various lengths of severed fibers. For example, the neural network may include a convolutional neural network.”, Col. 12, Lns.
21-24}; classify each of the plurality of cut ends of the cut crop as one of a sharp cut end and a dull cut end {“The control system is configured to receive the signal from the sensor and programmed to 1) classify a cut quality of the billet based on the signal, wherein classifying the cut quality includes assigning a cut quality indicator from a range of cut quality indicators to the billet, wherein the range of cut quality indicators includes at least one indicator of relatively high cut quality and at least one indicator of relatively low cut quality, and 2) index the cut quality indicator into the memory.”, Col. 1, Lns. 50-58}; calculate a frequency of dull cut ends of the cut crop as a ratio of a total number of the plurality of cut ends classified as a dull cut end relative to a sum of the total number of the plurality of cut ends classified as a dull cut end and a total number of the plurality of cut ends classified as a sharp cut end {calculating ratio and/or proportion using a pair of known quantities (such as the low and high cut qualities discussed in Col. 1, Lns. 50-58) is well known to one skilled in the art}; determine a cut quality based on the frequency of dull cut ends of the cut crop {determination of repetitive poor cut quality over a period of time: “The control system 100 is configured to communicate a message (illustrated schematically in FIG. 4 and described above), by way of the human machine interface 108, informative of blade wear based on the indexed cut qualities. For example, the message may be triggered when the cut quality diminishes to a predetermined level, or to a predetermined level for a predetermined amount of time, or in other suitable ways.”, Col. 7, Lns. 52-59}.
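The claimed frequency calculation mapped above is simple arithmetic over the two classification counts. As an illustrative sketch only (the function name and inputs are hypothetical and appear in neither Haneda nor Dugas):

```python
def dull_cut_frequency(num_dull: int, num_sharp: int) -> float:
    """Frequency of dull cut ends, per the claim language:
    dull / (dull + sharp), i.e., dull ends relative to all classified ends."""
    total = num_dull + num_sharp
    if total == 0:
        raise ValueError("no cut ends were classified")
    return num_dull / total

# Example: 12 dull and 28 sharp cut ends give a frequency of 12/40 = 0.3
```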
Regarding Claim 19, the combination of Haneda and Dugas discloses the limitations of Claim 18, as discussed supra. In addition, Haneda explicitly discloses the limitation: wherein the processor is operable to execute the diagnostic algorithm to automatically communicate a maintenance request signal to the communicator when the blade sharpness index is below a sharpness threshold, wherein the maintenance request signal controls the communicator to generate a communication requesting maintenance for the blade {“if a state parameter…satisfies a predetermined condition, the instruction generating section 330 generates an instruction for displaying, on a user interface of the lawn mower 230, a message corresponding to the above-mentioned condition. For example, if the state parameter indicates that maintenance of or a check on a blade is necessary, the instruction generating section 330 generates an instruction for displaying, on the user interface of the lawn mower 230, a message indicating that maintenance of or a check on the blade is recommended.”, Col. 10, Lns. 47-57}.
Claims 3-4, 12-13 and 20 are rejected under 35 U.S.C. § 103 as being unpatentable over the combination of Haneda, Dugas and Wente et al. (US 12,277,692 B2, henceforth Wente).
Regarding Claim 3, the combination of Haneda and Dugas discloses the limitations of Claim 1, as discussed supra. The combination of Haneda and Dugas does not appear to explicitly recite the limitation: wherein the processor is operable to execute the diagnostic algorithm to estimate a remaining life of the blade based on the blade sharpness index.
However, Wente explicitly recites the limitation: wherein the processor is operable to execute the diagnostic algorithm to estimate a remaining life of the blade based on the blade sharpness index {“The blade wear detection MLM 350 may output a blade wear signal based on each input image. The blade wear signal may provide an indication of a blade wear level.”, Col. 10, Lns. 41-44, and “the harvester 710 may transmit an alert signal (see, e.g., operation 408 depicted in FIG. 5) to the server 720 in response to determining a wear level signal exceeds a threshold.”, Col. 12, Lns. 21-24; one skilled in the art will appreciate that the remaining blade life is the difference between the determined blade wear level and the threshold wear level}.
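The rationale that remaining blade life follows from the gap between the determined wear level and the alert threshold can be sketched as follows (the function name, signature, and normalized wear scale are hypothetical; the reference discloses a wear-level signal and a threshold, not this exact computation):

```python
def estimated_remaining_life(wear_level: float, wear_threshold: float) -> float:
    """Remaining blade life as the margin between the alert threshold and the
    currently determined wear level, on a hypothetical normalized wear scale
    where the threshold marks end of useful life; clamped at zero once worn."""
    return max(wear_threshold - wear_level, 0.0)

# Example: a determined wear level of 0.4 against a threshold of 1.0
# leaves a remaining-life margin of 0.6
```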
Haneda, Dugas and Wente are analogous art because each deals with the quality of cuts produced by a crop or grass cutting machine.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Haneda, Dugas and Wente before them, to modify the teachings of Haneda and Dugas to include the teachings of Wente to maintain a high level of cutting quality, at all times, while avoiding the need for manual checking of blade wear {“the improved devices and methods provide monitoring of the blades while avoiding stoppages of the harvester, thereby increasing the processing speed of the harvester 10 while improving the quality of the resulting billets B”, Col. 6, Lns. 63-67}.
Regarding Claim 4, the combination of Haneda, Dugas and Wente discloses the limitations of Claim 3, as discussed supra. The combination of Haneda and Dugas does not appear to explicitly recite the limitation: wherein the processor is operable to execute the diagnostic algorithm to communicate a life expectancy signal to the communicator, wherein the life expectancy signal controls the communicator to generate a communication indicating the remaining life of the blade.
However, Wente explicitly recites the limitation: wherein the processor is operable to execute the diagnostic algorithm to communicate a life expectancy signal to the communicator {“The second signal may be an alert indicating that the blade of the chopper 28 has become worn and should be sharpened…the second signal indicates the wear level of the blade…the alert may be output when the signal is greater than or equal to the threshold.”, Col. 8, Lns. 4-16}, wherein the life expectancy signal controls the communicator to generate a communication indicating the remaining life of the blade {“The server 720 may provide the alert signal to an operator via the terminal 730, the operator interface 66, etc. For example, the server 720 may transmit the alert signal to the operator via a second communication link.”, Col. 13, Lns. 8-12}.
Regarding Claim 12, the combination of Haneda and Dugas discloses the limitations of Claim 10, as discussed supra. The combination of Haneda and Dugas does not appear to explicitly recite the limitation: wherein the processor is operable to execute the diagnostic algorithm to estimate a remaining life of the blade based on the blade sharpness index.
However, Wente explicitly recites the limitation: wherein the processor is operable to execute the diagnostic algorithm to estimate a remaining life of the blade based on the blade sharpness index {“The blade wear detection MLM 350 may output a blade wear signal based on each input image. The blade wear signal may provide an indication of a blade wear level.”, Col. 10, Lns. 41-44, and “the harvester 710 may transmit an alert signal (see, e.g., operation 408 depicted in FIG. 5) to the server 720 in response to determining a wear level signal exceeds a threshold.”, Col. 12, Lns. 21-24; one skilled in the art will appreciate that the remaining blade life is the difference between the determined blade wear level and the threshold wear level}.
Regarding Claim 13, the combination of Haneda, Dugas and Wente discloses the limitations of Claim 12, as discussed supra. The combination of Haneda and Dugas does not appear to explicitly recite the limitation: wherein the processor is operable to execute the diagnostic algorithm to communicate a life expectancy signal to the communicator, wherein the life expectancy signal controls the communicator to generate a communication indicating the remaining life of the blade.
However, Wente explicitly recites the limitation: wherein the processor is operable to execute the diagnostic algorithm to communicate a life expectancy signal to the communicator {“The second signal may be an alert indicating that the blade of the chopper 28 has become worn and should be sharpened…the second signal indicates the wear level of the blade…the alert may be output when the signal is greater than or equal to the threshold.”, Col. 8, Lns. 4-16}, wherein the life expectancy signal controls the communicator to generate a communication indicating the remaining life of the blade {“The server 720 may provide the alert signal to an operator via the terminal 730, the operator interface 66, etc. For example, the server 720 may transmit the alert signal to the operator via a second communication link.”, Col. 13, Lns. 8-12}.
Regarding Claim 20, the combination of Haneda and Dugas discloses the limitations of Claim 18, as discussed supra. The combination of Haneda and Dugas does not appear to explicitly disclose the limitations: estimating a remaining life of the blade based on the blade sharpness index and communicating a life expectancy signal to the communicator, wherein the life expectancy signal controls the communicator to generate a communication indicating the remaining life of the blade.
However,