Prosecution Insights
Last updated: April 19, 2026
Application No. 17/803,842

Classification and sawing of wood shingles using machine vision

Final Rejection (§103)
Filed: Dec 22, 2022
Examiner: SATCHER, DION JOHN
Art Unit: 2676
Tech Center: 2600 — Communications
Assignee: Clair Industrial Development Corporation Ltd.
OA Round: 2 (Final)

Predictions
Grant Probability: 85% (Favorable)
Expected OA Rounds: 3–4
Time to Grant: 3y 0m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 85% (33 granted / 39 resolved; +22.6% vs TC avg; above average)
Interview Lift: +14.2% among resolved cases with interview (moderate)
Avg Prosecution: 3y 0m (typical timeline; 29 applications currently pending)
Total Applications: 68 across all art units

Statute-Specific Performance

§101: 14.2% (-25.8% vs TC avg)
§103: 61.9% (+21.9% vs TC avg)
§102: 15.1% (-24.9% vs TC avg)
§112: 8.3% (-31.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 39 resolved cases.
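The headline figures above reduce from the raw counts shown. A minimal sketch of that arithmetic (the Tech Center baselines are implied by the displayed deltas, not published separately, so `implied_tc_avg` is an inference, not a reported figure):

```python
# Career allow rate from the raw resolved-case counts shown above.
granted, resolved = 33, 39
allow_rate = 100 * granted / resolved
print(round(allow_rate, 1))  # 84.6, displayed as 85%

# The delta vs. the Tech Center average implies the baseline rate.
delta_vs_tc = 22.6
implied_tc_avg = allow_rate - delta_vs_tc
print(round(implied_tc_avg, 1))  # 62.0

# Interview lift: base grant probability plus the displayed lift.
base_probability, interview_lift = 85.0, 14.2
print(round(base_probability + interview_lift))  # 99, the "With Interview" figure
```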

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

Applicant's amendments filed on 10/27/2025 have been entered and made of record.

Currently pending claims: 1–9 and 18–21
Independent claims: 1, 18 and 21
Amended claims: 1–3, 8, 9, 18, 19 and 21

Response to Applicant's Arguments

This Office action is responsive to Applicant's Arguments/Remarks Made in an Amendment received on 10/27/2025. In view of Applicant's arguments/remarks and the amendment filed on 10/27/2025 with respect to independent claims 1, 18 and 21 under 35 U.S.C. § 101, the claim rejection has been fully considered and the arguments are found to be persuasive (see pages 8 and 9); the rejection under 35 U.S.C. § 101 is therefore withdrawn.

Applicant's Reply (October 27, 2025) includes substantive amendments to the claims. The Office action has been updated with new grounds of rejection addressing those amendments. Further, Applicant's arguments/remarks with respect to independent claims 1 and 18 have been considered but are moot because the arguments do not apply to the references being used in the current rejection; the amended limitations are now rejected over newly cited art, Chattopadhyay et al. (US 20190099886 A1), as explained in the body of the rejection. The Examiner is interpreting the claimed abstraction as ignoring the defect.

After reviewing the arguments and remarks for claim 21, performing an updated search, and considering the prior cited art, the Examiner indicates claim 21 as allowable.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.

Claims 1 and 2 are rejected under 35 U.S.C. 103 as being unpatentable over Longfellow (US 8113098 B1, hereafter "Longfellow") in view of Chattopadhyay et al. (US 20190099886 A1, hereafter "Chattopadhyay").

Regarding claim 1, Longfellow teaches a method of wood shingle classification and sawing by machine vision (See Longfellow, [Abstract], Saw mill for machine vision detection of undesirable features in the wood in shingles being cut from billets and automated optimized saw operation are disclosed) comprising the steps of:

taking an image of a wood slab prior to sawing a shingle from the wood slab (See Longfellow, [Col. 5, ln. 14–19], FIGS. 8 and 9 illustrate elements of the visual imaging system 500, which comprises a camera 510, a CPU 520, vision software 530 that includes "blob tools" for identifying wood quality and determining the side cuts and software analytical tools and algorithms for combining and analyzing data acquired from the blob tools. [Col. 4, ln. 25–30], FIGS. 4-7 show various views of the transition station 400, along with the visual imaging system 500. The transition station 400 picks up the billet 10B after it has been squared, carries the billet 10B past the visual imaging system 500, and then aligns and pushes the billet 10B toward the gang rip saw station 600. Note: the imaging is performed before the sawing);

identifying a defect in said image (See Longfellow, [Col. 6, ln. 24–25], The visual imaging system 500 uses the software-based conventional "blob tools" to detect a defect in the billet 10B);

[comparing said image of said identified defect to images of confirmed defects in a database of images of confirmed defects to find a match of said identified defect in said images of confirmed defects; when the match is not found; considering said identified defect as a false defect, and making abstraction of said identified defect; and while making abstraction of said identified defect]:

classifying said shingle; and sawing said shingle from said wood slab (See Longfellow, [Col. 7, ln. 64 – Col. 8, ln. 3], After these steps, the data from this high-contrast map and the post processing are incorporated into the defect map. The defect map now contains data relating to all types of detected defects or wood characteristics that influence the quality grading. The visual imaging system 500 determines the grade and optimal cut of the billet 10B, based on this defect map).
However, Longfellow fails to teach: comparing said image of said identified defect to images of confirmed defects in a database of images of confirmed defects to find a match of said identified defect in said images of confirmed defects; when the match is not found; considering said identified defect as a false defect, and making abstraction of said identified defect; and while making abstraction of said identified defect.

Chattopadhyay, working in the same field of endeavor, teaches:

comparing said image of said identified defect to images of confirmed defects in a database of images of confirmed defects to find a match of said identified defect in said images of confirmed defects (See Chattopadhyay, ¶ [0072], For example, certain combinations of the features may generate an outlier (if one of the features does not satisfy its threshold) but may not be indicative of a faulty robot. In some examples, the fault identifier 302 compares the features to a table or listing stored in the database 304. The table represents patterns of features and/or outlier feature(s) that have historically appeared when corresponding defects (e.g., a worn belt) are present);

when the match is not found (See Chattopadhyay, ¶ [0072], If the fault identifier 302 determines the outlier feature is a false alarm (e.g., the pattern of features does not match to an entry in the table));

considering said identified defect as a false defect, and making abstraction of said identified defect; and while making abstraction of said identified defect (See Chattopadhyay, ¶ [0072], If the fault identifier 302 determines the outlier feature is a false alarm (e.g., the pattern of features does not match to an entry in the table), the fault identifier 302 generates an alert, at block 808, to inform the technician 120 or other user. For example, the fault identifier may transmit (e.g., over the network 116) a message to the technician 118 that the previous alert from the robot health monitor 110 is a false alarm. As such, the technician 118 may ignore the previous alert or notification. Note: the Examiner is interpreting the abstraction as ignoring the defect when it is a false alarm, which is what Chattopadhyay does in treating it as a false alarm).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow's method to include comparing said image of said identified defect to images of confirmed defects in a database of images of confirmed defects to find a match of said identified defect in said images of confirmed defects; when the match is not found, considering said identified defect as a false defect, and making abstraction of said identified defect; and while making abstraction of said identified defect, based on the method of Chattopadhyay. The suggestion/motivation would have been to suppress and prevent the generation of false alarms and to identify valid defects in the object, so as to more accurately identify defects and process the object accurately and timely based on the defect detection (See Chattopadhyay, ¶ [0021–0022]).
Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Chattopadhyay with Longfellow to obtain the invention as specified in claim 1.

Regarding claim 2, Longfellow in view of Chattopadhyay teaches the method as claimed in claim 1, also comprising the step of [when the match is found, considering said identified defect as a real defect and adding said image of said identified defect to said database].

However, Longfellow fails to teach: when the match is found, considering said identified defect as a real defect and adding said image of said identified defect to said database.

Chattopadhyay, working in the same field of endeavor, teaches: when the match is found, considering said identified defect as a real defect and adding said image of said identified defect to said database (See Chattopadhyay, ¶ [0052], In some examples, the database 304 includes a table of possible faults correlated to certain feature combinations. The fault identifier 302 may use the table to identify the type(s) of fault(s), …, In some examples, after a repair is made on a robot, the actual cause of the fault (e.g., a loose belt) may be uploaded to the fault classifier 122, which can then use the information in future predictions. Note: after the fault is identified (matched) and the repair is made, the fault may be uploaded to the classifier, which the Examiner is interpreting as uploading to the database).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow's method such that, when the match is found, said identified defect is considered a real defect and said image of said identified defect is added to said database, based on the method of Chattopadhyay.
The suggestion/motivation would have been to suppress and prevent the generation of false alarms and to identify valid defects in the object, so as to more accurately identify defects and process the object accurately and timely based on the defect detection (See Chattopadhyay, ¶ [0021–0022]). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Chattopadhyay with Longfellow to obtain the invention as specified in claim 2.

Claims 18–20 are rejected under 35 U.S.C. 103 as being unpatentable over Longfellow (US 8113098 B1, hereafter "Longfellow") in view of Chattopadhyay et al. (US 20190099886 A1, hereafter "Chattopadhyay"), further in view of Bolton et al. (US 10,825,164 B1, hereafter "Bolton").

Regarding claim 18, Longfellow teaches a method of wood shingle classification and sawing using machine vision (See Longfellow, [Abstract], Saw mill for machine vision detection of undesirable features in the wood in shingles being cut from billets and automated optimized saw operation are disclosed) comprising the steps of:

taking an image of a wood slab, the image including an identified defect in the wood slab (See Longfellow, [Col. 5, ln. 14–19], FIGS. 8 and 9 illustrate elements of the visual imaging system 500, which comprises a camera 510, a CPU 520, vision software 530 that includes "blob tools" for identifying wood quality and determining the side cuts and software analytical tools and algorithms for combining and analyzing data acquired from the blob tools);

[comparing, using artificial intelligence, said image of said identified defect to images of confirmed wood defects in a database of images of confirmed wood defects to find a match of said image of said identified defect in said images; when the match is found, adding said identified defect to said database of confirmed wood defects];

sawing a shingle from said wood slab and classifying said shingle according to position and nature of said identified defect (See Longfellow, [Col. 7, ln. 29–32], Using this information, together with an optimization algorithm, the visual imaging system 500 then determines the best possible saw cuts 20 and 21 on the billet 10B to optimize the value of the finished shingle product 10C. [Col. 8, ln. 1–3], The visual imaging system 500 determines the grade and optimal cut of the billet 10B, based on this defect map); and

[training said artificial intelligence on images of wood defects that are associable to a subjectivity of experienced shingle sawyers; when the match is not found: considering said identified defect as a false defect]; and sawing said shingle from said wood slab [while ignoring said identified defect].

However, Longfellow fails to teach: comparing, using artificial intelligence, said image of said identified defect to images of confirmed wood defects in a database of images of confirmed wood defects to find a match of said image of said identified defect in said images; when the match is found, adding said identified defect to said database of confirmed wood defects; training said artificial intelligence on images of wood defects that are associable to a subjectivity of experienced shingle sawyers; when the match is not found: considering said identified defect as a false defect; while ignoring said identified defect.

Bolton, working in the same field of endeavor, teaches:

comparing, using artificial intelligence (See Bolton, [Col. 10, ln. 66 – Col. 11, ln. 3], The controller 308 can be programmed with certain parameters to be used by the various image processing tools to detect various defects. In some examples, a machine learning algorithm can be used to help determine these parameters);

training said artificial intelligence on images of wood defects that are associable to a subjectivity of experienced shingle sawyers (See Bolton, [Col. 9, ln. 48–50], In some examples, the controller 308 utilizes a learning algorithm software to "learn" to grade veneer sheets over time. [Col. 9, ln. 54–60], The software can then determine what features of the images differentiate the different grades of sheets.
Then, when an image or images of a new veneer sheet is analyzed by the software, the features of this new sheet can be compared to the learned features to determine a grade of the new veneer sheet).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow's method to include comparing, using artificial intelligence, and training said artificial intelligence on images of wood defects that are associable to a subjectivity of experienced shingle sawyers, based on the method of Bolton. The suggestion/motivation would have been to increase the speed and accuracy of wood grading (See Bolton, [Col. 1, ln. 13–50 and Col. 2, ln. 23–26]).

However, Longfellow and Bolton fail to teach: comparing said image of said identified defect to images of confirmed wood defects in a database of images of confirmed wood defects to find a match of said image of said identified defect in said images; when the match is found, adding said identified defect to said database of confirmed wood defects; when the match is not found: considering said identified defect as a false defect; while ignoring said identified defect.

Chattopadhyay, working in the same field of endeavor, teaches:

comparing said image of said identified defect to images of confirmed wood defects in a database of images of confirmed wood defects to find a match of said image of said identified defect in said images (See Chattopadhyay, ¶ [0052], In some examples, the database 304 includes a table of possible faults correlated to certain feature combinations. The fault identifier 302 may use the table to identify the type(s) of fault(s));

when the match is found, adding said identified defect to said database of confirmed wood defects (See Chattopadhyay, ¶ [0052], In some examples, the database 304 includes a table of possible faults correlated to certain feature combinations. The fault identifier 302 may use the table to identify the type(s) of fault(s), …, In some examples, after a repair is made on a robot, the actual cause of the fault (e.g., a loose belt) may be uploaded to the fault classifier 122, which can then use the information in future predictions. Note: after the fault is identified (matched) and the repair is made, the fault may be uploaded to the classifier, which the Examiner is interpreting as uploading to the database);

when the match is not found: considering said identified defect as a false defect (See Chattopadhyay, ¶ [0072], If the fault identifier 302 determines the outlier feature is a false alarm (e.g., the pattern of features does not match to an entry in the table));

while ignoring said identified defect (See Chattopadhyay, ¶ [0072], If the fault identifier 302 determines the outlier feature is a false alarm (e.g., the pattern of features does not match to an entry in the table), the fault identifier 302 generates an alert, at block 808, to inform the technician 120 or other user. For example, the fault identifier may transmit (e.g., over the network 116) a message to the technician 118 that the previous alert from the robot health monitor 110 is a false alarm. As such, the technician 118 may ignore the previous alert or notification. Note: the Examiner is interpreting the abstraction as ignoring the defect when it is a false alarm, which is what Chattopadhyay does in treating it as a false alarm).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow's method to include comparing said image of said identified defect to images of confirmed wood defects in a database of images of confirmed wood defects to find a match of said image of said identified defect in said images; when the match is found, adding said identified defect to said database of confirmed wood defects; when the match is not found, considering said identified defect as a false defect; and while ignoring said identified defect, based on the method of Chattopadhyay. The suggestion/motivation would have been to suppress and prevent the generation of false alarms and to identify valid defects in the object, so as to more accurately identify defects and process the object accurately and timely based on the defect detection (See Chattopadhyay, ¶ [0021–0022]). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Bolton and Chattopadhyay with Longfellow to obtain the invention as specified in claim 18.
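The claimed compare/abstract flow that the rejection maps onto Chattopadhyay can be paraphrased as a short control flow. The sketch below is only an editorial illustration of the claim language as the Examiner interprets it; all names are hypothetical and no cited reference discloses this exact code:

```python
# Hypothetical sketch of the claimed flow: compare an identified defect
# against a database of confirmed defects; an unmatched defect is treated
# as a false defect and ignored ("abstraction") while classifying/sawing.

def matches(defect_image: bytes, confirmed_image: bytes) -> bool:
    # Stand-in for the claimed image comparison (e.g., Wen's eigenvector
    # similarity); exact byte equality is used here only for illustration.
    return defect_image == confirmed_image

def process_defect(defect_image: bytes, confirmed_db: list) -> bool:
    """Return True if the defect is real (and, per claim 2, add its image
    to the database); return False to ignore it as a false defect."""
    if any(matches(defect_image, img) for img in confirmed_db):
        confirmed_db.append(defect_image)  # claim 2: grow the database
        return True
    return False  # claim 1: false defect -> make abstraction of it

db = [b"knot", b"split"]
assert process_defect(b"knot", db) is True     # match found: real defect
assert db.count(b"knot") == 2                  # its image was added
assert process_defect(b"shadow", db) is False  # no match: ignored
```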
Regarding claim 19, Longfellow in view of Chattopadhyay and further in view of Bolton teaches the method as claimed in claim 18, [wherein said step of training includes managing said database using a human-subjectivity port having access said images in said database for correcting a tagging of one of said images through said human-subjectivity port].

However, Longfellow and Chattopadhyay fail to teach this limitation.

Bolton, working in the same field of endeavor, teaches: wherein said step of training includes managing said database using a human-subjectivity port having access said images in said database for correcting a tagging of one of said images through said human-subjectivity port (See Bolton, [Col. 11, ln. 1–7], In some examples, a machine learning algorithm can be used to help determine these parameters. In these examples, a first set of images that have a certain grade (e.g., G1 as determined by manual grading) can be input to the controller 308. Then, a second set of images having a different grade (e.g., G2 as determined by manual grading) can be input to the controller 308).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow's method to include this limitation based on the method of Bolton. The suggestion/motivation would have been to increase the speed and accuracy of wood grading (See Bolton, [Col. 1, ln. 13–50 and Col. 2, ln. 23–26]). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Bolton with Longfellow and Chattopadhyay to obtain the invention as specified in claim 19.

Regarding claim 20, Longfellow in view of Chattopadhyay and further in view of Bolton teaches the method as claimed in claim 18, [wherein said step of classifying is done using a skip-a-scan approach].

However, Longfellow and Chattopadhyay fail to teach this limitation.

Bolton, working in the same field of endeavor, teaches: wherein said step of classifying is done using a skip-a-scan approach (See Bolton, [Col. 7, ln. 8–13], By using both types of cameras to grade veneer sheets, the system 300 can take advantage of the defect detection strengths of each camera type. That is, the system 300 can use the black and white camera to detect certain defects of veneer sheets and the color camera to detect others. [Col. 7, ln. 16–18], In some embodiments, infrared cameras can be used in addition to black and white and color cameras 304, 306 to capture heat signatures from veneer. Note: an additional infrared scan can be used, but using only the black and white camera and the color camera implies skipping the infrared camera).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow's method to include this limitation based on the method of Bolton. The suggestion/motivation would have been to increase the speed and accuracy of wood grading (See Bolton, [Col. 1, ln. 13–50 and Col. 2, ln. 23–26]).
Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Bolton with Longfellow and Chattopadhyay to obtain the invention as specified in claim 20.

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Longfellow in view of Chattopadhyay and further in view of Wen et al. (US 20200294222 A1, hereafter "Wen").

Regarding claim 3, Longfellow in view of Chattopadhyay teaches the method as claimed in claim 2, wherein said step of comparing comprises the steps of: [comparing pixels of said image of said identified defect to pixels of images in said database and finding an array of matching pixels on said image of said identified defect and on at least one of said images, wherein said array contains a percentage of pixels in said image].

However, Longfellow and Chattopadhyay fail to teach this limitation.

Wen, working in the same field of endeavor, teaches: comparing pixels of said image of said identified defect to pixels of images in said database and finding an array of matching pixels on said image of said identified defect and on at least one of said images, wherein said array contains a percentage of pixels in said image (See Wen, ¶ [0048], Thus, the defect classification model may first extract the feature of the image of the to-be-inspected object using the feature extraction portion, thereby generating a target eigenvector. Then, the target eigenvector is compared with a plurality of eigenvectors in the corresponding relationship table successively, and if an eigenvector in the corresponding relationship table is identical or similar to the target eigenvector, then defect information corresponding to the eigenvector in the corresponding relationship table is used as defect information indicated by the image of the object. Note: the Examiner is interpreting the eigenvector as the claimed array).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow's method to include this limitation based on the method of Wen. The suggestion/motivation would have been to improve the surface quality inspection of objects (See Wen, ¶ [0003, 0021]). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Wen with Longfellow and Chattopadhyay to obtain the invention as specified in claim 3.

Claims 4–6 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Longfellow in view of Chattopadhyay, further in view of Wen, and further in view of Bolton.
Regarding claim 4, Longfellow in view of Chattopadhyay further in view of Wen teaches the method as claimed in claim 3, [wherein said step of identifying comprises the step of considering black and white defects, and using a clear-below-the-clear-line approach].

However, Longfellow, Chattopadhyay and Wen fail to teach this limitation.

Bolton, working in the same field of endeavor, teaches: wherein said step of identifying comprises the step of considering black and white defects, and using a clear-below-the-clear-line approach (See Bolton, [Col. 7, ln. 3–6], In particular, black and white images can be used for, and are typically preferable for, measuring the dimensions of a veneer sheet and identifying void areas within the sheet. [Col. 14, ln. 55–56], In block 2022, the imaging system detects knots on the veneer sheet as described above. Note: black and white images are used for identifying void-area defects, and the knot detection is being interpreted as the clear-below-the-clear-line approach since the approach is not defined in the claims).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow's method to include this limitation based on the method of Bolton. The suggestion/motivation would have been to increase the speed and accuracy of wood grading (See Bolton, [Col. 1, ln. 13–50 and Col. 2, ln. 23–26]). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Bolton with Longfellow, Chattopadhyay and Wen to obtain the invention as specified in claim 4.

Regarding claim 5, Longfellow in view of Chattopadhyay further in view of Wen teaches the method as claimed in claim 3, [wherein said step of identifying is affected using an optimization-by-inversion approach].

However, Longfellow, Chattopadhyay and Wen fail to teach this limitation.

Bolton, working in the same field of endeavor, teaches: wherein said step of identifying is affected using an optimization-by-inversion approach (See Bolton, [Col. 3, ln. 30–36], In some embodiments, the method can comprise translating the black and white image such that it has the same horizontal spacing as a reference image before performing the computer processing of the black and white image, and translating the color image such that it has the same horizontal spacing as the reference image before performing the computer processing of the color image. Note: the translation is being interpreted as inversion).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow's method to include this limitation based on the method of Bolton. The suggestion/motivation would have been to increase the speed and accuracy of wood grading (See Bolton, [Col. 1, ln. 13–50 and Col. 2, ln. 23–26]). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Bolton with Longfellow, Chattopadhyay and Wen to obtain the invention as specified in claim 5.
Regarding claim 6, Longfellow in view of Chattopadhyay further in view of Wen teaches the method as claimed in claim 5, [wherein said step of classifying is done using classifications from a group of classifications containing a Clear or Better classification and a Utility classification]. However, Longfellow, Chattopadhyay and Wen fail to teach wherein said step of classifying is done using classifications from a group of classifications containing a Clear or Better classification and a Utility classification.

Bolton, working in the same field of endeavor, teaches: wherein said step of classifying is done using classifications from a group of classifications containing a Clear or Better classification and a Utility classification (See Bolton, [Col. 15, ln. 21–29], That is, if a veneer sheet fails a first test or set of tests (i.e., the detected parameters are outside of allowable levels), then the veneer sheet can be sent to a bin for the lowest quality veneer (i.e., scrap). If the first set of tests is passed but a subsequent test or set of tests is failed, then the veneer sheet can be sent to a bin for a slightly higher quality of veneer. This can continue any number of times. If a veneer sheet passes every test, then it can be assigned the highest quality grade. Note: Examiner is interpreting the lower quality grade as the Utility classification and the higher grades as the Clear or Better classification).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow’s reference wherein said step of classifying is done using classifications from a group of classifications containing a Clear or Better classification and a Utility classification based on the method of Bolton’s reference. The suggestion/motivation would have been to increase the speed and accuracy of wood grading (See Bolton, [Col. 1, ln. 13–50 and Col. 2, ln. 23–26]).
Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Bolton with Longfellow, Chattopadhyay and Wen to obtain the invention as specified in claim 6.

Regarding claim 9, Longfellow in view of Chattopadhyay further in view of Wen and further in view of Bolton teaches the method as claimed in claim 6, [wherein said Utility classification contains Grade C, "Second Clear" shingles and Grade D shingles; and the method further comprises the step of sorting said Grade C, "Second Clear" shingles from said Utility Classification by default]. However, Longfellow, Chattopadhyay and Wen fail to teach wherein said Utility classification contains Grade C, "Second Clear" shingles and Grade D shingles; and the method further comprises the step of sorting said Grade C, "Second Clear" shingles from said Utility Classification by default.

Bolton, working in the same field of endeavor, teaches: wherein said Utility classification contains Grade C, "Second Clear" shingles and Grade D shingles; and the method further comprises the step of sorting said Grade C, "Second Clear" shingles from said Utility Classification by default (See Bolton, [Col. 15, ln. 21–29], That is, if a veneer sheet fails a first test or set of tests (i.e., the detected parameters are outside of allowable levels), then the veneer sheet can be sent to a bin for the lowest quality veneer (i.e., scrap). If the first set of tests is passed but a subsequent test or set of tests is failed, then the veneer sheet can be sent to a bin for a slightly higher quality of veneer. This can continue any number of times. If a veneer sheet passes every test, then it can be assigned the highest quality grade. Note: The lower grades are based on passing zero or few tests.
Examiner is interpreting passing two tests as Grade C and passing one test as Grade D). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow’s reference wherein said Utility classification contains Grade C, "Second Clear" shingles and Grade D shingles; and the method further comprises the step of sorting said Grade C, "Second Clear" shingles from said Utility Classification by default based on the method of Bolton’s reference. The suggestion/motivation would have been to increase the speed and accuracy of wood grading (See Bolton, [Col. 1, ln. 13–50 and Col. 2, ln. 23–26]). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Bolton with Longfellow, Chattopadhyay and Wen to obtain the invention as specified in claim 9.

Claim(s) 7 is rejected under 35 U.S.C. 103 as being unpatentable over Longfellow (US 8113098 B1, hereafter, “Longfellow”) in view of Chattopadhyay et al. (US 20190099886 A1, hereafter, “Chattopadhyay”) further in view of Bolton et al. (US 10,825,164 B1, hereafter, “Bolton”).

Regarding claim 7, Longfellow in view of Chattopadhyay teaches the method as claimed in claim 1, [further comprising the step of using artificial intelligence networks in said step of comparing said image of said identified defect to images of confirmed wood defects in a database of confirmed wood defects]. However, Longfellow fails to teach further comprising the step of using artificial intelligence networks.

Bolton, working in the same field of endeavor, teaches: further comprising the step of using artificial intelligence networks (See Bolton, [Col. 10, ln. 66–67 – Col. 11, ln. 1–3], The controller 308 can be programmed with certain parameters to be used by the various image processing tools to detect various defects. In some examples, a machine learning algorithm can be used to help determine these parameters).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow’s reference further comprising the step of using artificial intelligence networks in said step of comparing said image of said defect to images of confirmed wood defects in a database of confirmed wood defects based on the method of Bolton’s reference. The suggestion/motivation would have been to increase the speed and accuracy of wood grading (See Bolton, [Col. 1, ln. 13–50 and Col. 2, ln. 23–26]).

However, Longfellow and Bolton fail to teach comparing said image of said identified defect to images of confirmed wood defects in a database of confirmed wood defects. Chattopadhyay, working in the same field of endeavor, teaches: comparing said image of said identified defect to images of confirmed wood defects in a database of confirmed wood defects (See Chattopadhyay, ¶ [0072], For example, certain combinations of the features may generate an outlier (if one of the features does not satisfy its threshold) but may not be indicative of a faulty robot. In some examples, the fault identifier 302 compares the features to a table or listing stored in the database 304. The table represents patterns of features and/or outlier feature(s) that have historically appeared when corresponding defects (e.g., a worn belt) are present). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow’s reference comparing said image of said identified defect to images of confirmed wood defects in a database of confirmed wood defects based on the method of Chattopadhyay’s reference.
The suggestion/motivation would have been to suppress and prevent the generation of false alarms and to identify valid defects in the object, so as to more accurately identify defects and process the object accurately and in a timely manner based on the defect detection (See Chattopadhyay, ¶ [0021–0022]). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Bolton and Chattopadhyay with Longfellow to obtain the invention as specified in claim 7.

Claim(s) 8 is rejected under 35 U.S.C. 103 as being unpatentable over Longfellow (US 8113098 B1, hereafter, “Longfellow”) in view of Chattopadhyay et al. (US 20190099886 A1, hereafter, “Chattopadhyay”) further in view of Roy et al. (US 11,364,589 B2, hereafter, "Roy").

Regarding claim 8, Longfellow in view of Chattopadhyay teaches the method as claimed in claim 1, [further comprising the step of ignoring machine defects, wood block imperfections and sapwood from said slab, during said step of identifying]. However, Longfellow and Chattopadhyay fail to teach further comprising the step of ignoring machine defects, wood block imperfections and sapwood from said slab, during said step of identifying.

Roy, working in the same field of endeavor, teaches: further comprising the step of ignoring machine defects, wood block imperfections and sapwood from said slab, during said step of identifying (See Roy, [Col. 5, ln. 4–10], For instance, some defects may be associated to a class categorizing them as requiring removal by cutting, whereas other defects may be associated to another class categorizing them as repairable by application of wood filler, and therefore to be ignored, or otherwise addressed as such, from the point of view of the cutting operation).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Longfellow’s reference to further comprise the step of ignoring machine defects, wood block imperfections and sapwood from said slab, during said step of identifying based on the method of Roy’s reference. The suggestion/motivation would have been to decrease the time required for processing the wood (See Roy, [Col. 2, ln. 36–63]). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Roy with Longfellow and Chattopadhyay to obtain the invention as specified in claim 8.

Allowable Subject Matter

The following is an examiner’s statement of reasons for allowance: the present invention is directed to classification and sawing of wood shingles using machine vision. Claim 21 is allowed. Claim 21 is an independent claim, and it is allowed because applicant’s Arguments/Remarks filed on 10/27/2025 are persuasive on pages 10 and 11. Further, Applicant’s Reply includes substantive claim amendments that have differentiated the claimed invention from the cited prior art. Upon completing an updated prior art search and considering the combination of limitations presented as a whole for the claim, the features highlighted below are considered an improvement over the prior art and have not been found to be anticipated or rendered obvious by a combination of prior art.
Independent claim 21 recites, inter alia, the uniquely distinct features shown in the excerpt below:

[21] “A method of wood shingle classification by machine vision and shingle packaging the method comprising the steps of: taking an image of a wood shingle and determining a clear line on said wood shingle; identifying a single defect in said image; when said single defect is above said clear line; classifying said wood shingle as a Clear-or-Better shingle grade combining Grade A and Grade B together; and packaging said wood shingle: when said single defect is below said clear line; classifying said wood shingle as a Utility shingle grade combining Grade C and Grade D together; and packaging said wood shingle.”, as recited by independent claim 21, in combination with the other elements/steps of the claim.

These features, considered in combination with the remainder of the claim’s limitations, are not fairly disclosed, taught, or suggested by the cited prior art. Specifically, the closest prior art (previously cited), Longfellow (US 8113098 B1), Chattopadhyay et al. (US 20190099886 A1), Wen et al. (US 20200294222 A1), Bolton et al. (US 10,825,164 B1), and Roy et al. (US 11,364,589 B2), fails to either anticipate or render obvious the above-quoted limitations. Accordingly, claim 21 is allowable over the prior art of record.

Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled “Comments on Statement of Reasons for Allowance.”

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
DeLean (US 7369685 B2) teaches that, once a match has been determined, the system can initiate actions at a step 1820, which may include allowing access to a facility, and which may optionally include storing the newly captured image in the reference database for future matching purposes. If there is no match, then the system can repeat the above steps.

Auerbach (US 7844100 B2) teaches a method for inspecting a sample, consisting of receiving a definition of image attributes that are characteristic of defects, and processing an image of the sample so as to identify candidate defects on the sample. The method further includes forming distributions of values of the respective attributes from the candidate defects, and selecting a set of the candidate defects that are characterized by respective candidate attribute values that fall in one or more tails of the distributions. The selected set is presented to a human operator, and respective classifications of the candidate defects in the selected set are received from the operator. A definition of the one or more tails of the distributions is refined responsively to the classifications. The method may be used as a filter to remove false alarms, or nuisances. The method may also be used to categorize the candidate defects into two or more classes.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to DION J SATCHER whose telephone number is (703)756-5849. The examiner can normally be reached Monday - Thursday 5:30 am - 2:30 pm, Friday 5:30 am - 9:30 am PST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Henok Shiferaw can be reached at (571) 272-4637. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /DION J SATCHER/ Patent Examiner, Art Unit 2676 /Henok Shiferaw/ Supervisory Patent Examiner, Art Unit 2676

Prosecution Timeline

Dec 22, 2022: Application Filed
Aug 01, 2025: Non-Final Rejection — §103
Oct 24, 2025: Interview Requested
Oct 27, 2025: Response Filed
Nov 04, 2025: Examiner Interview Summary
Nov 04, 2025: Applicant Interview (Telephonic)
Jan 29, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology (5 most recent grants):

Patent 12586218: MOTION ESTIMATION WITH ANATOMICAL INTEGRITY (2y 5m to grant; granted Mar 24, 2026)
Patent 12579787: INSTRUMENT RECOGNITION METHOD BASED ON IMPROVED U2 NETWORK (2y 5m to grant; granted Mar 17, 2026)
Patent 12573066: Depth Estimation Using a Single Near-Infrared Camera and Dot Illuminator (2y 5m to grant; granted Mar 10, 2026)
Patent 12555263: SYSTEMS AND METHODS FOR TWO-STAGE OBJECTION DETECTION (2y 5m to grant; granted Feb 17, 2026)
Patent 12548140: DETERMINING PROCESS DEVIATIONS THROUGH VIDEO ANALYSIS (2y 5m to grant; granted Feb 10, 2026)

Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 85%
With Interview: 99% (+14.2%)
Median Time to Grant: 3y 0m
PTA Risk: Moderate

Based on 39 resolved cases by this examiner. Grant probability derived from career allow rate.
