Prosecution Insights
Last updated: April 19, 2026
Application No. 18/596,776

PRINT SYSTEM, INSPECTION APPARATUS, METHOD OF CONTROLLING INSPECTION APPARATUS, AND STORAGE MEDIUM

Current Status: Non-Final OA (§103)
Filed: Mar 06, 2024
Examiner: WINDSOR, COURTNEY J
Art Unit: 2661
Tech Center: 2600 (Communications)
Assignee: Canon Kabushiki Kaisha
OA Round: 1 (Non-Final)
Grant Probability: 86% (Favorable)
Expected OA Rounds: 1-2
Expected Time to Grant: 2y 7m
Grant Probability with Interview: 96%

Examiner Intelligence

Career Allow Rate: 86% (217 granted / 252 resolved; +24.1% vs TC avg, above average)
Interview Lift: +9.4% (moderate), measured across resolved cases with interview
Avg Prosecution: 2y 7m (typical timeline)
Career History: 284 total applications across all art units; 32 currently pending

Statute-Specific Performance

§101: 5.4% (-34.6% vs TC avg)
§103: 51.1% (+11.1% vs TC avg)
§102: 20.5% (-19.5% vs TC avg)
§112: 17.9% (-22.1% vs TC avg)
Deltas are measured against a Tech Center average estimate. Based on career data from 252 resolved cases.
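The headline figures above are mutually consistent, assuming the percentages are simple ratios (the dashboard does not state how it computes them); a quick check:

```python
granted, resolved = 217, 252

# Career allow rate as shown on the dashboard
career_allow_rate = granted / resolved
print(round(career_allow_rate * 100))  # 86

# The "+24.1% vs TC avg" delta implies a Tech Center average near 62%
tc_average = career_allow_rate - 0.241
print(round(tc_average * 100))  # 62
```

The quoted 96% grant probability with interview likewise lines up with the base 86% plus the +9.4% interview lift.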

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on March 6, 2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Objections

Claim 1 is objected to because of the following informalities: in claim 1, line 13, “toidentify” should read “to identify”. Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 4-10, and 12-18 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Publication No. 2019/0281171 to Hayashi (hereinafter Hayashi), and further in view of U.S. Publication No. 2012/0121139 to Kojima et al. (hereinafter Kojima).
Regarding independent claim 1, Hayashi discloses A print system (abstract, “A diagnosis system;” paragraph 0005, “Therefore, a system has been proposed in which a pre-set evaluation chart is printed, a printed evaluation chart is read or scanned, an abnormal image is extracted from the read or scanned image based on information on internal conditions of an apparatus, and then an abnormal image is analyzed to identify the cause of the failure.”) comprising a printing apparatus (paragraph 0034, “The image forming unit 12 includes,”…“the image forming apparatus 1000 is a color image forming apparatus that can print color image”) and an inspection apparatus (Figure 3, element 66, “determination unit” and 65, “diagnosis unit”), wherein the printing apparatus (Figure 3, element 61, “print unit”) generates a printed material based on print data (Figure 6, element S601, “scan document;” paragraph 0077, “In step S601, the scan unit 62 scans or reads a document;” a scanned document is read as a printed material that has been made into a digital format), and the inspection apparatus (Figure 3, element 66, “determination unit” and 65, “diagnosis unit”), the print system comprising: one or more controllers including one or more processors and one or more memories (paragraph 0008, “In another aspect of the present invention, a non-transitory computer readable storage medium storing one or more instructions that, when performed by one or more processors, cause the one or more processors to execute a method of diagnosing of an image forming apparatus is devised.”), the one or more controllers configured: to obtain a read image by reading the printed material (Figure 6, element S601, “scan document;” paragraph 0077, “In step S601, the scan unit 62 scans or reads a document;” a scanned document is read as a printed material that has been made into a digital format); to detect an image defect that has occurred in the printed material by comparing the read image and a reference image (Figure 6, element S603, “document is identical?;” paragraph 0079, “The determination whether the document scanned by the scan unit 62 is the same as the document previously scanned by the scan unit 62 is performed by comparing the feature data stored in the storage device 46 and then determining whether the feature data are identical. The determination can be performed using first to third methods to be described below;” paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information. When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated.”); to identify a cause of the image defect based on the extracted feature information (paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information. When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated;” paragraph 0102, “The background stain is a stain caused by a toner adhering to a portion where toner is not supposed to be adhered;” paragraph 0112, “If the edge of the stripe is sharp, it can be assumed that the abnormality is caused by an optical system that performs writing and scanning, and if the end of the stripe pattern is blurred, it can be assumed that the abnormality is caused by a factor other than the optical system.”).

Hayashi fails to explicitly disclose as further recited.

However, Kojima discloses to extract feature information of the image defect based on difference information between pixel data of the reference image corresponding to a plurality of pixels in a region surrounding the image defect excluding the image defect in the region and pixel data indicating the image defect included in the read image (abstract, “compare the reference image and the target image to detect differences in pixel values;” comparing images is read as comparing both regions of the defect and not including the defect (i.e., the entirety of the image); see also Figure 7B, comparing RGB values in multiple regions).

Hayashi is directed toward “This disclosure relates to a diagnosis system for diagnosing an image forming apparatus, a method of diagnosing an image forming apparatus, and a storage medium of a program for causing a computer to diagnose an image forming apparatus” (paragraph 0002). Kojima is directed toward “An inspection apparatus includes an obtaining unit configured to receive a target image obtained by scanning a printed surface of a printed material and receive a reference image obtained from print data of the printed surface” … “compare the reference image and the target image to detect differences in pixel values” (abstract). As can be seen by one of ordinary skill in the art before the effective filing date of the claimed invention, Hayashi and Kojima are directed toward the similar field of endeavor of print output analysis. Further, Kojima allows for analysis on a pixel level of the image data for comparison (abstract, “compare the reference image and the target image to detect differences in pixel values”).
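As a rough sketch only (neither reference provides source code, and every name below is hypothetical), the two operations at issue in claim 1, detecting defects by pixel-wise comparison of a reference image against a read image and then averaging the RGB values of the surrounding region while excluding the defect pixels, might look like:

```python
def detect_defects(reference, scanned, threshold=30):
    """Return the set of (y, x) pixels whose RGB values differ noticeably
    between the reference image and the scanned (read) image."""
    defects = set()
    for y, (ref_row, scan_row) in enumerate(zip(reference, scanned)):
        for x, (ref_px, scan_px) in enumerate(zip(ref_row, scan_row)):
            if max(abs(a - b) for a, b in zip(ref_px, scan_px)) > threshold:
                defects.add((y, x))
    return defects


def surrounding_mean(reference, defects, pad=2):
    """Average RGB of a region around the defect, excluding the defect
    pixels themselves (the claimed 'surrounding region')."""
    ys = [y for y, _ in defects]
    xs = [x for _, x in defects]
    y0, y1 = max(min(ys) - pad, 0), min(max(ys) + pad, len(reference) - 1)
    x0, x1 = max(min(xs) - pad, 0), min(max(xs) + pad, len(reference[0]) - 1)
    samples = [reference[y][x]
               for y in range(y0, y1 + 1)
               for x in range(x0, x1 + 1)
               if (y, x) not in defects]
    return tuple(sum(px[c] for px in samples) / len(samples) for c in range(3))
```

On a white page with a small dark blob, `detect_defects` flags the blob pixels and `surrounding_mean` reports the white surround, i.e. the per-channel baseline against which the defect's own pixel data would be compared.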
One of ordinary skill in the art before the effective filing date of the claimed invention would be aware that image analysis on a pixel level, as opposed to metadata, provides key information about an image on a very specific and small scale; said differently, analysis at an image level can lose key details that may be necessary for analysis accuracy. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to ensure the smallest levels of images are analyzed to catch any differences between the image data sets being compared.

Regarding dependent claim 2, the rejection of claim 1 is incorporated herein. Additionally, Kojima in the combination further discloses wherein the pixel data of the reference image includes an average value of RGB signal values of the plurality of pixels in the region surrounding the image defect excluding the image defect in the region (paragraph 0063, “As another example, the flatness analysis unit 12 may calculate a total or an average of differences between pixel values (RGB values) of a reference pixel and adjacent pixels adjacent to the reference pixel in each rectangular area of the reference image G1 as a direct flatness level;” paragraph 0113, “In this case, in steps S205, S208, S211, and S213, the difference detecting unit 132 compares pixels in the corresponding rectangular areas R of the reference image G1 and the target image G2 and calculates average differences between the pixels.
More specifically, the difference detecting unit 132 compares pixel values (RGB values) of pixels in a rectangular area R of the reference image G1 with pixel values of pixels in the corresponding rectangular area R of the target image G2 to obtain absolute values indicating the differences between the pixel values (for the respective RGB components);” being that these rectangular regions are in both images, these are read as being “surrounding” the image defect). It is well known in the art that regions around a defect can provide valuable information for image processing. For example, harsher defects typically show a larger change in color between the defect area and the surrounding non-defect area, and are more likely to need correction than less impactful defects. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to ensure information about the entire region of a defect is analyzed, including an adjacent area, to better understand the defect as a whole.

Regarding dependent claim 4, the rejection of claim 1 is incorporated herein. Additionally, Hayashi in the combination further discloses wherein the cause of the image defect is due to either of the printing apparatus and the inspection apparatus (paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information. When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated;” printing apparatus: paragraph 0102, “The background stain is a stain caused by a toner adhering to a portion where toner is not supposed to be adhered;” inspection apparatus: paragraph 0112, “If the edge of the stripe is sharp, it can be assumed that the abnormality is caused by an optical system that performs writing and scanning, and if the end of the stripe pattern is blurred, it can be assumed that the abnormality is caused by a factor other than the optical system.”).

Regarding dependent claim 5, the rejection of claim 1 is incorporated herein. Additionally, Hayashi in the combination further discloses wherein the printed material is a test chart (paragraph 0125, “the detection result is determined as abnormality information on the test chart, and then the sequence of extracting the abnormality information is ended;” paragraph 0134, “FIG. 11C illustrates a screen when “check effect” is selected in FIG. 11B. When “check effect” is selected, the screen displays an instruction to prompt the user to print the test chart, place the printed matter of test chart on the document table or ADF 10, and then scan the printed matter of test chart.”) including an image of a plurality of colors (paragraph 0050, “FIG. 4 illustrates examples of abnormality types and test charts. The abnormality types include, for example, three types such as “stain, faint, and color misregistration,” which can be selected by the user. Since the three types are just examples, other types can be also used as abnormality;” for color misregistration to be present, there must be color in the test chart).

Regarding dependent claim 6, the rejection of claim 5 is incorporated herein.
Additionally, Kojima in the combination further discloses wherein, in the detection of the image defect, the one or more controllers detect an image defect that has occurred in the printed material based on difference information between the reference image generated from the print data and the read image (abstract, “An inspection apparatus includes an obtaining unit configured to receive a target image obtained by scanning a printed surface of a printed material and receive a reference image obtained from print data of the printed surface”… “compare the reference image and the target image to detect differences in pixel values”). It is well known to one of ordinary skill in the art before the effective filing date of the claimed invention that when comparing two entities, it can be helpful to analyze the simplest or smallest form of the data to ensure any minute differences can be captured. The smallest entity of an image is a pixel, and analysis on a pixel level can detect differences that analysis on an image level may not. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to accurately determine at the simplest image level whether there is a defect.

Regarding dependent claim 7, the rejection of claim 5 is incorporated herein. Additionally, Hayashi in the combination further discloses wherein, in the detection of the image defect, the image defect is detected based on difference information between the reference image obtained by reading the test chart and the read image (Figure 7; paragraph 0111, “In step S705, the presence or absence of abnormality is determined based on the presence or absence of abnormality portion. If it is determined that the abnormality is present (S705: YES), the sequence proceeds to step S706;” paragraph 0125, “In step S710, the detection result is determined as abnormality information on the test chart, and then the sequence of extracting the abnormality information is ended.”).

Regarding dependent claim 8, the rejection of claim 7 is incorporated herein. Additionally, Hayashi in the combination further discloses wherein the test chart includes an image area and a non-image area (see Figures 18 and 19; paragraph 0179, “FIG. 18 illustrates an example of a test chart of “background stain/density detection pattern” indicated in FIG. 4. This pattern is used for detecting a background stain of one color and a thin/thick density. The pattern includes an image portion 77 and the non-image portion 71;” paragraph 0181, “FIG. 19 illustrates an example of a test chart of “full-color background stain/density detection pattern” indicated in FIG. 4. This pattern is used for detecting the background stain of all colors of yellow, magenta, cyan, and black, and a thin/thick density. Similar to an example case in FIG. 18, the image portion 77 sets a value of 256 with respect to 8 bits (256 values) while the non-image portion 71 sets a value of 0 with respect to 8 bits (256 values), in which each portion is the uniform density test pattern.”). The pixel data of the reference image is obtained by reading the test chart, corresponding to each of the image area and the non-image area (paragraph 0005, “Therefore, a system has been proposed in which a pre-set evaluation chart is printed, a printed evaluation chart is read or scanned, an abnormal image is extracted from the read or scanned image based on information on internal conditions of an apparatus, and then an abnormal image is analyzed to identify the cause of the failure;” paragraph 0174, “FIG. 16 illustrates an example of a test chart of “color adjustment pattern 2” indicated in FIG. 4.
In an example illustrated in FIG. 15, only the original colors (primary colors) such as yellow, magenta, cyan, and black are used. In an example case in FIG. 16, a color pattern 73 and a gradation pattern 74 are set. The color pattern 73 includes the primary color and secondary color generated by mixing the primary colors. The gradation pattern 74 includes tertiary color generated by mixing the primary color and the secondary color. The secondary color corresponds to, for example, blue, green and red, and the tertiary color corresponds to, for example, gray or grey. By using such patterns, the deviation from the target color can be easily detected, the patterns can be scanned by the scanner 50 and the image processing parameters can be adjusted;” the test charts are used essentially as calibration between the multiple images; paragraph 0179, “For example, to evaluate the color margin, the image portion 77 can be set with a value of 200 while the non-image portion 71 can be set with a value rage of 8 to 16. Since this is an example, the image portion 77 and the non-image portion 71 can be set with other values, such as a value of 220 and a value of 20, respectively.”).

However, Hayashi fails to explicitly disclose as further recited. Kojima in the combination further discloses the pixel data of the reference image is obtained from an average value of RGB signal values of pixels of the read image (paragraph 0063, “As another example, the flatness analysis unit 12 may calculate a total or an average of differences between pixel values (RGB values) of a reference pixel”), obtained by reading the test chart, corresponding to each of the image area and the non-image area. It is well known to one of ordinary skill in the art before the effective filing date of the claimed invention that color data varies drastically across an image. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to account for outliers by averaging over a region, simplifying the data.

Regarding independent claim 9, the rejection of claim 1 applies directly. Additionally, Hayashi discloses An inspection apparatus for receiving and inspecting a printed material that has been printed on a printing apparatus (abstract, “A diagnosis system;” paragraph 0005, “Therefore, a system has been proposed in which a pre-set evaluation chart is printed, a printed evaluation chart is read or scanned, an abnormal image is extracted from the read or scanned image based on information on internal conditions of an apparatus, and then an abnormal image is analyzed to identify the cause of the failure;” paragraph 0002, “this disclosure relates to a diagnosis system for diagnosing an image forming apparatus, a method of diagnosing an image forming apparatus, and a storage medium of a program for causing a computer to diagnose an image forming apparatus.”), the inspection apparatus comprising: a reader (paragraph 0077, “In step S601, the scan unit 62 scans or reads a document”); and one or more controllers including one or more processors and one or more memories (paragraph 0008, “In another aspect of the present invention, a non-transitory computer readable storage medium storing one or more instructions that, when performed by one or more processors, cause the one or more processors to execute a method of diagnosing of an image forming apparatus is devised.”), the one or more controllers configured: to obtain a read image by reading the printed material by using the reader (Figure 6, element S601, “scan document;” paragraph 0077, “In step S601, the scan unit 62 scans or reads a document;” a scanned document is read as a printed material that has been made into a digital format); to detect an
image defect that has occurred in the printed material by comparing the read image and a reference image (Figure 6, element S603, “document is identical?;” paragraph 0079, “The determination whether the document scanned by the scan unit 62 is the same as the document previously scanned by the scan unit 62 is performed by comparing the feature data stored in the storage device 46 and then determining whether the feature data are identical. The determination can be performed using first to third methods to be described below;” paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information. When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated.”); to identify a cause of the image defect based on the extracted feature information (paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information. When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated;” paragraph 0102, “The background stain is a stain caused by a toner adhering to a portion where toner is not supposed to be adhered;” paragraph 0112, “If the edge of the stripe is sharp, it can be assumed that the abnormality is caused by an optical system that performs writing and scanning, and if the end of the stripe pattern is blurred, it can be assumed that the abnormality is caused by a factor other than the optical system.”).

Hayashi fails to explicitly disclose as further recited.

However, Kojima discloses to extract feature information of the image defect based on difference information between pixel data of the reference image corresponding to a plurality of pixels in a region surrounding the image defect and excluding the image defect in the region and pixel data indicating the image defect included in the read image (abstract, “compare the reference image and the target image to detect differences in pixel values;” comparing images is read as comparing both regions of the defect and not including the defect (i.e., the entirety of the image); see also Figure 7B, comparing RGB values in multiple regions).

Hayashi is directed toward “This disclosure relates to a diagnosis system for diagnosing an image forming apparatus, a method of diagnosing an image forming apparatus, and a storage medium of a program for causing a computer to diagnose an image forming apparatus” (paragraph 0002). Kojima is directed toward “An inspection apparatus includes an obtaining unit configured to receive a target image obtained by scanning a printed surface of a printed material and receive a reference image obtained from print data of the printed surface” … “compare the reference image and the target image to detect differences in pixel values” (abstract). As can be seen by one of ordinary skill in the art before the effective filing date of the claimed invention, Hayashi and Kojima are directed toward the similar field of endeavor of print output analysis. Further, Kojima allows for analysis on a pixel level of the image data for comparison (abstract, “compare the reference image and the target image to detect differences in pixel values”).
One of ordinary skill in the art before the effective filing date of the claimed invention would be aware that image analysis on a pixel level, as opposed to metadata, provides key information about an image on a very specific and small scale; said differently, analysis at an image level can lose key details that may be necessary for analysis accuracy. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to ensure the smallest levels of images are analyzed to catch any differences between the image data sets being compared.

Regarding dependent claim 10, the rejection of claim 9 is incorporated herein. Additionally, Kojima in the combination further discloses wherein the pixel data of the reference image includes an average value of RGB signal values of the plurality of pixels in the region surrounding the image defect excluding the image defect in the region (paragraph 0063, “As another example, the flatness analysis unit 12 may calculate a total or an average of differences between pixel values (RGB values) of a reference pixel and adjacent pixels adjacent to the reference pixel in each rectangular area of the reference image G1 as a direct flatness level;” paragraph 0113, “In this case, in steps S205, S208, S211, and S213, the difference detecting unit 132 compares pixels in the corresponding rectangular areas R of the reference image G1 and the target image G2 and calculates average differences between the pixels. More specifically, the difference detecting unit 132 compares pixel values (RGB values) of pixels in a rectangular area R of the reference image G1 with pixel values of pixels in the corresponding rectangular area R of the target image G2 to obtain absolute values indicating the differences between the pixel values (for the respective RGB components);” being that these rectangular regions are in both images, these are read as being “surrounding” the image defect). It is well known in the art that regions around a defect can provide valuable information for image processing. For example, harsher defects typically show a larger change in color between the defect area and the surrounding non-defect area, and are more likely to need correction than less impactful defects. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to ensure information about the entire region of a defect is analyzed, including an adjacent area, to better understand the defect as a whole.

Regarding dependent claim 12, the rejection of claim 9 is incorporated herein. Additionally, Hayashi in the combination further discloses wherein the cause of the image defect is due to either of the printing apparatus and the inspection apparatus (paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information.
When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated;” printing apparatus: paragraph 0102, “The background stain is a stain caused by a toner adhering to a portion where toner is not supposed to be adhered;” inspection apparatus: paragraph 0112, “If the edge of the stripe is sharp, it can be assumed that the abnormality is caused by an optical system that performs writing and scanning, and if the end of the stripe pattern is blurred, it can be assumed that the abnormality is caused by a factor other than the optical system.”).

Regarding dependent claim 13, the rejection of claim 9 is incorporated herein. Additionally, Hayashi in the combination further discloses wherein the printed material is a test chart (paragraph 0125, “the detection result is determined as abnormality information on the test chart, and then the sequence of extracting the abnormality information is ended;” paragraph 0134, “FIG. 11C illustrates a screen when “check effect” is selected in FIG. 11B. When “check effect” is selected, the screen displays an instruction to prompt the user to print the test chart, place the printed matter of test chart on the document table or ADF 10, and then scan the printed matter of test chart.”) including an image of a plurality of colors (paragraph 0050, “FIG. 4 illustrates examples of abnormality types and test charts. The abnormality types include, for example, three types such as “stain, faint, and color misregistration,” which can be selected by the user. Since the three types are just examples, other types can be also used as abnormality;” for color misregistration to be present, there must be color in the test chart).

Regarding dependent claim 14, the rejection of claim 5 is incorporated herein.
Additionally, Kojima in the combination further discloses wherein, in the detection of the image defect, the one or more controllers detect an image defect that has occurred in the printed material based on difference information between the reference image generated from print data based upon which the test chart is printed and the read image (abstract, “An inspection apparatus includes an obtaining unit configured to receive a target image obtained by scanning a printed surface of a printed material and receive a reference image obtained from print data of the printed surface”… “compare the reference image and the target image to detect differences in pixel values”). It is well known to one of ordinary skill in the art before the effective filing date of the claimed invention that when comparing two entities, it can be helpful to analyze the simplest or smallest form of the data to ensure any minute differences can be captured. The smallest entity of an image is a pixel, and analysis on a pixel level can detect differences that analysis on an image level may not. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to accurately determine at the simplest image level whether there is a defect.

Regarding dependent claim 15, the rejection of claim 13 is incorporated herein. Additionally, Hayashi in the combination further discloses wherein, in the detection of the image defect, the image defect is detected based on difference information between the reference image obtained by reading the test chart with the reader and the read image (Figure 7; paragraph 0111, “In step S705, the presence or absence of abnormality is determined based on the presence or absence of abnormality portion. If it is determined that the abnormality is present (S705: YES), the sequence proceeds to step S706;” paragraph 0125, “In step S710, the detection result is determined as abnormality information on the test chart, and then the sequence of extracting the abnormality information is ended.”).

Regarding dependent claim 16, the rejection of claim 15 is incorporated herein. Additionally, Hayashi in the combination further discloses wherein the test chart includes an image area and a non-image area (see Figures 18 and 19; paragraph 0179, “FIG. 18 illustrates an example of a test chart of “background stain/density detection pattern” indicated in FIG. 4. This pattern is used for detecting a background stain of one color and a thin/thick density. The pattern includes an image portion 77 and the non-image portion 71;” paragraph 0181, “FIG. 19 illustrates an example of a test chart of “full-color background stain/density detection pattern” indicated in FIG. 4. This pattern is used for detecting the background stain of all colors of yellow, magenta, cyan, and black, and a thin/thick density. Similar to an example case in FIG. 18, the image portion 77 sets a value of 256 with respect to 8 bits (256 values) while the non-image portion 71 sets a value of 0 with respect to 8 bits (256 values), in which each portion is the uniform density test pattern.”). The pixel data of the reference image is obtained by reading the test chart, corresponding to each of the image area and the non-image area (paragraph 0005, “Therefore, a system has been proposed in which a pre-set evaluation chart is printed, a printed evaluation chart is read or scanned, an abnormal image is extracted from the read or scanned image based on information on internal conditions of an apparatus, and then an abnormal image is analyzed to identify the cause of the failure;” paragraph 0174, “FIG. 16 illustrates an example of a test chart of “color adjustment pattern 2” indicated in FIG. 4.
In an example illustrated in FIG. 15, only the original colors (primary colors) such as yellow, magenta, cyan, and black are used. In an example case in FIG. 16, a color pattern 73 and a gradation pattern 74 are set. The color pattern 73 includes the primary color and secondary color generated by mixing the primary colors. The gradation pattern 74 includes tertiary color generated by mixing the primary color and the secondary color. The secondary color corresponds to, for example, blue, green and red, and the tertiary color corresponds to, for example, gray or grey. By using such patterns, the deviation from the target color can be easily detected, the patterns can be scanned by the scanner 50 and the image processing parameters can be adjusted;” the test charts are used essentially as calibration between the multiple images; paragraph 0179, “For example, to evaluate the color margin, the image portion 77 can be set with a value of 200 while the non-image portion 71 can be set with a value rage of 8 to 16. Since this is an example, the image portion 77 and the non-image portion 71 can be set with other values, such as a value of 220 and a value of 20, respectively.”). However, Hayashi fails to explicitly disclose as further recited. However, Kojima in the combination further discloses the pixel data of the reference image is obtained from an average value of RGB signal values of pixels of read image (paragraph 0063, “As another example, the flatness analysis unit 12 may calculate a total or an average of differences between pixel values (RGB values) of a reference pixel), obtained by reading the test chart, corresponding to each of the image area and the non-image area. It is well known to one of ordinary skill in the art before the effective filing date of the claimed invention that color data varies drastically across an image. 
Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to account for outliers by averaging over a region, reducing the data to simpler terms.

Regarding independent claim 17, the rejection of claim 1 applies directly. Additionally, Hayashi discloses A method of controlling an inspection apparatus for receiving and inspecting a printed material that has been printed by a printing apparatus (abstract, “A diagnosis system;” paragraph 0005, “Therefore, a system has been proposed in which a pre-set evaluation chart is printed, a printed evaluation chart is read or scanned, an abnormal image is extracted from the read or scanned image based on information on internal conditions of an apparatus, and then an abnormal image is analyzed to identify the cause of the failure;” paragraph 0002, “this disclosure relates to a diagnosis system for diagnosing an image forming apparatus, a method of diagnosing an image forming apparatus, and a storage medium of a program for causing a computer to diagnose an image forming apparatus.”), the method comprising: obtaining a read image by reading the printed material (Figure 6, element S601, “scan document;” paragraph 0077, “In step S601, the scan unit 62 scans or reads a document;” a scanned document is read as a printed material that has been made into a digital format); detecting an image defect that has occurred in the printed material by comparing the read image and a reference image (Figure 6, element S603, “document is identical?;” paragraph 0079, “The determination whether the document scanned by the scan unit 62 is the same as the document previously scanned by the scan unit 62 is performed by comparing the feature data stored in the storage device 46 and then determining whether the feature data are identical.
The determination can be performed using first to third methods to be described below;” paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information. When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated.”); identifying a cause of the image defect based on the extracted feature information (paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information. When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated;” paragraph 0102, “The background stain is a stain caused by a toner adhering to a portion where toner is not supposed to be adhered;” paragraph 0112, “If the edge of the stripe is sharp, it can be assumed that the abnormality is caused by an optical system that performs writing and scanning, and if the end of the stripe pattern is blurred, it can be assumed that the abnormality is caused by a factor other than the optical system.”). Hayashi fails to explicitly disclose as further recited. However, Kojima discloses extracting feature information of the image defect based on difference information between pixel data of the reference image corresponding to a plurality of pixels in a region surrounding the image defect excluding the image defect in the region and pixel data indicating the image defect included in the read image (abstract, “ compare the reference image and the target image to detect differences in pixel values;” comparing images is read as comparing both regions of the defect and not including the defect (i.e. the entirety of the image); see also Figure 7B, comparing RGB values in multiple regions). 
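As an illustrative aside (not part of the examiner's record), the pixel-value comparison quoted from Kojima's abstract can be sketched as follows; the image contents, the tuple-of-RGB representation, and the threshold value are assumptions chosen for demonstration, not taken from either reference:

```python
# Illustrative sketch: flag pixels where a scanned (read) image differs
# from a reference image by more than a threshold in any RGB channel.
# Shapes, channel layout, and the threshold of 16 are assumed values.

def detect_defect_pixels(reference, scanned, threshold=16):
    """Return (row, col) positions where the per-pixel difference
    between the reference and scanned images exceeds the threshold."""
    defects = []
    for r, (ref_row, scan_row) in enumerate(zip(reference, scanned)):
        for c, (ref_px, scan_px) in enumerate(zip(ref_row, scan_row)):
            # Compare each RGB channel; flag the pixel if any channel
            # differs by more than the threshold.
            if any(abs(a - b) > threshold for a, b in zip(ref_px, scan_px)):
                defects.append((r, c))
    return defects

reference = [[(200, 200, 200), (200, 200, 200)],
             [(200, 200, 200), (200, 200, 200)]]
scanned   = [[(200, 200, 200), (120, 120, 120)],   # one stained pixel
             [(198, 201, 199), (200, 200, 200)]]   # within tolerance

print(detect_defect_pixels(reference, scanned))  # [(0, 1)]
```

A small tolerance is needed in practice because scanner noise alone produces minor channel differences, as in the second row above.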
Hayashi is directed toward “This disclosure relates to a diagnosis system for diagnosing an image forming apparatus, a method of diagnosing an image forming apparatus, and a storage medium of a program for causing a computer to diagnose an image forming apparatus (paragraph 0002).” Kojima is directed toward, “An inspection apparatus includes an obtaining unit configured to receive a target image obtained by scanning a printed surface of a printed material and receive a reference image obtained from print data of the printed surface;” … “compare the reference image and the target image to detect differences in pixel values (abstract).” As can be seen by one of ordinary skill in the art before the effective filing date of the claimed invention, Hayashi and Kojima are directed toward similar methods of endeavor of print output analysis. Further, Kojima allows for analysis on a pixel level of the image data for comparison (abstract, “ compare the reference image and the target image to detect differences in pixel values”). One of ordinary skill in the art before the effective filing date of the claimed invention would be aware that image analysis on a pixel level as opposed to metadata provides key information about an image on a very specific and small scale; said differently, analysis at an image level can lose key details that may be necessary for analysis accuracy. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to ensure the smallest levels of images are analyzed to catch any differences between the image data sets being compared. Regarding independent claim 18, the rejection of claim 1 applies directly. 
Additionally, Hayashi discloses A non-transitory computer-readable storage medium storing a program for causing a processor to execute a method of controlling an inspection apparatus for receiving and inspecting a printed material that has been printed by a printing apparatus (abstract, “A diagnosis system;” paragraph 0005, “Therefore, a system has been proposed in which a pre-set evaluation chart is printed, a printed evaluation chart is read or scanned, an abnormal image is extracted from the read or scanned image based on information on internal conditions of an apparatus, and then an abnormal image is analyzed to identify the cause of the failure;” paragraph 0008, “In another aspect of the present invention, a non-transitory computer readable storage medium storing one or more instructions that, when performed by one or more processors, cause the one or more processors to execute a method of diagnosing of an image forming apparatus is devised.”), the method comprising: obtaining a read image by reading the printed material (Figure 6, element S601, “scan document;” paragraph 0077, “In step S601, the scan unit 62 scans or reads a document;” a scanned document is read as a printed material that has been made into a digital format); detecting an image defect that has occurred in the printed material by comparing the read image and a reference image (Figure 6, element S603, “document is identical?;” paragraph 0079, “The determination whether the document scanned by the scan unit 62 is the same as the document previously scanned by the scan unit 62 is performed by comparing the feature data stored in the storage device 46 and then determining whether the feature data are identical. The determination can be performed using first to third methods to be described below;” paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information. 
When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated.”); identifying a cause of the image defect based on the extracted feature information (paragraph 0090, “The diagnosis unit 65 analyzes the cause of abnormality based on the information on the abnormal portion acquired by processing the abnormality information. When the causes of abnormality are analyzed, a portion or part where the abnormality (defect) occurring in the image forming apparatus 1000 is estimated;” paragraph 0102, “The background stain is a stain caused by a toner adhering to a portion where toner is not supposed to be adhered;” paragraph 0112, “If the edge of the stripe is sharp, it can be assumed that the abnormality is caused by an optical system that performs writing and scanning, and if the end of the stripe pattern is blurred, it can be assumed that the abnormality is caused by a factor other than the optical system.”). Hayashi fails to explicitly disclose as further recited. However, Kojima discloses extracting feature information of the image defect based on difference information between pixel data of the reference image corresponding to a plurality of pixels in a region surrounding the image defect excluding the image defect in the region and pixel data indicating the image defect included in the read image (abstract, “ compare the reference image and the target image to detect differences in pixel values;” comparing images is read as comparing both regions of the defect and not including the defect (i.e. 
the entirety of the image); see also Figure 7B, comparing RGB values in multiple regions) Hayashi is directed toward “This disclosure relates to a diagnosis system for diagnosing an image forming apparatus, a method of diagnosing an image forming apparatus, and a storage medium of a program for causing a computer to diagnose an image forming apparatus (paragraph 0002).” Kojima is directed toward, “An inspection apparatus includes an obtaining unit configured to receive a target image obtained by scanning a printed surface of a printed material and receive a reference image obtained from print data of the printed surface;” … “compare the reference image and the target image to detect differences in pixel values (abstract).” As can be seen by one of ordinary skill in the art before the effective filing date of the claimed invention, Hayashi and Kojima are directed toward similar methods of endeavor of print output analysis. Further, Kojima allows for analysis on a pixel level of the image data for comparison (abstract, “ compare the reference image and the target image to detect differences in pixel values”). One of ordinary skill in the art before the effective filing date of the claimed invention would be aware that image analysis on a pixel level as opposed to metadata provides key information about an image on a very specific and small scale; said differently, analysis at an image level can lose key details that may be necessary for analysis accuracy. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kojima in order to ensure the smallest levels of images are analyzed to catch any differences between the image data sets being compared. Claim(s) 3 and 11 are rejected under 35 U.S.C. 
103 as being unpatentable over Hayashi and Kojima as applied to claims 1 and 9 respectively above, and further in view of JP4507523 (a machine translation provided from Google Patents, hereinafter JP ‘523).

Regarding dependent claim 3, the rejection of claim 1 is incorporated herein. Additionally, Hayashi and Kojima in the combination as a whole fail to explicitly disclose wherein the region surrounding the image defect is a region having sides that are apart from at least upper, lower, left, and right edge portions of the image defect by a predetermined length. However, JP ‘523 discloses wherein the region surrounding the image defect is a region having sides that are apart from at least upper, lower, left, and right edge portions of the image defect by a predetermined length (page 6-7, “In this processing, the image processing apparatus 50 checks whether each pixel of the inspection image has a predetermined characteristic indicating that it is a normal print portion or an erroneous print portion D1 to D4, and a pixel having the predetermined characteristic is determined.;” page 3, “In the target pixel area extraction step, pixels having predetermined characteristics in the inspection image are expanded by a predetermined number of pixels to extract pixel areas in which pixels having the predetermined characteristics are connected to each other, and the extracted pixel areas Are extracted as the target pixel area, and the feature amount of the target pixel area includes the total number of pixels of the target pixel area. In the printing state determination step, the difference is A printed matter inspection program for determining that a print state of a target pixel region whose total number of pixels is equal to or less than a predetermined reference value among the target pixel regions of an image is defective.”). As noted above, Hayashi and Kojima are directed toward similar methods of endeavor of print output analysis.
Additionally, JP ‘523 is directed toward “a printed matter inspection apparatus and a printed matter inspection program for determining the quality of a printed matter (abstract).” As can be seen by one of ordinary skill in the art before the effective filing date of the claimed invention, Hayashi, Kojima and JP ‘523 are all directed toward similar methods of endeavor of print output analysis. Further, JP ‘523 allows for expanding a defect area into a predefined shape. One of ordinary skill in the art would easily understand that expanding to a bounding box of a defect allows for the quantification of different features of the defect (i.e. height and width). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of JP ‘523 in order to ensure additional important data of a defect can be determined.

Regarding dependent claim 11, the rejection of claim 9 is incorporated herein. Additionally, Hayashi and Kojima in the combination as a whole fail to explicitly disclose wherein the region surrounding the image defect is a rectangle having sides that are apart from at least upper, lower, left, and right edge portions of the image defect by a predetermined length.
However, JP ‘523 discloses wherein the region surrounding the image defect is a rectangle having sides that are apart from at least upper, lower, left, and right edge portions of the image defect by a predetermined length (page 6-7, “In this processing, the image processing apparatus 50 checks whether each pixel of the inspection image has a predetermined characteristic indicating that it is a normal print portion or an erroneous print portion D1 to D4, and a pixel having the predetermined characteristic is determined.;” page 3, “In the target pixel area extraction step, pixels having predetermined characteristics in the inspection image are expanded by a predetermined number of pixels to extract pixel areas in which pixels having the predetermined characteristics are connected to each other, and the extracted pixel areas Are extracted as the target pixel area, and the feature amount of the target pixel area includes the total number of pixels of the target pixel area. In the printing state determination step, the difference is A printed matter inspection program for determining that a print state of a target pixel region whose total number of pixels is equal to or less than a predetermined reference value among the target pixel regions of an image is defective;” had the initial defect been a rectangle, expanding the area by a predetermined number of pixels would also generate a rectangle of specific dimensions). As noted above, Hayashi and Kojima are directed toward similar methods of endeavor of print output analysis. Additionally, JP ‘523 is directed toward “a printed matter inspection apparatus and a printed matter inspection program for determining the quality of a printed matter (abstract).” As can be seen by one of ordinary skill in the art before the effective filing date of the claimed invention, Hayashi, Kojima and JP ‘523 are all directed toward similar methods of endeavor of print output analysis. 
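As an illustrative aside (not part of the examiner's record), the “expanded by a predetermined number of pixels” operation quoted from JP ‘523 can be sketched as a simple bounding-box dilation; the function name, the margin value, and the clamp-to-image-bounds behavior are assumptions chosen for demonstration:

```python
# Illustrative sketch: derive a rectangle whose sides lie a fixed margin
# beyond the upper, lower, left, and right edges of a defect, clamped to
# the image bounds. Margin and image dimensions are assumed values.

def surrounding_region(defect_pixels, margin, width, height):
    """Given the (row, col) coordinates of a defect's pixels, return
    (top, left, bottom, right) of a rectangle offset outward from the
    defect's bounding box by `margin` pixels on every side."""
    rows = [r for r, _ in defect_pixels]
    cols = [c for _, c in defect_pixels]
    top    = max(min(rows) - margin, 0)
    left   = max(min(cols) - margin, 0)
    bottom = min(max(rows) + margin, height - 1)
    right  = min(max(cols) + margin, width - 1)
    return (top, left, bottom, right)

# A 3-pixel defect inside a 100x100 image, expanded by a 5-pixel margin.
defect = [(40, 40), (40, 41), (41, 40)]
print(surrounding_region(defect, 5, 100, 100))  # (35, 35, 46, 46)
```

The resulting rectangle immediately yields the defect's quantifiable features (height and width of the expanded box), which is the benefit the rejection attributes to the expansion step.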
Further, JP ‘523 allows for expanding a defect area into a predefined shape. One of ordinary skill in the art would easily understand that expanding to a bounding box of a defect allows for the quantification of different features of the defect (i.e. height and width). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of JP ‘523 in order to ensure additional important data of a defect can be determined.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: U.S. Publication No. 2020/0223230 to Krieger et al. discloses, “A method for determining print defects (abstract).” U.S. Patent No. 12,058,289 to Obayashi et al. discloses, “An image printed on a recording sheet is read, and the read image is displayed on a display unit. An instruction to use the displayed image as a correct answer image is accepted, and an image generated from the image that the instruction to use is accepted is registered as the correct answer image. A printed image is verified by comparing the printed image with the registered correct answer image (abstract).”

Contact

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Courtney J. Nelson whose telephone number is (571)272-3956. The examiner can normally be reached Monday - Friday 8:00 - 4:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, John Villecco, can be reached at 571-272-7319. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /COURTNEY JOAN NELSON/Primary Examiner, Art Unit 2661

Prosecution Timeline

Mar 06, 2024
Application Filed
Mar 11, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603175
METHOD AND APPARATUS FOR DETERMINING DIAGNOSIS RESULT DATA
2y 5m to grant Granted Apr 14, 2026
Patent 12597188
SYSTEMS AND METHODS FOR PROCESSING ELECTRONIC IMAGES FOR PHYSIOLOGY-COMPENSATED RECONSTRUCTION
2y 5m to grant Granted Apr 07, 2026
Patent 12597494
METHOD AND APPARATUS FOR TRAINING MEDICAL IMAGE REPORT GENERATION MODEL, AND IMAGE REPORT GENERATION METHOD AND APPARATUS
2y 5m to grant Granted Apr 07, 2026
Patent 12588881
PROVIDING A RESULT DATA SET
2y 5m to grant Granted Mar 31, 2026
Patent 12592016
Material-Specific Attenuation Maps for Combined Imaging Systems Background
2y 5m to grant Granted Mar 31, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
86%
Grant Probability
96%
With Interview (+9.4%)
2y 7m
Median Time to Grant
Low
PTA Risk
Based on 252 resolved cases by this examiner. Grant probability derived from career allow rate.
