Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This communication is in response to the action filed on 01/14/2026.
Claims 1-4, 6, and 8-14 are currently amended. Claim 7 is canceled. Claims 1-6 and 8-14 are pending.
Response to Arguments
Applicant’s arguments filed on 01/14/2026 on pages 6-8, under REMARKS, with respect to the 35 U.S.C. 102 and 103 rejections of claims 1-14 have been fully considered and are persuasive. The rejections of the claims have been withdrawn. However, upon further consideration, a new ground of rejection is made in view of secondary reference US 2023/0281795 A1.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.
Claims 1-6 and 8-14 are rejected under 35 U.S.C. § 103 as being unpatentable over US 2025/0193320 A1 to HARUTA (hereinafter “HARUTA”) in view of US 2023/0281795 A1 to SOMA et al. (hereinafter “SOMA”).
As per claim 1, HARUTA discloses an image processing apparatus that performs image processing for performing alignment of an inspection image obtained by reading a printed material and a reference image of the inspection image (a computing system and method of operation performing image alignment of a target captured image and a reference image via an inspection module of the system, wherein the input image is a scan of a paper sheet containing text; abstract; fig 4; paragraphs [0070-0076]), the image processing apparatus comprising: an extraction unit configured to extract feature points from the inspection image and the reference image, respectively (the computing system includes inspection apparatus 109 and inspects incoming images of sheets captured via capturing unit 240, which are then compared to a reference image in order to match extracted features at specific points, called feature points, of both the reference base sheet image and the input target sheet image, respectively; abstract; fig 4; paragraphs [0006], [0063], [0069-0076]); a first derivation unit configured to derive first alignment information for performing alignment of the inspection image and the reference image by using feature points extracted from the inspection image and the reference image, respectively (the incoming sheet image is inspected by extracting pixel feature values and comparing pixel values at each image position to a reference image at the same position, and by extracting character data by optical character recognition; in the case of paragraph [0131], the extracted feature was the luminance of the point, which is used to determine proper alignment based on an inspection threshold level; abstract; fig 4; paragraphs [0006], [0063], [0070-0076], [0131-0132]); by detecting sheet four corners from the inspection image and the reference image, respectively, and using the detected sheet four corners (the computing system is also adapted to include a second inspection method in which the inspection module identifies sheet vertices for sheet image alignment, the sheet vertices being the four corners of the sheet image; this method may be used in combination with the feature point method and may also be used if the sheet contains too few feature points for the feature point method; abstract; figs 4, 5D; paragraphs [0006], [0070-0076], [0171]); and an alignment unit configured to perform alignment of the inspection image and the reference image by using the first alignment information, wherein the alignment unit performs, in a case where the feature points extracted from the inspection image and the reference image are not more than or equal to a predetermined number (wherein, based on an image of an area of a pixels square corresponding to one of the six feature points extracted in the image, the six matching feature points are used as alignment information to align the scanned reference and target sheet images; abstract; figs 4, 5D; paragraphs [0006], [0070-0076], [0095]), alignment by using alternative alignment information by estimating the alternative alignment information from the first alignment information used in past alignment or the first alignment information derived by parameter adjustment for the alignment and the second alignment information derived by the past alignment and current alignment (the extracted feature points are assigned coordinate point values and used as alignment information to align the images to the reference sheet images; abstract; fig 4; paragraphs [0006], [0063], [0111], [0176]). HARUTA fails to disclose a second derivation unit configured to derive second alignment information for performing alignment of the inspection image and the reference image.
SOMA discloses a second derivation unit configured to derive second alignment information for performing alignment of the inspection image and the reference image (the computing system is adapted to determine alignment and to detect defect types that include misalignment; one way the system detects defects is by comparing a reference value to a preset first threshold at step S82, and if the value is greater than the preset threshold, it is determined that a defect might be present and the pixel is of interest; this pixel and its related value are then compared to a second threshold B1 at step 823, where image verification unit 56 of the information processing apparatus 50 determines whether the detected difference is equal to or greater than second inspection threshold B1 (acting as the second alignment information); inspection threshold B1 is a value determined as the second threshold for inspection (defect determination criterion), which is larger than the value assuming a defect when the reference value is less than the threshold, and one of the defects inspected for includes misalignment; figs 6, 12-13; paragraphs [0091-0096], [0101-0103]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify HARUTA to have a second derivation unit configured to derive second alignment information for performing alignment of the inspection image and the reference image, as taught by SOMA. The suggestion/motivation for doing so would have been to provide a two-step (or multi-step) verification process to the computing system in order to more accurately verify alignment and further reduce or eliminate the possibility of misalignment occurring, as suggested by SOMA at paragraph [0095]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine SOMA with HARUTA to obtain the invention as specified in claim 1.
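For illustration only, and forming no part of the examined record, the fallback logic described in the claim 1 mapping above (use first alignment information derived from matched feature points when at least the predetermined number of points is found; otherwise estimate alternative alignment information from previously stored alignment results) might be sketched as follows. All names are hypothetical, and a simple translation-only alignment model is assumed for brevity:

```python
# Hypothetical sketch of the claimed fallback between first alignment
# information (from matched feature points) and alternative alignment
# information (estimated from past alignment results).

MIN_FEATURE_POINTS = 6  # the "predetermined number" (six in HARUTA's example)

def select_alignment(feature_pairs, past_alignments):
    """Return a (dx, dy) translation aligning inspection to reference.

    feature_pairs:    list of ((xi, yi), (xr, yr)) matched point pairs
    past_alignments:  mutable list of previously derived (dx, dy) results
    """
    if len(feature_pairs) >= MIN_FEATURE_POINTS:
        # First alignment information: average offset of the matched points.
        dx = sum(r[0] - i[0] for i, r in feature_pairs) / len(feature_pairs)
        dy = sum(r[1] - i[1] for i, r in feature_pairs) / len(feature_pairs)
        past_alignments.append((dx, dy))  # stored for future fallback use
        return (dx, dy)
    if past_alignments:
        # Alternative alignment information estimated from past alignments.
        dx = sum(a[0] for a in past_alignments) / len(past_alignments)
        dy = sum(a[1] for a in past_alignments) / len(past_alignments)
        return (dx, dy)
    raise ValueError("no feature points and no stored alignment information")
```

In this sketch, a run with too few feature points reuses the stored history, mirroring the claimed estimation of alternative alignment information from first alignment information used in past alignment.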
As per claim 2, HARUTA in view of SOMA discloses the image processing apparatus according to claim 1. Modified HARUTA further discloses wherein in a case where the feature points extracted from the inspection image and the reference image are more than or equal to the predetermined number, the first alignment information derived by the first derivation unit is stored in a memory, and the alignment unit estimates the alternative alignment information from the first alignment information stored in the memory (the number of feature points in the example provided was set to six feature points of the reference sheet image and of the captured test sheet image, one of a plurality of captured images compared to said reference image with said six set feature points and corresponding values; the data to make this comparison and the instructions to do so are stored within the computing system's on-board memory component; figs 5C-5D; paragraphs [0094-0095], [0107], [0131]).
As per claim 3, HARUTA in view of SOMA discloses the image processing apparatus according to claim 1, wherein the first alignment information derived by parameter adjustment for the alignment is derived and stored in a memory by using feature points obtained by reading a chart on which a feature point image having feature points more than or equal to a certain number is formed and extracting from a reference image of the feature point image and reading results, respectively, and the alignment unit estimates the alternative alignment information from the first alignment information stored in the memory (the alignment information based on the feature points may be selected from a variety of image parameters, with their threshold values set and changed as desired by the user for the specific test case/scenario; using this feature, the system is adapted to save an input sheet image as a reference image and use that reference image's feature point values to test an input test sheet image's feature values at the same points, and to use these values to align the points of the images; in the case discussed, the number of feature points to be matched is set to six and stored in the computer's memory after user input via computing system peripherals such as a keyboard or other input device; figs 5C-5D; paragraphs [0094-0095], [0107], [0131]).
As per claim 4, HARUTA in view of SOMA discloses the image processing apparatus according to claim 2, wherein the first alignment information stored in the memory is updated to, in a case where the feature points extracted from the inspection image and the reference image by current alignment are more than or equal to the predetermined number, the first alignment information derived by the current alignment (when the computing system acquires the reference input image's feature points, it is adapted to identify the corresponding feature points of input target sheet images; based on identifying the six feature points and passing the threshold criteria for the pixel feature value at that particular position of the image, the system determines alignment and saves the determined alignment information of the image to determine whether the input target sheet image is aligned with the reference image; figs 5C-5D; paragraphs [0094-0095], [0107], [0131]).
As per claim 5, HARUTA in view of SOMA discloses the image processing apparatus according to claim 2, wherein the first alignment information stored in the memory is deleted in a case where the image processing apparatus has processed a predetermined number of inspection images or more or the image processing apparatus is not activated for a predetermined time or longer (the computing system, via inspection apparatus 109, performs processing for updating the inspection level and the coordinate information; updating the inspection level means increasing the difficulty or strictness of the inspection process, making the allowable range smaller or raising the threshold value to be more difficult to meet to confirm alignment; further, the system is adapted to be set to an exclusion mode which excludes (i.e., does not save, or deletes) data at levels of inspection not met by the image; paragraphs [0123], [0134-0136], [0141]).
As per claim 6, HARUTA in view of SOMA discloses the image processing apparatus according to claim 2, wherein in a case where, even though the feature points extracted from the inspection image and the reference image are more than or equal to the predetermined number, a distribution of the feature points extracted by the extraction unit is subject to a certain area, the first alignment information derived by the first derivation unit is not stored in the memory (the computing system is adapted to allow a user to define an area for inspection of the feature points and this area may be updated or changed as the user desires via user interface tools; paragraph [0156]).
As per claim 8, HARUTA in view of SOMA discloses the image processing apparatus according to claim 1, wherein in a case where the feature points extracted from the inspection image and the reference image are more than or equal to the predetermined number (the feature points and the feature point values at each respective point are compared to a feature value threshold set by the user, which may be increased as desired so that the level or value is raised; for example, the prior art discusses the value of luminance compared to a threshold value (certain number), where the luminance value must be greater than the threshold; abstract; figs 4, 5D; paragraphs [0107], [0131-0132], [0171]), the second alignment information derived by the second derivation unit is stored in the memory, and the alignment unit performs alignment by using the second alignment information derived by the second derivation unit in a case where the feature points extracted from the inspection image and the reference image are not more than or equal to the predetermined number and the first alignment information used in the past alignment and the first alignment information derived by parameter adjustment for the alignment are not stored in the memory (the computing system, via inspection apparatus 109, performs processing for updating the inspection level and the coordinate information; updating the inspection level means increasing the difficulty or strictness of the inspection process, making the allowable range smaller or raising the threshold value to be more difficult to meet to confirm alignment; further, the system is adapted to be set to an exclusion mode which excludes (i.e., does not save, or deletes) data at levels of inspection not met by the image; paragraphs [0123], [0134-0136], [0141]).
As per claim 9, HARUTA in view of SOMA discloses the image processing apparatus according to claim 1, wherein the first alignment information is information for performing affine transformation of coordinates of the feature points of one of the inspection image and the reference image into coordinates of the feature points of the other image (the computing system is adapted to perform affine geometric transformation in order to get coordinate point values related to each respective feature point of a base reference image and input scan image; paragraphs [0127-0128], [0137]).
As per claim 10, HARUTA in view of SOMA discloses the image processing apparatus according to claim 1, wherein the second alignment information is information for performing affine transformation of coordinates of the sheet four corners of one of the inspection image and the reference image into coordinates of the sheet four corners of the other image (the affine transformation is performed on feature points, which in some cases in the prior art include the four sheet vertices, i.e., the four corners; using the affine transformation on the sheet image, the corners' coordinate point values would be found; paragraphs [0071], [0094-0095], [0127-0128], [0137]).
As per claim 11, HARUTA in view of SOMA discloses the image processing apparatus according to claim 1, wherein the alternative alignment information is information for performing affine transformation of coordinates of the feature points of one of the inspection image and the reference image into coordinates of the feature points of the other image (the computing system is adapted to perform affine geometric transformation in order to get coordinate point values related to each respective feature point of a base reference image and input scan image; paragraphs [0127-0128], [0137]).
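For illustration only, and forming no part of the examined record, the affine coordinate mapping referenced in the claim 9-11 mappings above (transforming feature point or sheet corner coordinates of one image into the coordinate frame of the other) can be sketched as follows. The function and parameter names are hypothetical, and a pure-translation example is shown for simplicity:

```python
def apply_affine(points, a, b, c, d, tx, ty):
    """Map each (x, y) to (a*x + b*y + tx, c*x + d*y + ty),
    i.e., a general 2-D affine transformation of the coordinates."""
    return [(a * x + b * y + tx, c * x + d * y + ty) for x, y in points]

# Four sheet corners of a hypothetical reference image, mapped into the
# inspection image frame by a pure translation of (5, -3):
corners = [(0, 0), (100, 0), (100, 150), (0, 150)]
aligned = apply_affine(corners, 1, 0, 0, 1, 5, -3)
```

Rotation, scaling, and shear would be expressed through the a, b, c, d terms of the same mapping, consistent with the affine geometric transformation described in the cited paragraphs.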
As per claim 12, HARUTA discloses the image processing apparatus according to claim 1. HARUTA fails to disclose further having: an inspection unit configured to inspect a blot on the inspection image by comparing the inspection image and the reference image after alignment by the alignment unit.
SOMA discloses further having: an inspection unit configured to inspect a blot on the inspection image by comparing the inspection image and the reference image after alignment by the alignment unit (note: "a blot" is not clearly defined in the specification; that being said, the examiner is interpreting "a blot" to mean a mark or marker on a medium to receive print, acting as a positioning marker; the system is adapted to inspect the image for marker M (acting as the inspected blot) on placement table 22 and determine whether marker M is visible or hidden over a time period after inspection, to determine whether the medium being printed on, in this case a T-shirt, was positioned properly; paragraphs [0037], [0079-0080], [0082]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify HARUTA to have an inspection unit configured to inspect a blot on the inspection image by comparing the inspection image and the reference image after alignment by the alignment unit, as taught by SOMA. The suggestion/motivation for doing so would have been to provide the computing system the ability to determine whether an operator has finished setting the T-shirt on the placement table, so that the shirt is properly aligned and it is safe to proceed to the next step of the printing process, as suggested by SOMA at paragraph [0080]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine SOMA with HARUTA to obtain the invention as specified in claim 12.
As per claim 13, HARUTA discloses an image processing method that performs image processing for performing alignment of an inspection image obtained by reading a printed material and a reference image of the inspection image (a computing system and method of operation performing image alignment of a target captured image and a reference image via an inspection module of the system, wherein the input image is a scan of a paper sheet containing text; abstract; fig 4; paragraphs [0070-0076]), the image processing method comprising the steps of: extracting feature points from the inspection image and the reference image, respectively (the computing system includes inspection apparatus 109 and inspects incoming images of sheets captured via capturing unit 240, which are then compared to a reference image in order to match extracted features at specific points, called feature points, of both the reference base sheet image and the input target sheet image, respectively; abstract; fig 4; paragraphs [0006], [0063], [0069-0076]); deriving first alignment information for performing alignment of the inspection image and the reference image by using feature points extracted from the inspection image and the reference image, respectively (the incoming sheet image is inspected by extracting pixel feature values and comparing pixel values at each image position to a reference image at the same position, and by extracting character data by optical character recognition; in the case of paragraph [0131], the extracted feature was the luminance of the point, which is used to determine proper alignment based on an inspection threshold level; abstract; fig 4; paragraphs [0006], [0063], [0070-0076], [0131-0132]); by detecting sheet four corners from the inspection image and the reference image, respectively, and using the detected sheet four corners (the computing system is also adapted to include a second inspection method in which the inspection module identifies sheet vertices for sheet image alignment, the sheet vertices being the four corners of the sheet image; this method may be used in combination with the feature point method and may also be used if the sheet contains too few feature points for the feature point method; abstract; figs 4, 5D; paragraphs [0006], [0070-0076], [0171]); and performing alignment of the inspection image and the reference image by using the first alignment information, wherein, in the performing, in a case where the feature points extracted from the inspection image and the reference image are not more than or equal to a predetermined number (wherein, based on an image of an area of a pixels square corresponding to one of the six feature points extracted in the image, the six matching feature points are used as alignment information to align the scanned reference and target sheet images; abstract; figs 4, 5D; paragraphs [0006], [0070-0076], [0095]), alignment is performed by using alternative alignment information by estimating the alternative alignment information from the first alignment information used in past alignment or the first alignment information derived by parameter adjustment for the alignment and the second alignment information derived by the past alignment and current alignment (the extracted feature points are assigned coordinate point values and used as alignment information to align the images to the reference sheet images; abstract; fig 4; paragraphs [0006], [0063], [0111], [0176]). HARUTA fails to disclose deriving second alignment information for performing alignment of the inspection image and the reference image.
SOMA discloses deriving second alignment information for performing alignment of the inspection image and the reference image (the computing system is adapted to determine alignment and to detect defect types that include misalignment; one way the system detects defects is by comparing a reference value to a preset first threshold at step S82, and if the value is greater than the preset threshold, it is determined that a defect might be present and the pixel is of interest; this pixel and its related value are then compared to a second threshold B1 at step 823, where image verification unit 56 of the information processing apparatus 50 determines whether the detected difference is equal to or greater than second inspection threshold B1 (acting as the second alignment information); inspection threshold B1 is a value determined as the second threshold for inspection (defect determination criterion), which is larger than the value assuming a defect when the reference value is less than the threshold, and one of the defects inspected for includes misalignment; figs 6, 12-13; paragraphs [0091-0096], [0101-0103]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify HARUTA to derive second alignment information for performing alignment of the inspection image and the reference image, as taught by SOMA. The suggestion/motivation for doing so would have been to provide a two-step (or multi-step) verification process to the computing system in order to more accurately verify alignment and further reduce or eliminate the possibility of misalignment occurring, as suggested by SOMA at paragraph [0095]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine SOMA with HARUTA to obtain the invention as specified in claim 13.
As per claim 14, HARUTA discloses a non-transitory computer readable storage medium storing a program for causing a computer to execute an image processing method that performs image processing for performing alignment of an inspection image obtained by reading a printed material and a reference image of the inspection image (a computing system comprising computing components, such as a memory storing instructions for a method of operation, that performs an image processing method of image alignment of a target captured image and a reference image via an inspection module of the system, wherein the input image is a scan of a paper sheet containing text; abstract; fig 4; paragraphs [0070-0076]), the image processing method comprising the steps of: extracting feature points from the inspection image and the reference image, respectively (the computing system includes inspection apparatus 109 and inspects incoming images of sheets captured via capturing unit 240, which are then compared to a reference image in order to match extracted features at specific points, called feature points, of both the reference base sheet image and the input target sheet image, respectively; abstract; fig 4; paragraphs [0006], [0063], [0069-0076]); deriving first alignment information for performing alignment of the inspection image and the reference image by using feature points extracted from the inspection image and the reference image, respectively (the incoming sheet image is inspected by extracting pixel feature values and comparing pixel values at each image position to a reference image at the same position, and by extracting character data by optical character recognition; in the case of paragraph [0131], the extracted feature was the luminance of the point, which is used to determine proper alignment based on an inspection threshold level; abstract; fig 4; paragraphs [0006], [0063], [0070-0076], [0131-0132]); by detecting sheet four corners from the inspection image and the reference image, respectively, and using the detected sheet four corners (the computing system is also adapted to include a second inspection method in which the inspection module identifies sheet vertices for sheet image alignment, the sheet vertices being the four corners of the sheet image; this method may be used in combination with the feature point method and may also be used if the sheet contains too few feature points for the feature point method; abstract; figs 4, 5D; paragraphs [0006], [0070-0076], [0171]); and performing alignment of the inspection image and the reference image by using the first alignment information, wherein, in the performing, in a case where the feature points extracted from the inspection image and the reference image are not more than or equal to a predetermined number (wherein, based on an image of an area of a pixels square corresponding to one of the six feature points extracted in the image, the six matching feature points are used as alignment information to align the scanned reference and target sheet images; abstract; figs 4, 5D; paragraphs [0006], [0070-0076], [0095]), alignment is performed by using alternative alignment information by estimating the alternative alignment information from the first alignment information used in past alignment or the first alignment information derived by parameter adjustment for the alignment and the second alignment information derived by the past alignment and current alignment (the extracted feature points are assigned coordinate point values and used as alignment information to align the images to the reference sheet images; abstract; fig 4; paragraphs [0006], [0063], [0111], [0176]). HARUTA fails to disclose deriving second alignment information for performing alignment of the inspection image and the reference image.
SOMA discloses deriving second alignment information for performing alignment of the inspection image and the reference image (the computing system is adapted to determine alignment and to detect defect types that include misalignment; one way the system detects defects is by comparing a reference value to a preset first threshold at step S82, and if the value is greater than the preset threshold, it is determined that a defect might be present and the pixel is of interest; this pixel and its related value are then compared to a second threshold B1 at step 823, where image verification unit 56 of the information processing apparatus 50 determines whether the detected difference is equal to or greater than second inspection threshold B1 (acting as the second alignment information); inspection threshold B1 is a value determined as the second threshold for inspection (defect determination criterion), which is larger than the value assuming a defect when the reference value is less than the threshold, and one of the defects inspected for includes misalignment; figs 6, 12-13; paragraphs [0091-0096], [0101-0103]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify HARUTA to derive second alignment information for performing alignment of the inspection image and the reference image, as taught by SOMA. The suggestion/motivation for doing so would have been to provide a two-step (or multi-step) verification process to the computing system in order to more accurately verify alignment and further reduce or eliminate the possibility of misalignment occurring, as suggested by SOMA at paragraph [0095]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine SOMA with HARUTA to obtain the invention as specified in claim 14.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DEVIN JACOB DHOOGE whose telephone number is (571) 270-0999. The examiner can normally be reached 7:30-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew Bee, can be reached at (571) 270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Devin Dhooge/
USPTO Patent Examiner
Art Unit 2677
/ANDREW W BEE/Supervisory Patent Examiner, Art Unit 2677