Prosecution Insights
Last updated: April 19, 2026
Application No. 18/630,141

INSPECTION APPARATUS, METHOD OF CONTROLLING INSPECTION APPARATUS, AND STORAGE MEDIUM

Status: Non-Final Office Action (§103)
Filed: Apr 09, 2024
Examiner: BITAR, NANCY
Art Unit: 2664
Tech Center: 2600 (Communications)
Assignee: Canon Kabushiki Kaisha
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 11m
Grant Probability with Interview: 91%

Examiner Intelligence

Career Allow Rate: 83% (786 granted / 946 resolved; +21.1% vs TC avg; above average)
Interview Lift: +8.2% in resolved cases with an interview (moderate lift)
Typical Timeline: 2y 11m average prosecution; 32 applications currently pending
Career History: 978 total applications across all art units

Statute-Specific Performance

§101: 13.3% (-26.7% vs TC avg)
§103: 62.1% (+22.1% vs TC avg)
§102: 6.4% (-33.6% vs TC avg)
§112: 8.9% (-31.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 946 resolved cases.
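One way to sanity-check the panel above: each delta is the examiner's rate minus the Tech Center average, and all four rows back out the same implied baseline. A small illustrative check (not part of the tool):

```python
# Examiner's statute-specific rates and their reported deltas vs the
# Tech Center average, as shown in the panel above.
rows = {
    "101": (13.3, -26.7),
    "103": (62.1, +22.1),
    "102": (6.4, -33.6),
    "112": (8.9, -31.1),
}

# delta = examiner_rate - tc_avg, so tc_avg = examiner_rate - delta;
# every row implies the same 40.0% Tech Center baseline.
implied = {s: round(rate - delta, 1) for s, (rate, delta) in rows.items()}
print(implied)
```

The consistent 40.0% result across all four statutes suggests the panel compares against a single TC-wide figure rather than per-statute averages.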

Office Action: Non-Final Rejection under 35 U.S.C. §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claim limitations "a user interface," "display control unit," and "an inspection unit" have been interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because they use the linking phrase "configured to" coupled with functional language recited after each of the aforementioned limitations, without reciting sufficient structure to achieve the function. Furthermore, the generic placeholder is not preceded by a structural modifier. A review of the specification shows that the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, limitation: see Figure 2 and corresponding text.

If applicant wishes to provide further explanation or dispute the examiner's interpretation of the corresponding structure, applicant must identify the corresponding structure with reference to the specification by page and line number, and to the drawing, if any, by reference characters, in response to this Office action. If applicant does not intend to have the claim limitations treated under 35 U.S.C. 112(f), applicant may amend the claims so that they clearly do not invoke 35 U.S.C. 112(f), or present a sufficient showing that the claims recite sufficient structure, material, or acts for performing the claimed function to preclude application of 35 U.S.C. 112(f). For more information, see MPEP § 2173 et seq. and Supplementary Examination Guidelines for Determining Compliance With 35 U.S.C. 112 and for Treatment of Related Issues in Patent Applications, 76 FR 7162, 7167 (Feb. 9, 2011).

Claim Rejections - 35 U.S.C. § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-13 are rejected under 35 U.S.C.
103 as being unpatentable over Tatsumi et al. (US 2022/0188543) in view of Miyazawa et al. (US 8,576,233).

As to claim 1, Tatsumi teaches an inspection apparatus comprising: an obtaining unit configured to perform a process depending on a type of each of blocks extracted from a reference image to be used for inspection settings in a case of inspecting a printed sheet ("The acquisition unit 21 acquires an image 31 serving as a target (hereinafter referred to as a target image 31) from which a character string is extracted. Referring to FIG. 3, the target image 31 of the first exemplary embodiment is the image of a document that includes but is not limited to an entry item, a character string written by a user for the entry item, and an object 32, such as an imprint. The target image 31 may be an image of a form or a slip delineated by ruled lines, a mechanically printed receipt, or any other image as long as it includes a character string. The object 32 may be positioned at a location that is predetermined depending on the type of the document," paragraph [0026]), and obtain a character string and coordinates indicating a position of the character string for each block ("Via the optical character recognition (OCR) operation, the recognition unit 22 acquires from the target image 31 a character string and the object 32 included in the document and positions (coordinates) of the character string and the object 32 in the target image 31 and outputs these pieces of information as recognition results 33," paragraph [0027]); and an associating unit configured to compare the character string obtained from the reference image with a correct character string to determine the character string which corresponds to the correct character string and is obtained from the reference image ("The extracted character string may be corrected by using a learning model based on machine learning. For example, the character string extracted from the target image 31 having undergone the extraction operation and the character string corrected in the past are stored on the memory 27. A correction unit (not illustrated) learns the target image 31 stored on the memory 27, the character string in the target image 31, and the character string corrected in the past. The correction unit may display a correction candidate of the extracted character string to correct the character string," paragraph [0077]).

While Tatsumi meets the limitations above, Tatsumi fails to teach "associate the coordinates indicating the position of the character string determined as the corresponding character string with the correct character string; and an inspection setting unit configured to make a setting regarding an inspection area to be inspected in the printed sheet based on the coordinates associated with the correct character string by the associating unit." However, Miyazawa teaches, in Figure 5 and column 7, lines 59-65, a setting screen for allowing a user to set image content that will be displayed on the display screen, where a user operates to arrange characters in sub-window Ws1 and uses selection boxes SB2 and SB3 to register the determined font identity, and display control unit 111 converts each character into a print image representing the character (column 8, lines 28-37), where the selection boxes are adapted to allow a user to choose a font family and a font size with the correct character string (see Miyazawa, Figure 7 and column 9, lines 38-45).

It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to inspect the printed sheet based on the coordinate setting in order to allow a terminal to display content containing characters as intended by a creator of the content even when a font used for some of the characters is not supported by the terminal, while allowing for effective usage of a font supported by the terminal. Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.

As to claim 2, Miyazawa teaches the inspection apparatus according to claim 1, wherein the inspection setting unit displays an inspection setting screen in which the inspection area is initially arranged on the reference image based on the coordinates associated with the correct character string by the associating unit and makes the setting regarding the inspection area based on a user's operation on the inspection setting screen (see Miyazawa, Fig. 5, col. 7, lines 59-65: when a generated control program is executed in mobile terminal 30, display control unit 111 controls content displayed on display screen 140 of display unit 14; this displayed content is a setting screen for allowing a user to set image content that will be displayed on the display screen of mobile terminal 30, as shown in FIG. 5).

As to claim 3, Miyazawa teaches the inspection apparatus according to claim 1, further comprising an inspection unit configured to inspect an image obtained by reading the printed sheet to be inspected, based on the setting regarding the inspection area made by the inspection setting unit (see Miyazawa, Fig. 5, col. 8, lines 28-37: the user operates operation unit 13 to arrange characters in sub-window Ws1 and uses selection boxes SB2 and SB3 to register the determined font identity for each arranged character; display control unit 111 converts each character into a print image representing the character according to the determined font identity based on the character and the font data, and displays the image; at this time, characters may be arranged by use of a text box or the like; in this way, a user can set content to be displayed on display screen 340 of mobile terminal 30).

As to claim 4, Miyazawa teaches the inspection apparatus according to claim 1, wherein the associating unit determines the character string obtained from the reference image and being similar to the correct character string, and associates the coordinates indicating the position of the character string determined as similar to the correct character string with the correct character string (see Miyazawa, Fig. 7, col. 9, lines 38-45: FIG. 7 is a diagram explaining the inspection region as the arrangement-designating information, which includes a sequential number (No.) assigned to each of the displayed character strings, together with a correspondence relationship between "character information" indicating the character string, "font information" indicating a font identity for the character string, and "position information" indicating an arrangement position of the character string).
As to claim 5, Tatsumi teaches the inspection apparatus according to claim 4, wherein the associating unit determines the character string obtained from the reference image and being similar to the correct character string by using a Levenshtein distance between the character string obtained from the reference image and the correct character string, and associates the coordinates indicating the position of the character string determined as similar to the correct character string with the correct character string ("The determination unit 23 may determine as the attribute and type of the character string the attribute and type of a character string having the highest degree of similarity from among the character strings stored on the memory 27. The degree of similarity may be derived from Levenshtein distance," paragraph [0031]).

As to claim 6, Tatsumi teaches the inspection apparatus according to claim 5, wherein the associating unit associates the coordinates indicating the position of the character string, which is obtained from the reference image and has the Levenshtein distance being the shortest and less than a threshold, with the correct character string ("a degree of certainty indicating certainty of the character string may be derived. If the degree of certainty is lower than a predetermined threshold, for example, only a character string having certainty lower than the predetermined threshold may be displayed," paragraphs [0031], [0073]).

As to claim 7, Tatsumi teaches the inspection apparatus according to claim 6, wherein the threshold for the Levenshtein distance is set in advance ("The determination unit 23 may determine as the attribute and type of the character string the attribute and type of a character string having the highest degree of similarity from among the character strings stored on the memory 27. The degree of similarity may be derived from Levenshtein distance. The Levenshtein distance is determined by counting the number of character replacements, character additions, and character deletions when any character string is changed into another character string," paragraph [0031]).

As to claim 8, Tatsumi teaches the inspection apparatus according to claim 6, further comprising a display control unit configured to cause a display unit to display a user interface (UI) screen configured to receive the threshold for the Levenshtein distance from a user ("the CPU 11 displays the confirmation correction screen and receives a user correction to the attribute, type, and position of the character string," paragraph [0054]).

As to claim 9, Tatsumi teaches the inspection apparatus according to claim 3, further comprising a display control unit configured to, in a case where the inspection unit determines that an inspection result of the image obtained by reading the printed sheet to be inspected is a failure, cause a display unit to display a detail of the inspection result ("the correction may be accepted with all the extracted character strings uniformly displayed," paragraphs [0073]-[0074]).

As to claim 10, Miyazawa teaches the inspection apparatus according to claim 1, wherein the correct character string is a character string to be printed on the printed sheet (see Miyazawa, Fig. 5, col. 8, lines 28-37: the user operates operation unit 13 to arrange characters in sub-window Ws1 and uses selection boxes SB2 and SB3 to register the determined font identity for each arranged character; display control unit 111 converts each character into a print image representing the character according to the determined font identity based on the character and the font data, and displays the image; in this way, a user can set content to be displayed on display screen 340 of mobile terminal 30).
As to claim 11, Miyazawa teaches the inspection apparatus according to claim 1, wherein the correct character string is a character string to be encoded in a code image to be printed on the printed sheet (see Miyazawa, Fig. 5, col. 8, lines 28-37, as cited for claim 10).

The limitations of claims 12 and 13 have been addressed above.

Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NANCY BITAR, whose telephone number is (571) 270-1041. The examiner can normally be reached Monday-Friday from 8:00 a.m. to 5:00 p.m. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jennifer Mehmood, can be reached at 571-272-2976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/NANCY BITAR/
Primary Examiner, Art Unit 2664
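Claims 5-7 turn on matching each OCR-extracted string against the correct string by Levenshtein distance and associating the coordinates of the closest match when the shortest distance falls below a preset threshold. A minimal Python sketch of that matching step (the function names, field names, and default threshold here are illustrative, not drawn from either reference):

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance: counts the character replacements, additions, and
    deletions needed to turn string a into string b (per Tatsumi [0031])."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # replacement
        prev = cur
    return prev[-1]


def associate(correct: str, ocr_blocks: list, threshold: int = 2):
    """Return the coordinates of the OCR block whose text is closest to
    the correct string, but only if the shortest distance is below the
    preset threshold (claims 5-7); otherwise return None."""
    best = min(ocr_blocks, key=lambda blk: levenshtein(blk["text"], correct))
    if levenshtein(best["text"], correct) < threshold:
        return best["coords"]
    return None
```

With a threshold of 2, an OCR block reading "INV0ICE" (distance 1 from "INVOICE") would have its coordinates associated with the correct string, while an unrelated block such as "TOTAL" would not.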

Prosecution Timeline

Apr 09, 2024
Application Filed
Feb 20, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599437
PRE-PROCEDURE PLANNING, INTRA-PROCEDURE GUIDANCE FOR BIOPSY, AND ABLATION OF TUMORS WITH AND WITHOUT CONE-BEAM COMPUTED TOMOGRAPHY OR FLUOROSCOPIC IMAGING
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12597132
IMAGE PROCESSING METHOD AND APPARATUS
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12597240
METHOD AND SYSTEM FOR AUTOMATED CENTRAL VEIN SIGN ASSESSMENT
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12597189
METHODS AND APPARATUS FOR SYNTHETIC COMPUTED TOMOGRAPHY IMAGE GENERATION
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12591982
MOTION DETECTION ASSOCIATED WITH A BODY PART
Granted Mar 31, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 83%
With Interview: 91% (+8.2%)
Median Time to Grant: 2y 11m
PTA Risk: Low
Based on 946 resolved cases by this examiner. Grant probability is derived from the career allow rate.
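The headline numbers in this panel follow from the career data shown above; a sketch of the arithmetic (the tool's exact rounding rules are an assumption):

```python
# Career allow rate -> base grant probability, then add the interview lift.
granted, resolved = 786, 946
grant_probability = round(granted / resolved * 100)   # 786/946 = 83.09% -> 83
interview_lift = 8.2                                  # percentage points
with_interview = round(grant_probability + interview_lift)  # 91.2 -> 91
print(f"{grant_probability}% base, {with_interview}% with interview")
```

The base 83% matches the career allow rate, and adding the +8.2% interview lift reproduces the 91% with-interview figure.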
