Prosecution Insights
Last updated: April 19, 2026
Application No. 18/501,975

IMAGE COMPARISON APPARATUS AND STORAGE MEDIUM OF IMAGE COMPARISON PROGRAM

Non-Final OA: §101, §102, §112
Filed: Nov 03, 2023
Examiner: ADU-JAMFI, WILLIAM NMN
Art Unit: 2677
Tech Center: 2600 — Communications
Assignee: Kyocera Document Solutions Inc.
OA Round: 1 (Non-Final)
Grant Probability: Favorable
Expected OA Rounds: 1-2
Estimated Time to Grant: 2y 9m

Examiner Intelligence

Career Allow Rate: 0% (0 granted / 0 resolved; -62.0% vs TC avg)
Interview Lift: +0.0% (minimal; based on resolved cases with interview)
Avg Prosecution (typical timeline): 2y 9m
Total Applications: 25 across all art units (25 currently pending)

Statute-Specific Performance

§101: 19.5% (-20.5% vs TC avg)
§102: 28.7% (-11.3% vs TC avg)
§103: 36.8% (-3.2% vs TC avg)
§112: 14.9% (-25.1% vs TC avg)

Tech Center averages are estimates. Based on career data from 0 resolved cases.

Office Action

§101, §102, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 3 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent-eligible subject matter because the claim as presently drafted encompasses both transitory and non-transitory embodiments. Specifically, the claim recites a “storage medium of an image comparison program for causing a computer to execute image comparison processing,” which is broad enough to include transitory forms such as carrier waves or signals. A transitory signal, while physical and real, does not possess concrete structure that would qualify as a device or part under the definition of a machine; is not a tangible article or commodity under the definition of a manufacture (even though it is man-made and physical in that it exists in the real world and has tangible causes and effects); and is not composed of matter such that it would qualify as a composition of matter. As such, a transitory, propagating signal does not fall within any statutory category.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b): (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph: The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-3 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

The term “specific range” in claim 1 is a relative term which renders the claim indefinite. The term is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The phrase fails to clearly define the boundary of the claimed limitation “to within a specific range”: the claim recites reducing the difference in the degree of blur between the reference image and the target image “to within a specific range,” but the specification does not provide any guidance, definition, or standard for determining what constitutes a “specific” range. It is therefore unclear what quantitative or qualitative threshold would satisfy this limitation. Accordingly, claim 3 is rejected for containing identical subject matter to claim 1. Furthermore, claim 2 depends from claim 1 and is rejected for the same reasons set forth for claim 1.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-3 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ono (JP4960516B2).

Regarding Claim 1, Ono teaches an image comparison apparatus comprising an image comparison portion configured to compare a target image with a reference image, wherein:

Paragraph [0051]: “The image recognition device 2 according to the second embodiment includes a storage unit 11A, a storage unit 12, a normalization unit 13, a blur measurement unit 14A, a blur comparison unit 15A, an image processing unit 16A, a feature extraction unit 17, and a recognition unit 18.”

Explanation: These units collectively constitute an image comparison portion configured to compare an input (target) image to a normalized (reference) image stored in memory. The normalized image used to determine threshold values (Tmax to Tmin) functions as the reference image, while the new input image corresponds to the target image, both of which are compared by the blur comparison unit 15A.
the image comparison portion calculates a degree of blur of each of the reference image and the target image, and reduces a difference in the degree of blur between the reference image and the target image to within a specific range by adding blur to one of the reference image and the target image that has a smaller degree of blur, and

Paragraph [0054]: “In the second embodiment, the threshold values Tmax and Tmin are determined as follows. 1. An image is generated by normalizing various images that may be input to the image recognition device 1. 2. The maximum value M of the pixel values of each normalized image is calculated using the gradient filter L. 3. The average value M0 and the standard deviation σ of the calculated maximum value M are calculated. 4. The range given by the following equation (8) is determined as a range in which the maximum value M of pixel values can be taken: M0 − cσ ≤ M ≤ M0 + cσ (8), where c is a constant. M0 + cσ is the threshold Tmax, and M0 − cσ is the threshold Tmin.”

Paragraph [0064]: “The blur measuring unit 14A calculates the pixel value of the image input from the normalizing unit 13. Next, the blur measuring unit 14A acquires the maximum value M from all the calculated pixel values (step S202).”

Paragraph [0065]: “The blur comparing unit 15A determines whether the maximum value M of the pixel values acquired by the blur measuring unit 14A is within the range from the threshold Tmax stored in the storage unit 11A to the threshold Tmin (step S203). When the maximum value M of the pixel values is smaller than the threshold value Tmin, the blur comparing unit 15A instructs the image processing unit 16A to perform blur conversion on the image input from the normalizing unit 13.”

Explanation: In the second embodiment, the maximum value M is a parameter representing the blur level. The blur measurement unit 14A calculates a blur level (M) for the input image, which represents the degree of blur of the target image.
The blur range (Tmin – Tmax) and blur level (M), derived from the normalized image(s), represent the degree of blur of the reference image. The system explicitly adjusts the blur difference between the input and normalized images until it falls within the specific range (Tmin – Tmax), and the blur comparison unit 15A instructs the image processing unit 16A to blur-convert (add blur to) the less-blurred input image until it is within the range Tmin – Tmax.

the image comparison portion compares the reference image and the target image whose difference in the degree of blur has been reduced to within the specific range

Paragraph [0058]: “When the maximum value M of the pixel value is within the range from the threshold value Tmax to the threshold value Tmin, the blur comparing unit 15A instructs the image processing unit 16A to input the image input from the normalizing unit 13 to the feature extracting unit 17 as it is.”

Paragraph [0070]: “The feature extracting unit 17 extracts features of the image input from the image processing unit 16A (Step S209). The recognizing unit 18 searches the storage unit 12 for an image pattern having a feature amount closest to the feature amount input from the feature extracting unit 17. Next, the recognition unit 18 outputs the searched image pattern as a recognition result (Step S210).”

Explanation: Once the blur difference is within the specific range (Tmax to Tmin), the system performs comparison/recognition by comparing the adjusted input image with the stored normalized image.
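For orientation, the threshold scheme mapped above (paragraphs [0054], [0064]–[0065]) can be sketched in a few lines. This is an illustrative reading of the claim mapping, not Ono's actual implementation: the function names, the 3x3 Laplacian kernel standing in for Ono's "gradient filter L," and the box filter standing in for the "blur conversion" are all assumptions.

```python
import numpy as np

# A 3x3 Laplacian kernel: one common choice for Ono's "gradient filter L".
LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=float)

def blur_level(img):
    """Blur metric M: max absolute response of the gradient filter.
    Sharp images give a large M; blurry images give a small M."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):          # valid-mode convolution via shifted slices
        for dx in range(3):
            out += LAPLACIAN[dy, dx] * img[dy:dy + h - 2, dx:dx + w - 2]
    return float(np.abs(out).max())

def thresholds(reference_images, c=2.0):
    """Tmin/Tmax from the mean and std of M over normalized reference
    images, per equation (8): M0 - c*sigma .. M0 + c*sigma."""
    ms = np.array([blur_level(im) for im in reference_images])
    m0, sigma = ms.mean(), ms.std()
    return m0 - c * sigma, m0 + c * sigma   # (Tmin, Tmax)

def box_blur(img):
    """Stand-in for the 'blur conversion': a 3x3 box filter (assumption)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):
        for dx in range(3):
            out += img[dy:dy + h - 2, dx:dx + w - 2]
    return out / 9.0

def match_blur_to_range(img, tmin, tmax, max_iters=10):
    """Add blur to an over-sharp image (M > Tmax) until M is in range.
    The opposite correction for the too-blurry side (M < Tmin) is
    omitted from this sketch."""
    for _ in range(max_iters):
        if blur_level(img) <= tmax or min(img.shape) < 7:
            break
        img = box_blur(img)
    return img
```

The loop mirrors the claim-mapping reading: blurring the sharper image lowers its metric M toward the reference-derived range, after which the images are comparable.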
Regarding Claim 2, Ono teaches the image comparison apparatus according to claim 1, wherein the image comparison portion performs a Laplacian transform on the reference image to generate a transformed reference image and calculates a degree of variation of pixel values as the degree of blur of the reference image (see Paragraph [0054] above), and:

Paragraph [0013]: “The gradient filter L is a filter for calculating a two-dimensional gradient of an image, and a Laplacian filter, a Prewitt filter, a Sobel filter, or the like can be used.”

the image comparison portion performs a Laplacian transform on the target image to generate a transformed target image and calculates a degree of variation of pixel values as the degree of blur of the target image

Paragraph [0055]: “The blur measuring unit 14A reads the gradient filter L stored in the storage unit 11A, and projects the gradient filter L onto an image input from the normalizing unit 13 or the image processing unit 16A. Then, the pixel value of the image is calculated according to the weight defined by the gradient filter L. Note that the calculation method is the same as the method described in the first embodiment. Then, the blur measuring unit 14A acquires the maximum value M from the absolute values of all the calculated pixel values.”

Regarding Claim 3, Ono teaches the same limitations as discussed with respect to claim 1 above. The only additional feature in claim 3 is that the image comparison processing is performed by a computer executing instructions stored on a storage medium. Ono discloses storage units (11A and 12) that store data and control information used by the image processing unit 16A and related components to perform the image comparison, blur measurement, and recognition functions (see paragraph [0051] above). These storage units would inherently contain the program instructions necessary for the computer to perform the same image comparison processes described in claim 1.
Accordingly, Ono inherently discloses the “storage medium for causing a computer to execute” the recited steps.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILLIAM ADU-JAMFI, whose telephone number is (571) 272-9298. The examiner can normally be reached M-T 8:00-6:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Andrew Bee, can be reached at (571) 270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WILLIAM ADU-JAMFI/
Examiner, Art Unit 2677

/ANDREW W BEE/
Supervisory Patent Examiner, Art Unit 2677
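Claim 2's limitation of a Laplacian transform followed by a "degree of variation of pixel values" corresponds to the well-known variance-of-Laplacian focus measure. A minimal sketch follows; the 3x3 kernel and the variance statistic are common conventions assumed here, not details taken from the application or from Ono.

```python
import numpy as np

def variance_of_laplacian(img):
    """Apply a 3x3 Laplacian transform and return the variance of the
    response. A sharper image has more high-frequency content, hence a
    higher variance; a blurrier image scores lower."""
    k = np.array([[0, 1, 0],
                  [1, -4, 1],
                  [0, 1, 0]], dtype=float)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):          # valid-mode convolution via shifted slices
        for dx in range(3):
            out += k[dy, dx] * img[dy:dy + h - 2, dx:dx + w - 2]
    return float(out.var())

def box_blur(img):
    """Crude 3x3 box blur, used only to demonstrate the metric's ordering."""
    h, w = img.shape
    return sum(img[dy:dy + h - 2, dx:dx + w - 2]
               for dy in range(3) for dx in range(3)) / 9.0
```

Under this metric a blurred copy of an image scores lower than the original, which is the ordering both claim 2's "degree of blur" and Ono's gradient-filter measurement rely on.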

Prosecution Timeline

Nov 03, 2023: Application Filed
Nov 12, 2025: Non-Final Rejection — §101, §102, §112 (current)

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: Favorable
Median Time to Grant: 2y 9m
PTA Risk: Low

Based on 0 resolved cases by this examiner. Grant probability derived from career allow rate.
