Prosecution Insights
Last updated: April 19, 2026
Application No. 18/180,298

IMAGE PROCESSING APPARATUS FOR REDUCING INFLUENCE OF FINE PARTICLE IN AN IMAGE, CONTROL METHOD OF SAME, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM

Status: Non-Final OA (§103)
Filed: Mar 08, 2023
Examiner: PATEL, JAYESH A
Art Unit: 2677
Tech Center: 2600 (Communications)
Assignee: Canon Kabushiki Kaisha
OA Round: 3 (Non-Final)
Grant Probability: 83% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 0m
With Interview: 88%

Examiner Intelligence

Career Allow Rate: 83% (739 granted / 887 resolved; +21.3% vs TC avg; above average)
Interview Lift: +5.2% (moderate lift, based on resolved cases with interview)
Typical Timeline: 3y 0m avg prosecution; 33 applications currently pending
Career History: 920 total applications across all art units

Statute-Specific Performance

§101: 11.1% (-28.9% vs TC avg)
§102: 14.5% (-25.5% vs TC avg)
§103: 40.9% (+0.9% vs TC avg)
§112: 25.0% (-15.0% vs TC avg)

Based on career data from 887 resolved cases; Tech Center averages are estimates.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/10/2026 has been entered.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

"A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

Claims 1-4 and 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over Hirooka Shinichiro et al. (EP2911110A2), hereafter Hirooka (a single-reference §103 rejection, as the claimed limitations are disclosed and shown in multiple figures/embodiments).

1. Regarding claim 1, Hirooka discloses an image processing apparatus (Figs. 1 and 5E-5F show an image processing apparatus, i.e., the image signal processing apparatus) that processes image data obtained by image capturing, the image processing apparatus comprising: a processor (Fig. 1, element 110; the "CPU" of para. 0009 is a processor meeting the claim limitations); and a memory storing a program which, read and executed by the processor, causes the processor to function (Fig. 1 and para. 0009: element 120, "Memory," stores a program for execution by the processor, meeting the claim limitations) as:

a detecting unit configured to detect an object in an image corresponding to the image data based on the image data (Fig. 5E and para. 0035 show "DISTANCE TO OBJECT" in the image, i.e., the object is detected in the image at long, medium, and short distances, meeting the claim limitations; the examiner notes that the specifics of how "to detect" are not required by the current claim); and

an image processing unit configured to perform a first processing for removing a component corresponding to scattered light caused by fine particles existing in air to a first area where the object has been detected in the image (Fig. 1, element 105 is the image processing unit, and Figs. 5E-5F and para. 0035 disclose a first correction, the "first processing," of removing the fog; the "density of fog" is the component corresponding to scattered light (luminance) caused by fine particles existing in the air at a MEDIUM density where the object is detected at a MEDIUM distance (i.e., the trees and buildings in the image, the first image area); Figs. 5E and 5F show the foggy image correction strength STRONG (i.e., the first processing) for removing the fog component, meeting the claim limitations; the examiner notes that the specifics of the object and first processing are not required by the current claim), and

to perform a second processing for removing the component to a second area where the object has not been detected in the image (Fig. 1, element 105 is the image processing unit, and Figs. 5E-5F and para. 0035 disclose a second correction, the "second processing," of removing the fog; the "density of fog" is the component corresponding to scattered light (luminance) caused by fine particles existing in the air at a DENSE density where the object is not detected at a LONG distance (i.e., no object in the second image area; the top portion of the image is seen without any objects, or no object is detected); Figs. 5E and 5F show the foggy image correction strength WEAK (i.e., the second processing) for removing the fog component, meeting the claim limitations; the examiner notes that the specifics of the second processing are not required by the current claim),

the second processing having a different intensity of correction for removing the component from the first processing (Figs. 5E-5F show that the first processing (i.e., STRONG foggy image correction) has a different intensity of correction (i.e., magnitude of correction) from the second processing (i.e., WEAK foggy image correction), meeting the claim limitations; the examiner notes that the specifics of "a different intensity of correction" are not required by the current claim),

wherein the first area is specified by a user instruction via a user interface (the examiner notes that, as seen in the disclosure at para. 0027 and the figures, an "automatic selection for correction" is made; Hirooka, however, is silent and does not disclose wherein the first area is specified by a user instruction via a user interface. The examiner also notes that it would have been obvious to one of ordinary skill in the art, from the teachings of Hirooka, that the first area is specified by a user instruction via a user interface as claimed. Hirooka in Fig. 1 shows an I/O interface 130 and at paras. 0009 and 0016 discloses that a personal computer (i.e., a user computer), as seen in Fig. 1, is used and that the correction processing on the image signals input through the I/O interface 130 can be performed accordingly (i.e., the image processing/correction is performed on the image signals (i.e., areas in the image) input through the I/O interface, i.e., specified by the user of the personal computer in Fig. 1 via the I/O interface 130), obviously meeting the claim limitations). Before the effective filing date of the invention, the different figures/embodiments in Hirooka are combinable. The rationales supporting the rejection are B, E, and F. See MPEP 2141 III.

2. Regarding claim 2, Hirooka discloses the apparatus according to claim 1, wherein the second processing removes the component to a further extent compared to the first processing (Figs. 5E-5F and para. 0035 show and disclose wherein the second processing removes the component to a further extent (i.e., a WEAKER fog correction) compared to the first processing (STRONG); the examiner notes that the specifics of "a further extent" are not required by the current claim).

3. Regarding claim 3, Hirooka discloses the apparatus according to claim 2, further comprising a generation unit configured to generate a composed image by selecting image data obtained by the first image processing for the first area and by selecting image data obtained by the second image processing for the second area (Figs. 1, 5E-5F and para. 0011 show and disclose the "output signal image," i.e., the generated composed image: an output image generated by the correction processing (i.e., the image processing unit) from the first processing (STRONG foggy image correction) and the second processing (WEAK foggy image correction), meeting the above claim limitations; the examiner notes that the specifics of a composed image are not required by the current claim).

4. Regarding claim 4, Hirooka discloses the apparatus according to claim 1, further comprising an image-capturing unit configured to capture the image data obtained by the image capturing (para. 0010 discloses that "the image signals are input to the image signal input unit from an image capturing unit/video equipment," meeting the claim limitations).

5. Claim 8 is a corresponding method claim of claim 1. See the corresponding explanation of claim 1.

6. Claim 9 is a corresponding non-transitory computer-readable storage medium claim of claim 1. See the corresponding explanation of claim 1. Fig. 1 shows a memory, and para. 0009 discloses a memory storing a program for execution by the CPU to perform the steps of claim 9.

Claims 5-6 are rejected under 35 U.S.C. 103 as being unpatentable over Hirooka in view of Itoh (US20170206690), hereafter Itoh.

7. Regarding claim 5, Hirooka discloses the apparatus according to claim 1, including the first image processing and the second image processing as seen in Figs. 1, 5E, and 5F. Hirooka is silent and fails to disclose wherein the first image processing and the second image processing perform calculation of a Mie scattering component and calculation of a Rayleigh scattering component, and perform the first processing and the second processing by generating a composed image in which the calculated Mie scattering component and the Rayleigh scattering component are used. Itoh discloses wherein the first image processing and the second image processing perform calculation of a Mie scattering component and calculation of a Rayleigh scattering component, and perform the first processing and the second processing by generating a composed image in which the calculated Mie scattering component and the Rayleigh scattering component are used (Fig. 4, elements S405-S407; paras. 0036, 0040-0042).

Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine Hirooka and Itoh because they are from the same field of endeavor and are analogous art of image processing. The suggestion/motivation would be an apparatus that obtains a higher-quality, noise-suppressed image (Itoh, para. 0006). Therefore, it would have been obvious to one of ordinary skill in the art to recognize the advantages of Itoh in the apparatus of Hirooka to obtain the invention as specified in claim 5.

8. Regarding claim 6, Hirooka and Itoh disclose the apparatus according to claim 5. Itoh further discloses wherein the first processing uses a first parameter and the second processing uses a second parameter, the first parameter and the second parameter including a Mie scattering intensity coefficient and a Rayleigh scattering intensity coefficient, the Mie scattering intensity coefficient and the Rayleigh scattering intensity coefficient respectively indicating a contribution of the Mie scattering component and a contribution of the Rayleigh scattering component in the generation of the composed image (see expression 13 in paras. 0082-0083), and the Mie scattering intensity coefficient and the Rayleigh scattering intensity coefficient have smaller values in the second parameter than in the first parameter (Fig. 4, S405-S407; paras. 0034-0036, 0040-0042, 0077-0084 disclose the above claim limitations).

Examiner's Note: The examiner has cited figures and paragraphs in the references as applied to the claims above for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. The applicant is respectfully requested, in preparing responses, to fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passages as taught by the prior art or disclosed by the examiner. The examiner has also cited references in the PTO-892 that are not relied on but are relevant and pertinent to the applicant's disclosure and may also read (as anticipatory or obviousness references) on the claims and claimed limitations. The applicant is advised to consider these references in preparing the response/amendments in order to expedite prosecution.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAYESH PATEL, whose telephone number is (571) 270-1227. The examiner can normally be reached Mon-Fri. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, the applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Andrew Bee, can be reached at 571-270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

JAYESH PATEL
Primary Examiner
Art Unit 2677

/JAYESH A PATEL/
Primary Examiner, Art Unit 2677
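The rejected claims describe a region-dependent haze-removal pipeline: one correction strength where an object is detected (the first area), a different strength where it is not (the second area), with the two results composed into a single output image. A minimal sketch of that claim structure follows, using a generic atmospheric scattering model rather than the applicant's or Hirooka's actual algorithm; the function names (`dehaze_region`, `process`, `object_mask`) and all constants are illustrative assumptions, not taken from either disclosure.

```python
import numpy as np

def dehaze_region(img, airlight, strength):
    """Remove a scattered-light (fog) component at a given correction
    strength in [0, 1], using the simple scattering model
    I = J*t + A*(1 - t).  Higher strength assumes a lower transmission
    t, i.e. a more aggressive correction."""
    t = 1.0 - 0.95 * strength               # assumed per-region transmission
    j = (img - airlight) / max(t, 0.1) + airlight
    return np.clip(j, 0.0, 1.0)

def process(img, object_mask, airlight=0.9):
    """Apply a STRONG correction where an object was detected (first
    processing / first area) and a WEAK correction elsewhere (second
    processing / second area), then compose the two results."""
    strong = dehaze_region(img, airlight, strength=0.8)  # first processing
    weak = dehaze_region(img, airlight, strength=0.3)    # second processing
    return np.where(object_mask, strong, weak)
```

Selecting per-pixel between the two corrected images with `np.where` corresponds to the "composed image" of claim 3; a mask drawn by the user would play the role of the claimed user-specified first area.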

Prosecution Timeline

Mar 08, 2023 - Application Filed
Jun 03, 2025 - Non-Final Rejection (§103)
Sep 04, 2025 - Response Filed
Nov 21, 2025 - Final Rejection (§103)
Jan 24, 2026 - Response after Non-Final Action
Feb 10, 2026 - Request for Continued Examination
Feb 18, 2026 - Response after Non-Final Action
Mar 23, 2026 - Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597170 - METHOD AND APPARATUS FOR IMMERSIVE VIDEO ENCODING AND DECODING, AND METHOD FOR TRANSMITTING A BITSTREAM GENERATED BY THE IMMERSIVE VIDEO ENCODING METHOD - Granted Apr 07, 2026 (2y 5m to grant)
Patent 12579770 - DETECTION SYSTEM, DETECTION METHOD, AND NON-TRANSITORY STORAGE MEDIUM - Granted Mar 17, 2026 (2y 5m to grant)
Patent 12561949 - CONDITIONAL PROCEDURAL MODEL GENERATION - Granted Feb 24, 2026 (2y 5m to grant)
Patent 12555346 - Automatic Working System, Automatic Walking Device and Control Method Therefor, and Computer-Readable Storage Medium - Granted Feb 17, 2026 (2y 5m to grant)
Patent 12536636 - METHOD AND SYSTEM FOR EVALUATING QUALITY OF A DOCUMENT - Granted Jan 27, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 83%
With Interview: 88% (+5.2%)
Median Time to Grant: 3y 0m
PTA Risk: High

Based on 887 resolved cases by this examiner. Grant probability derived from career allow rate.
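The headline probabilities follow from the raw counts in the Examiner Intelligence panel, assuming the dashboard simply divides grants by resolved cases and layers the stated interview lift on top (an assumption about its methodology, not documented behavior):

```python
granted, resolved = 739, 887           # counts reported for this examiner
allow_rate = 100 * granted / resolved  # career allow rate, used as grant probability
print(round(allow_rate))               # -> 83
# the "with interview" figure adds the +5.2% lift to the base rate
print(f"{allow_rate + 5.2:.1f}")       # -> 88.5, close to the 88% shown
```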
