Prosecution Insights
Last updated: April 19, 2026
Application No. 18/314,196

IMAGING SYSTEM FOR ENDOSCOPE

Non-Final OA §103

Filed: May 09, 2023
Examiner: CHOU, WILLIAM B
Art Unit: 3795
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: BOSTON SCIENTIFIC CORPORATION
OA Round: 1 (Non-Final)

Grant Probability: 73% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 9m
Grant Probability with Interview: 94%

Examiner Intelligence

Career Allow Rate: 73% (above average; +2.8% vs TC avg), 389 granted / 534 resolved
Interview Lift: +21.4% among resolved cases with interview (a strong lift)
Typical Timeline: 3y 9m average prosecution; 27 applications currently pending
Career History: 561 total applications across all art units
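The headline examiner numbers above are simple ratios of the page's raw counts. A quick sketch of the arithmetic (the display rounding convention is an assumption):

```python
# Recompute the examiner stats above from the raw counts shown on the page.
granted = 389    # applications granted
resolved = 534   # applications resolved

allow_rate = 100 * granted / resolved
print(f"Career allow rate: {allow_rate:.1f}%")  # 72.8%, displayed as 73%

# The "+2.8% vs TC avg" delta implies a Tech Center average of roughly:
implied_tc_avg = allow_rate - 2.8
print(f"Implied TC average: {implied_tc_avg:.1f}%")
```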

Statute-Specific Performance

§101: 0.6% (-39.4% vs TC avg)
§103: 42.0% (+2.0% vs TC avg)
§102: 23.6% (-16.4% vs TC avg)
§112: 24.9% (-15.1% vs TC avg)

Deltas are relative to the Tech Center average estimate. Based on career data from 534 resolved cases.

Office Action

§103
The present application is being examined under the pre-AIA first to invent provisions.

DETAILED ACTION

Election/Restrictions

Applicant’s election of Species 02a-1, 02b-1, 02c-4, and 02d-3, claims 21-40, in the reply filed on January 9, 2026 is acknowledged. Because applicant did not distinctly and specifically point out the supposed errors in the restriction requirement, the election has been treated as an election without traverse (MPEP § 818.01(a)).

Specification

The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed. The following title is suggested: Imaging System for Endoscope with Composite Image Generation Means from a Plurality of Wavelengths.

CLAIM INTERPRETATION

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “image capture device” in claims 21-40, defined as a CCD, CMOS, or other device including one or more pixels in [022] of the Specification, and “medical device” in claims 39-40, defined as capable of being inserted into the body through any anatomic opening in [017] of the Specification.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 103

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 21-40 are rejected under 35 U.S.C. 103(a) as being unpatentable over Sendai (U.S. Publication 2002/0105505) in further view of Imaizumi et al. (U.S. Publication 2004/0186351, hereinafter “Imaizumi”) and Parulski (U.S. Patent 5,523,786).

As to Claim 21, Sendai discloses a method for generating a composite image, the method comprising: determining a timing in [0288] and [0325]-[0331] of formation of an image frame by an image capture device (107) in Par. [0111]; causing an illumination source (110) in Par. [0110] to emit illumination energy via LEDs (26a, 26b, 26c) in 4/13 and Fig. 3 at a plurality of different wavelengths, as described in [0114] and Figs. 1 and 5 having different wavelength filters, in a sequence based on the determined timing such that an emission of one of the plurality of different wavelengths corresponds to a timing in [0288] and [0325]-[0331] of formation of one image frame by the image capture device, as described in Pars. [0064] and [0112]-[0114]; receiving a plurality of image frames serially formed by the image capture device as the sequence of the illumination energy at the plurality of different wavelengths is emitted by the illumination source, in Figs. 11A-11C as well as 1/33-40 of incorporated-by-reference U.S. Patent 4,845,553, wherein each of the plurality of image frames is associated with one of the plurality of different wavelengths (during the respective illumination period of each of the white light, IR reflected light, and fluorescent-light image acquisition periods); and generating a composite image (“composite-image” in [0054] and [0067]) from the plurality of image frames.

However, Sendai does not specifically disclose the plurality of pixels being configured to detect each wavelength. Imaizumi teaches, in the analogous field of endeavor of endoscopic image acquisition, that multiple CCDs produce not only visible-light images in [0108] and [0246] but also wavelengths in the infrared spectrum in [0208], via a filter (50) located in front of the CCDs for desired imaging characteristics, as shown in Fig. 14. It would have been obvious to one of ordinary skill in the art that Sendai’s disclosure of utilizing a stimulating-light cutoff filter to remove unwanted wavelengths could be utilized in combination with an image capture device configured to detect illumination energy of any wavelength in the visible spectrum, as taught by Imaizumi, to fulfill the same function with predictable results of image acquisition of desired wavelengths.

However, Imaizumi does not specifically disclose wavelength sensitivity of each pixel for each and every wavelength in the visible spectrum simultaneously. Parulski teaches, in the analogous field of endeavor of endoscopic image acquisition in 1/17, wherein each of a plurality of pixels is configured to detect illumination energy of each wavelength in the visible spectrum without the image capture device having a structure for separating broadband light into illumination energy at different wavelengths, in 2/31-53, further supported by incorporated-by-reference U.S. Patent 4,845,553 in 1/33 of Parulski, wherein Fig. 5 and 5/29-42 describe the same pixels being read for R, G, and B image data.
This evidences the level of ordinary skill in recognizing that sequential illumination and data output of imagers can be utilized to process a sequence of chrominance image signals as equivalent alternatives for providing the same predictable result of converting light to an electrical signal. It would have been obvious to one of ordinary skill in the art at the time of invention to have provided the image sensors of Imaizumi with the image sensor equivalents taught by Parulski, as an obvious equivalent in combination and/or alternatively, since it has been shown by Parulski that each will provide the predictable result of converting light to an electrical signal (Parulski, 2/31-53).

Parulski teaches timing considerations, for the reasons relevant to claims 21, 30, and 36 above, such that the processor is further configured to cause the illumination control system (28) in 4/14 and Fig. 3 to output, via LEDs (26a, 26b, 26c) in 4/13 and Fig. 3, each illumination energy in the sequence, including the first illumination energy and the second illumination energy, at a timing and for a duration that is synchronized to the determined timing of the formation of the plurality of images, such that the image capture device is enabled to form the image in Pars. [0109]-[0110] for each wavelength of illumination energy, one at a time, as the one wavelength is output at the given time in the sequence, in Figs. 11A-11C as well as 1/33-40 of incorporated-by-reference U.S. Patent 4,845,553.

As to Claim 22, Sendai discloses the method of claim 21, wherein determining the timing of formation of the image frame by the image capture device comprises receiving and processing one or more sets of a plurality of image frames captured by the image capture device to determine the timing in [0288] and [0325]-[0331].
As to Claim 23, Sendai discloses the method of claim 21, wherein causing the illumination source to emit illumination energy at the plurality of different wavelengths in the sequence based on the determined timing in [0288] and [0325]-[0331] comprises causing only one wavelength of the plurality of different wavelengths to be emitted at a given time in the sequence, in Pars. [0064] and [0112]-[0114].

As to Claim 24, Sendai discloses the method of claim 21, further comprising: monitoring a signal from a power supply of the image capture device to detect when the image capture device begins to produce signals associated with the formation of the plurality of image frames; and, in response to the detection, generating and sending one or more control signals to the illumination source to cause the illumination source to emit the illumination energy at the plurality of different wavelengths in the sequence based on the determined timing, in Pars. [0064] and [0112]-[0114].

As to Claim 25, Sendai discloses the method of claim 21, further comprising causing an adjustment of one or more of an intensity or timing in [0288] and [0325]-[0331] of the illumination energy of at least one of the plurality of different wavelengths emitted by the illumination source in the sequence to at least one of: alter one or more features of the composite image, control image brightness of the composite image, or control color balance of the composite image, in Pars. [0064] and [0112]-[0114].

As to Claim 26, Sendai discloses the method of claim 21, wherein the plurality of different wavelengths includes wavelengths associated with red light, green light, and blue light, and the plurality of image frames includes a first image frame associated with the red light, a second image frame associated with the green light, and a third image frame associated with the blue light, as described in Pars. [0064] and [0112]-[0114].
As to Claim 27, Sendai discloses the method of claim 26, wherein the composite image is a full color image in a red, green, and blue (RGB) color space, as described in Pars. [0064] and [0112]-[0114].

As to Claim 28, Sendai discloses the method of claim 21, wherein the plurality of different wavelengths is a plurality of different wavelength bands, as described in Pars. [0064] and [0112]-[0114].

As to Claim 29, Sendai discloses the method of claim 28, wherein the plurality of different wavelength bands includes a first wavelength band centered at a wavelength of about 650 nm, a second wavelength band centered at a wavelength of about 510 nm, and a third wavelength band centered at a wavelength of about 475 nm, as described in Pars. [0064] and [0112]-[0114].

As to Claim 31, Sendai discloses the method of claim 29, wherein the plurality of image frames includes a first image frame associated with the first wavelength band, a second image frame associated with the second wavelength band, and a third image frame associated with the third wavelength band, as described in Pars. [0064] and [0112]-[0114].

As to Claim 32, Sendai discloses the method of claim 21, wherein the image capture device includes a plurality of pixels ("pixels" in Par. [0115]) that are each configured to detect illumination energy of each wavelength of the plurality of wavelengths (during the respective illumination period of each of the white light, IR reflected light, and fluorescent-light image acquisition periods).

As to Claim 33, Sendai discloses the method of claim 31, wherein, as illumination energy at each wavelength of the plurality of different wavelengths is emitted by the illumination source in the sequence, each of the plurality of pixels is configured to detect the illumination energy at the respective wavelength to form an image frame of the plurality of image frames associated with the respective wavelength, as described in Pars. [0064] and [0112]-[0114].
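The field-sequential compositing the examiner maps in claims 21 and 26-27 above can be sketched in code: the same pixels capture one monochrome frame per illumination color, and the frames are merged pixel-by-pixel into a full-color composite. This is an illustrative sketch with hypothetical 2x2 frame data, not the application's or the cited references' actual implementation:

```python
# Hypothetical 0-255 intensities from a 2x2 sensor, one frame per LED color.
red_frame   = [[200,  10], [ 30, 120]]   # captured while the red LED is lit
green_frame = [[ 40, 220], [ 50, 130]]   # captured while the green LED is lit
blue_frame  = [[  5,  15], [240, 140]]   # captured while the blue LED is lit

def composite(r, g, b):
    """Merge three same-size monochrome frames into one RGB image."""
    return [
        [(r[y][x], g[y][x], b[y][x]) for x in range(len(r[0]))]
        for y in range(len(r))
    ]

rgb_image = composite(red_frame, green_frame, blue_frame)
print(rgb_image[0][0])  # (200, 40, 5): one full-color pixel from three frames
```

The point of contention in the rejection is the sensor side of this scheme: every pixel must be sensitive to each illumination wavelength in turn, rather than sitting behind a fixed per-pixel color filter.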
As to Claim 34, Sendai discloses a method for generating a composite image, the method comprising: receiving and processing a first plurality of image frames serially formed by an image capture device (107) in Par. [0111] to determine a timing of the serial formation in [0288] and [0325]-[0331]; sending one or more control signals to an illumination source (110) in Par. [0110] to synchronize a sequence of a plurality of different color illuminations emitted by the illumination source via LEDs (26a, 26b, 26c) in 4/13 and Fig. 3 with the determined timing, as described in Pars. [0064] and [0112]-[0114]; receiving a second plurality of image frames, as described in [0114] and Figs. 1 and 5 having different wavelength filters, serially formed by the image capture device as the sequence of the plurality of different color illuminations is emitted by the illumination source in response to the one or more control signals, in Pars. [0064] and [0112]-[0114], wherein each image frame of the second plurality of image frames is associated with one of the plurality of different color illuminations based on the synchronization, in Figs. 11A-11C as well as 1/33-40 of incorporated-by-reference U.S. Patent 4,845,553; and generating a composite image (“composite-image” in [0054] and [0067]) from the second plurality of image frames.

However, Sendai does not specifically disclose the plurality of pixels being configured to detect each wavelength. Imaizumi teaches, in the analogous field of endeavor of endoscopic image acquisition, that multiple CCDs produce not only visible-light images in [0108] and [0246] but also wavelengths in the infrared spectrum in [0208], via a filter (50) located in front of the CCDs for desired imaging characteristics, as shown in Fig. 14.
It would have been obvious to one of ordinary skill in the art that Sendai’s disclosure of utilizing a stimulating-light cutoff filter to remove unwanted wavelengths could be utilized in combination with an image capture device configured to detect illumination energy of any wavelength in the visible spectrum, as taught by Imaizumi, to fulfill the same function with predictable results of image acquisition of desired wavelengths.

However, Imaizumi does not specifically disclose wavelength sensitivity of each pixel for each and every wavelength in the visible spectrum simultaneously. Parulski teaches, in the analogous field of endeavor of endoscopic image acquisition in 1/17, wherein each of a plurality of pixels is configured to detect illumination energy of each wavelength in the visible spectrum without the image capture device having a structure for separating broadband light into illumination energy at different wavelengths, in 2/31-53, further supported by incorporated-by-reference U.S. Patent 4,845,553 in 1/33 of Parulski, wherein Fig. 5 and 5/29-42 describe the same pixels being read for R, G, and B image data.

This evidences the level of ordinary skill in recognizing that sequential illumination and data output of imagers can be utilized to process a sequence of chrominance image signals as equivalent alternatives for providing the same predictable result of converting light to an electrical signal. It would have been obvious to one of ordinary skill in the art at the time of invention to have provided the image sensors of Imaizumi with the image sensor equivalents taught by Parulski, as an obvious equivalent in combination and/or alternatively, since it has been shown by Parulski that each will provide the predictable result of converting light to an electrical signal (Parulski, 2/31-53).
Parulski teaches timing considerations, for the reasons relevant to claims 21, 30, and 36 above, such that the processor is further configured to cause the illumination control system (28) in 4/14 and Fig. 3 to output, via LEDs (26a, 26b, 26c) in 4/13 and Fig. 3, each illumination energy in the sequence, including the first illumination energy and the second illumination energy, at a timing and for a duration that is synchronized to the determined timing of the formation of the plurality of images, such that the image capture device is enabled to form the image in Pars. [0109]-[0110] for each wavelength of illumination energy, one at a time, as the one wavelength is output at the given time in the sequence, in Figs. 11A-11C as well as 1/33-40 of incorporated-by-reference U.S. Patent 4,845,553.

As to Claim 35, Sendai discloses the method of claim 34, wherein only one color illumination of the plurality of different color illuminations is emitted by the illumination source at a given time in the sequence based on the one or more control signals, in Pars. [0064] and [0112]-[0114].

As to Claim 36, Sendai discloses the method of claim 34, further comprising: monitoring a signal from a power supply of the image capture device to detect when the image capture device begins to produce signals associated with the formation of the second plurality of image frames; and, in response to the detection, sending the one or more control signals to the illumination source, in Pars. [0064] and [0112]-[0114].
As to Claim 37, Sendai discloses the method of claim 34, further comprising: sending one or more additional control signals to the illumination source to cause an adjustment of one or more of an intensity or timing in [0288] and [0325]-[0331] of at least one of the plurality of different color illuminations emitted by the illumination source in the sequence to at least one of: alter one or more features of the composite image, control image brightness of the composite image, or control color balance of the composite image, in Pars. [0064] and [0112]-[0114].

As to Claim 38, Sendai discloses the method of claim 34, wherein: the plurality of different color illuminations includes a red illumination, a green illumination, and a blue illumination, as described in Pars. [0064] and [0112]-[0114]; the second plurality of image frames includes a first image frame associated with the red illumination, a second image frame (as described in [0114] and Figs. 1 and 5 having different wavelength filters) associated with the green illumination, and a third image frame associated with the blue illumination, as described in Pars. [0064] and [0112]-[0114]; and the composite image is a full color image in a red, green, and blue (RGB) color space, as described in Pars. [0064] and [0112]-[0114].

As to Claim 39, Sendai discloses a method for generating a composite image of an object located in a lumen of a patient, the method comprising: determining a timing of formation of an image frame by an image capture device (107) in Par. [0111] that is positioned at a distal end of a medical device and includes a plurality of pixels ("pixels" in Par. [0115]), wherein each of the plurality of pixels is configured to detect illumination energy of any wavelength (during the respective illumination period of each of the white light, IR reflected light, and fluorescent-light image acquisition periods); subsequent to the distal end of the medical device being inserted into the lumen of the patient, causing an illumination source (110) in Par. [0110] to emit, onto the object, illumination energy at a plurality of different wavelengths via LEDs (26a, 26b, 26c) in 4/13 and Fig. 3 in a sequence based on the determined timing, such that an emission of one of the plurality of different wavelengths corresponds to a timing of formation of one image frame by the image capture device, as described in Pars. [0064] and [0112]-[0114]; receiving a plurality of image frames of the object serially formed by the image capture device as the sequence of the illumination energy at the plurality of different wavelengths (as described in [0114] and Figs. 1 and 5 having different wavelength filters) is emitted by the illumination source, wherein each of the plurality of image frames is associated with one of the plurality of different wavelengths, in Figs. 11A-11C as well as 1/33-40 of incorporated-by-reference U.S. Patent 4,845,553; and generating a composite image (“composite-image” in [0054] and [0067]) of the object from the plurality of image frames.

However, Sendai does not specifically disclose the plurality of pixels being configured to detect each wavelength. Imaizumi teaches, in the analogous field of endeavor of endoscopic image acquisition, that multiple CCDs produce not only visible-light images in [0108] and [0246] but also wavelengths in the infrared spectrum in [0208], via a filter (50) located in front of the CCDs for desired imaging characteristics, as shown in Fig. 14.
It would have been obvious to one of ordinary skill in the art that Sendai’s disclosure of utilizing a stimulating-light cutoff filter to remove unwanted wavelengths could be utilized in combination with an image capture device configured to detect illumination energy of any wavelength in the visible spectrum, as taught by Imaizumi, to fulfill the same function with predictable results of image acquisition of desired wavelengths.

However, Imaizumi does not specifically disclose wavelength sensitivity of each pixel for each and every wavelength in the visible spectrum simultaneously. Parulski teaches, in the analogous field of endeavor of endoscopic image acquisition in 1/17, wherein each of a plurality of pixels is configured to detect illumination energy of each wavelength in the visible spectrum without the image capture device having a structure for separating broadband light into illumination energy at different wavelengths, in 2/31-53, further supported by incorporated-by-reference U.S. Patent 4,845,553 in 1/33 of Parulski, wherein Fig. 5 and 5/29-42 describe the same pixels being read for R, G, and B image data.

This evidences the level of ordinary skill in recognizing that sequential illumination and data output of imagers can be utilized to process a sequence of chrominance image signals as equivalent alternatives for providing the same predictable result of converting light to an electrical signal. It would have been obvious to one of ordinary skill in the art at the time of invention to have provided the image sensors of Imaizumi with the image sensor equivalents taught by Parulski, as an obvious equivalent in combination and/or alternatively, since it has been shown by Parulski that each will provide the predictable result of converting light to an electrical signal (Parulski, 2/31-53).
Parulski teaches timing considerations, for the reasons relevant to claims 21, 30, and 36 above, such that the processor is further configured to cause the illumination control system (28) in 4/14 and Fig. 3 to output, via LEDs (26a, 26b, 26c) in 4/13 and Fig. 3, each illumination energy in the sequence, including the first illumination energy and the second illumination energy, at a timing and for a duration that is synchronized to the determined timing of the formation of the plurality of images, such that the image capture device is enabled to form the image in Pars. [0109]-[0110] for each wavelength of illumination energy, one at a time, as described in Pars. [0064] and [0112]-[0114], as the one wavelength is output at the given time in the sequence, in Figs. 11A-11C as well as 1/33-40 of incorporated-by-reference U.S. Patent 4,845,553.

As to Claim 40, Sendai discloses the method of claim 39, wherein, as illumination energy at each wavelength of the plurality of different wavelengths is emitted by the illumination source in the sequence, as described in Pars. [0064] and [0112]-[0114], each of the plurality of pixels is configured to detect the illumination energy at the respective wavelength (during the respective illumination period of each of the white light, IR reflected light, and fluorescent-light image acquisition periods) to form an image frame of the plurality of image frames associated with the respective wavelength.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILLIAM B CHOU, whose telephone number is (571) 270-3367. The examiner can normally be reached M-F, 9 am - 6 pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michael Carey, can be reached at (571) 270-7235. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/WILLIAM CHOU/
Examiner, Art Unit 3795

/MICHAEL J CAREY/
Supervisory Patent Examiner, Art Unit 3795

Prosecution Timeline

May 09, 2023: Application Filed
May 09, 2023: Response after Non-Final Action
May 10, 2023: Response after Non-Final Action
Mar 21, 2026: Non-Final Rejection, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12569119: Optical Bulb for Surgical Instrument Port (granted Mar 10, 2026; 2y 5m to grant)
Patent 12508091: Systems and Methods for Switching Control Between Multiple Instrument Arms (granted Dec 30, 2025; 2y 5m to grant)
Patent 12510746: Increased Resolution and Dynamic Range Capture Unit in a Surgical Instrument and Method (granted Dec 30, 2025; 2y 5m to grant)
Patent 12507881: System for Obtaining Clear Endoscope Images (granted Dec 30, 2025; 2y 5m to grant)
Patent 12507906: Integrated Multi-Functional Endoscopic Tool (granted Dec 30, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 73%
With Interview: 94% (+21.4%)
Median Time to Grant: 3y 9m
PTA Risk: Low

Based on 534 resolved cases by this examiner. Grant probability derived from career allow rate.
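The with-interview figure appears to apply the interview lift as additive percentage points on top of the career allow rate. Under that assumption (the page does not state its formula), the projection reduces to:

```python
# Grant-probability projection, assuming the +21.4% interview lift is
# additive in percentage points (an assumption; the page does not say).
base_grant_pct = 73.0      # career allow rate shown above
interview_lift_pct = 21.4  # lift among resolved cases with interview

with_interview_pct = min(base_grant_pct + interview_lift_pct, 100.0)
print(f"With interview: {with_interview_pct:.0f}%")  # 94%, matching the page
```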
