Prosecution Insights
Last updated: April 19, 2026
Application No. 18/519,075

PROCESSOR DEVICE, METHOD OF OPERATING THE SAME, AND ENDOSCOPE SYSTEM

Non-Final OA · §101, §102
Filed: Nov 27, 2023
Examiner: NATNITHITHADHA, NAVIN
Art Unit: 3791
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Fujifilm Corporation
OA Round: 1 (Non-Final)
Grant Probability: 71% (Favorable)
Predicted OA Rounds: 1-2
Predicted Time to Grant: 4y 0m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 71% (685 granted / 963 resolved; +1.1% vs TC avg; above average)
Interview Lift: +30.9% (strong; allowance rate with vs. without an interview among resolved cases)
Typical Timeline: 4y 0m average prosecution; 45 applications currently pending
Career History: 1008 total applications across all art units
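The career figures above are internally consistent, which can be verified with a quick arithmetic cross-check (values taken directly from the dashboard):

```python
# Quick cross-check of the examiner career figures shown above.
granted, resolved, pending = 685, 963, 45   # values from the dashboard

allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")   # 71.1%, displayed as 71%

total = resolved + pending
print(f"Total applications: {total}")            # matches the 1008 shown
```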

Statute-Specific Performance

§101: 12.6% (-27.4% vs TC avg)
§103: 30.9% (-9.1% vs TC avg)
§102: 29.2% (-10.8% vs TC avg)
§112: 17.0% (-23.0% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 963 resolved cases
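Each statute-specific rate is reported with a delta against the Tech Center average, so the implied average can be recovered by subtracting the delta. A small sketch using only the numbers listed above:

```python
# The dashboard lists each statute-specific rate alongside a delta "vs TC avg".
# Since rate = TC_avg + delta, the implied Tech Center average is rate - delta.
# Rates and deltas are taken directly from the section above.
rates = {
    "101": (12.6, -27.4),
    "103": (30.9, -9.1),
    "102": (29.2, -10.8),
    "112": (17.0, -23.0),
}

implied = {s: rate - delta for s, (rate, delta) in rates.items()}
for statute, tc_avg in implied.items():
    print(f"§{statute}: implied TC average ≈ {tc_avg:.1f}%")
# All four statutes imply the same ~40.0% reference value, consistent with a
# single flat Tech Center average estimate drawn across the chart.
```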

Office Action

Rejections: §101, §102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Interpretation

2. The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

3. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 4. This application includes one or more claim limitations that use the word “means” or “step” but are nonetheless not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph because the claim limitation(s) recite(s) sufficient structure, materials, or acts to entirely perform the recited function. Such claim limitation(s) is/are: “a step of acquiring a first image signal …” in claim 10; “a step of calculating a calculation value via calculation processing based on the second image signal, the third image signal, and the fourth image signal” in claim 10; “a step of calculating the oxygen saturation on the basis of the calculation value with reference to an oxygen saturation calculation table” in claim 10; “a step of calculating the specific colorant concentration on the basis of the first image signal and the third image signal” in claim 10; “a step of monitoring the specific colorant concentration during the calculation of the oxygen saturation” in claim 10; and “a step of performing a correction notification on the basis of a result of the monitoring of the specific colorant concentration” in claim 10. Because this/these claim limitation(s) is/are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are not being interpreted to cover only the corresponding structure, material, or acts described in the specification as performing the claimed function, and equivalents thereof. If applicant intends to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to remove the structure, materials, or acts that performs the claimed function; or (2) present a sufficient showing that the claim limitation(s) does/do not recite sufficient structure, materials, or acts to perform the claimed function.

Claim Rejections - 35 USC § 101

5. 35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

6. Claims 1-7 and 10 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception, i.e. an abstract idea, without significantly more.

Step 1 of the Patent Subject Matter Eligibility Guidance (see MPEP 2106.03): Claims 1-7 are directed to a “device”, which describes one of the four statutory categories of patentable subject matter, i.e. a machine. Claim 10 is directed to a “method”, which describes one of the four statutory categories of patentable subject matter, i.e. a process.

Step 2A of the Revised Patent Subject Matter Eligibility Guidance (see MPEP 2106.04): Claims 1-7 and 10 recite the following mental process: calculate a calculation value via calculation processing based on the second image signal, the third image signal, and the fourth image signal; calculate the oxygen saturation on the basis of the calculation value with reference to an oxygen saturation calculation table; calculate the specific colorant concentration on the basis of the first image signal and the third image signal; monitor the specific colorant concentration during the calculation of the oxygen saturation; and perform a correction notification on the basis of a result of the monitoring of the specific colorant concentration. 
This judicial exception is not integrated into a practical application because the additional limitations of “acquire a first image signal that corresponds to a first wavelength range having a sensitivity to a specific colorant concentration of a specific colorant other than blood hemoglobin among colorants included in a subject of observation, a second image signal that corresponds to a second wavelength range having a sensitivity to oxygen saturation of blood hemoglobin, a third image signal that corresponds to a third wavelength range having a sensitivity to a blood volume, and a fourth image signal that corresponds to a fourth wavelength range having a wavelength longer than wavelengths of the first wavelength range, the second wavelength range, and the third wavelength range;” in claims 1 and 10 add insignificant pre-solution activity to the abstract idea that merely collects data to be used by the mental process. Furthermore, “one or more processors configured to:” in claim 1, and “operating a processor device, the method, executed by one or more processors” in claim 10, are merely parts of a computer to be used as a tool to perform the mental process. Step 2B of the Patent Subject Matter Eligibility Guidance (see MPEP 2106.05): The claim(s) does/do not include additional elements that are sufficient to amount to significantly more than the judicial exception, when considered separately and in combination. 
Analyzing the additional claim limitations individually, the additional limitations that are not directed to the mental process are “acquire a first image signal that corresponds to a first wavelength range having a sensitivity to a specific colorant concentration of a specific colorant other than blood hemoglobin among colorants included in a subject of observation, a second image signal that corresponds to a second wavelength range having a sensitivity to oxygen saturation of blood hemoglobin, a third image signal that corresponds to a third wavelength range having a sensitivity to a blood volume, and a fourth image signal that corresponds to a fourth wavelength range having a wavelength longer than wavelengths of the first wavelength range, the second wavelength range, and the third wavelength range;” in claims 1 and 10. Such limitations are conventional and routine in the art, and add insignificant pre-solution activity to the abstract idea that merely collects data to be used by the abstract idea. The additional limitations “one or more processors configured to:” in claim 1, and “operating a processor device, the method, executed by one or more processors” in claim 10, are merely parts of a computer to be used as a tool to perform the mental process. The additional limitations of dependent claims 2-7 and 9 are merely directed to and further narrow the scope of the mental process. Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. 
Their collective functions merely provide computer implementation of the abstract idea using collected data without: improvement to the functioning of a computer or to any other technology or technical field; applying the mental process with, or by use of, a particular machine; effecting a transformation or reduction of a particular article to a different state or thing; applying or using the mental process in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment; or adding a specific limitation other than what is well-understood, routine, conventional activity in the field.

Claim Rejections - 35 USC § 102

7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

8. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

9. Claims 1-10 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Shiraishi, U.S. Patent Application Publication No. 2016/0183774 A1 (“Shiraishi”).

As to Claim 1, Shiraishi teaches the following: A processor device (“processor device”) 16 (see “The present invention relates to an endoscope system, a processor device, an operation method, and a distance measurement device for observing the inside of a subject.” in para. [0003]; and see “As shown in FIG. 
1, an endoscope system 10 of a first embodiment includes an endoscope 12, a light source device 14, a processor device 16, a monitor 18, and a console 20.” in para. [0057], and fig. 1) comprising: one or more processors (“receiving unit 54”, “image processing switching unit 60”, “normal observation image processing unit 62”, “observation distance measurement unit 63”, and “special observation image processing unit 64”, and “image display signal generation unit 66”) 54, 56, 60, 62, 63, and 64 (see “The processor device 16 includes a receiving unit 54, an image processing switching unit 60, a normal observation image processing unit 62, an observation distance measurement unit 63, a special observation image processing unit 64, and an image display signal generation unit 66.” in para. [0078]) configured to: acquire a first image signal (“G2 image signal”) that corresponds to a first wavelength range (see “the G color filter has a spectral transmittance of 450 nm to 630 nm” in para. [0071]) having a sensitivity to a specific colorant (“green”) concentration of a specific colorant other than blood hemoglobin among colorants included in a subject of observation (see “In the present embodiment, therefore, a component (first blue laser light transmitted through the phosphor 44) that becomes the B1 image signal in the first white light is the first signal light, and a component (green band component of the second fluorescence) that becomes the G2 image signal in the second white light is the second signal light.” in para. [0076]; and see “Therefore, by using not only the B1 image signal but also the signal ratios B1/G2 and R2/G2 obtained from the R2 image signal, which corresponds to light that changes mainly depending on the blood volume, and the G2 image signal, which is a reference signal of the B1 image signal and the R2 image signal, it is possible to accurately calculate the oxygen saturation without there being dependency on the blood volume.” in para. 
[0103]), a second image signal (“B1 image signal”) that corresponds to a second wavelength range (see “the B color filter has a spectral transmittance of 380 nm to 560 nm” in para. [0071]) having a sensitivity to oxygen saturation of blood hemoglobin (see “For example, as at a center wavelength of 473 nm of the first blue laser light, at a wavelength at which the difference between the absorption coefficient of oxygenated hemoglobin and the absorption coefficient of reduced hemoglobin is large, it is easy to handle the information of the oxygen saturation. However, the B1 image signal including a signal corresponding to 473-nm light has a high dependence not only on the oxygen saturation but also on the blood volume.” in para. [0076]), a third image signal (“R2 image signal”) that corresponds to a third wavelength range (see “the R color filter has a spectral transmittance of 580 nm to 760 nm” in para. [0071]) having a sensitivity to a blood volume (see “Therefore, by using not only the B1 image signal but also the signal ratios B1/G2 and R2/G2 obtained from the R2 image signal, which corresponds to light that changes mainly depending on the blood volume, and the G2 image signal, which is a reference signal of the B1 image signal and the R2 image signal, …” in para. [0103]), and a fourth image signal (“narrowband image signal”) that corresponds to a fourth wavelength range (see “The broadband light source 401 is, for example, a xenon lamp or a white LED, and emits white light having a wavelength in a wavelength band ranging from blue to red.” in para. [0159]) having a wavelength longer than wavelengths of the first wavelength range, the second wavelength range, and the third wavelength range (see “The broadband light source 401 is, for example, a xenon lamp or a white LED, and emits white light having a wavelength in a wavelength band ranging from blue to red. 
The rotary filter 402 includes a normal observation mode filter 410 and a special observation mode filter 411 (refer to FIG. 30), and can move in a radial direction between a first position for normal observation mode to place the normal observation mode filter 410 on the optical path, in which the white light emitted from the broadband light source 401 is incident on the light guide 41, and a second position for special observation mode to place the special observation mode filter 411 on the optical path.” in para. [0159]; and see “The special observation mode filter 411 includes an R filter 411a that transmits red light, a G filter 411b that transmits green light, a B filter 411c that transmits blue light, and a narrowband filter 411d that transmits narrowband light of 473±10 nm. Therefore, when the rotary filter 402 is placed at the second position for normal light observation mode, the white light from the broadband light source 401 is incident on one of the R filter 411a, the G filter 411b, the B filter 411c, and the narrowband filter 411d according to the rotation of the rotary filter 402. As a result, red light, green light, blue light, and narrowband light (473 nm) are sequentially emitted to the subject according to the transmitted filter, and the image sensor 405 sequentially outputs an R image signal, a G image signal, a B image signal, and a narrowband image signal by imaging the subject by receiving the reflected light of the red light, the green light, the blue light, and the narrowband light.” in para. [0161]); calculate a calculation value via calculation processing based on the second image signal, the third image signal, and the fourth image signal (see “The R image signal and the G image signal obtained in the special observation mode correspond to the R1 (or R2) image signal and the G1 (or G2) image signal in the first embodiment, respectively. 
In addition, the B image signal obtained in the special observation mode corresponds to the B2 image signal in the first embodiment, and the narrowband image signal corresponds to the B1 image signal.” in para. [0161]; and see “Then, in the divided regions 501a to 501i, specific frequency Ωv components are extracted, and the amplitude differences ΔBG, ΔGR, and ΔRB are calculated.” in para. [0163]); calculate the oxygen saturation on the basis of the calculation value with reference to an oxygen saturation calculation table (see “By repeatedly performing such measurement while changing the observation distance, measurement data 96 in which the amplitude differences ΔBG, ΔGR, and ΔRB, the observation distance, and the error of the oxygen saturation are associated with each other as shown in FIG. 21 is obtained. In the measurement data 96, a table in which the amplitude differences ΔBG, ΔGR, and ΔRB and the corresponding observation distances are stored is the normal distance table 83a, and a table in which the observation distance and the corresponding error of the oxygen saturation are stored is the normal error table 87a.” in para. [0117]); calculate the specific colorant concentration on the basis of the first image signal and the third image signal (see “In addition, the subject at the time of normal observation is observed mainly in the contrast of the image signal of each color corresponding to the amount of absorption (or the amount of reflection) of hemoglobin contained in the blood. In contrast, all of the above-described colorant, cleaning solution, and residues, are substances that break the balance of the contrasts of the image signals of the respective colors corresponding to the amount of absorption (amount of reflection) of hemoglobin. In this specification, substances other than hemoglobin that breaks the balance of the contrasts of the image signals of the respective colors are referred to collectively as a non-hemoglobin substance.” in para. 
[0143]); monitor the specific colorant concentration during the calculation of the oxygen saturation (see “In the embodiment described above, the B1 image signal, the G2 image signal, and the R2 image signal that are used for the calculation of oxygen saturation are used in the measurement of the observation distance in the special observation mode. However, the measurement of the observation distance may be performed using the B1 image signal, the G1 image signal, and the R1 image signal obtained in the first frame, or the measurement of the observation distance may be performed using the B2 image signal, the G2 image signal, and the R2 image signal obtained in the second frame. In this manner, also in the special observation mode, it is possible to measure the accurate observation distance just with image signals obtained in only one frame as in the measurement of the observation distance in the normal observation mode.” in para. [0146]); and perform a correction notification on the basis of a result of the monitoring of the specific colorant concentration (see “In the endoscope system 10, however, if a setting indicating that a colorant is introduced into the subject is performed in the processor device 16, an observation distance is calculated in the second measurement mode for coloring, and the oxygen saturation is corrected based on the accurate observation distance calculated in the second measurement mode. Therefore, even if the specific tissue 111 is colored, for example, as in an oxygen saturation image 116, a correct low oxygen region 102 is displayed as in the oxygen saturation image 101. 
In addition, even in a case in which the low oxygen region 102 is magnified by bringing the distal portion 24 close to the subject (or by performing a zooming operation) to shorten the observation distance, the second measurement mode for coloring is calculated, and the oxygen saturation is corrected based on the accurate observation distance calculated in the second measurement mode. Therefore, as in an oxygen saturation image 118, even if the specific tissue 111 is colored, it is possible to correctly magnify and observe a region around the low oxygen region 102 without the artifact 104 appearing unlike in the oxygen saturation image 105.” in para. [0129]).

As to Claim 2, Shiraishi teaches the following: wherein the one or more processors 54, 56, 60, 62, 63, and 64 configured to: calculate a first difference or a first ratio between each of specific colorant concentrations sequentially calculated and a reference concentration predetermined for the specific colorant concentration (see “In order to calculate the oxygen saturation, a signal ratio B1/G2 between the B1 image signal and the G2 image signal and a signal ratio R2/G2 between the R2 image signal and the G2 image signal are used. Between these signal ratios, the signal ratio B1/G2 between the B1 image signal and the G2 image signal is an essential signal ratio for the calculation of oxygen saturation.” in para. [0076]; and see “The difference calculation section 82 calculates a difference of the specific frequency component between the amplitudes P(B1), P(G2), and P(R2) extracted by the frequency component information extraction section 81. 
That is, a difference ΔBG (=P(B1)−P(G2)) between the amplitudes of specific frequency components of the B1 image signal and the G2 image signal, a difference ΔGR (=P(G2)−P(R2)) between the amplitudes of specific frequency components of the G2 image signal and the R2 image signal, and a difference ΔRB (=P(R2)−P(B1)) between the amplitudes of specific frequency components of the R2 image signal and the B1 image signal are calculated for each specific frequency component.” in para. [0088]); and perform the correction notification in a case where the first difference or the first ratio is out of a specific range (see para. [0129]).

As to Claim 3, Shiraishi teaches the following: wherein the one or more processors 54, 56, 60, 62, 63, and 64 configured to: calculate a specific colorant concentration-average value that is an average value of specific colorant concentrations calculated in a certain time (see “The receiving unit 54 includes a digital signal processor (DSP) 56 and a noise removal section 58, and the DSP 56 performs digital signal processing, such as color correction processing, on the received image signal. The noise removal section 58 performs noise removal processing using, for example, a moving average method or a median filter method, on the image signal obtained after the color correction processing or the like in the DSP 56. The image signals after noise has been removed are input to the image processing switching unit 60.” in para. [0078]); calculate a second difference or a second ratio between the specific colorant concentration-average value and a reference concentration predetermined for the specific colorant concentration (see para. [0078]); and perform the correction notification in a case where the second difference or the second ratio is out of a specific range (see para. [0129]). 
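The amplitude-difference step quoted from Shiraishi (para. [0088]) is plain arithmetic over the per-channel amplitudes of one frequency component. A minimal sketch follows; the input amplitudes are hypothetical, since the reference states the formulas but no example values:

```python
# Sketch of Shiraishi's pairwise amplitude differences (quoted para. [0088]).
# P(B1), P(G2), P(R2) are amplitudes of a specific frequency component in the
# B1, G2, and R2 image signals; the numeric inputs below are hypothetical.
def amplitude_differences(p_b1, p_g2, p_r2):
    d_bg = p_b1 - p_g2   # ΔBG = P(B1) − P(G2)
    d_gr = p_g2 - p_r2   # ΔGR = P(G2) − P(R2)
    d_rb = p_r2 - p_b1   # ΔRB = P(R2) − P(B1)
    return d_bg, d_gr, d_rb

d_bg, d_gr, d_rb = amplitude_differences(0.8, 0.5, 0.3)
print(d_bg, d_gr, d_rb)
```

Because the three differences are cyclic, they always sum to (numerically) zero, so any two of them determine the third.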
As to Claim 4, Shiraishi teaches the following: wherein the reference concentration is the specific colorant concentration obtained at a timing when the oxygen saturation calculation table is corrected (see para. [0117]).

As to Claim 5, Shiraishi teaches the following: wherein the reference concentration is predetermined for each patient or each site (see para. [0116]).

As to Claim 6, Shiraishi teaches the following: wherein the specific colorant is a yellow colorant (see “As the image sensor 48, it is also possible to use a so-called complementary color image sensor including complementary color filters of cyan (C), magenta (M), yellow (Y), and green (G) on the imaging surface.” in para. [0073]).

As to Claim 7, Shiraishi teaches the following: wherein the first wavelength range is 450 ± 10 nm, the second wavelength range is 470 ± 10 nm, the third wavelength range is a green light wavelength range, and the fourth wavelength range is a red light wavelength range (see para. [0071]).

As to Claim 8, Shiraishi teaches the following: the processor device according to claim 1 (see grounds for rejection of claim 1 above); a light source device (“light source device”) 14 that includes a light source unit (“LED light source unit”) 301 and a light source processor (“LED light source control unit”) 304 (see “As shown in FIG. 25, in a light source device 14 of an endoscope system 300, a light emitting diode (LED) light source unit 301 and an LED light source control unit 304 are provided instead of the first and second blue laser light sources 34 and 36 and the light source control unit 40.” in para. 
[0150]), the light source unit 301 including a first semiconductor light source (“B-LED”) 301c emitting first blue light, a second semiconductor light source (“narrowband filter”) 411d emitting second blue light having a wavelength longer than a wavelength of the first blue light, a third semiconductor light source (“G-LED”) 301b emitting green light, and a fourth semiconductor light source (“R-LED”) 301a emitting red light, and the light source processor 304 controlling turn-on and turn-off of the first semiconductor light source 301c, the second semiconductor light source 411d, the third semiconductor light source 301b, and the fourth semiconductor light source 301a (see para. [0151] and [0161]); and an endoscope that includes an imaging sensor provided with a B color filter having a blue light transmission range, a G color filter having a green light transmission range, and an R color filter having a red light transmission range (see para. [0151] and [0161]), wherein the first wavelength range is a wavelength range of light, which has been transmitted through the B color filter, of the green light (see para. [0151] and [0161]), the second wavelength range is a wavelength range of light, which has been transmitted through the B color filter, of the second blue light (see para. [0151] and [0161]), the third wavelength range is a wavelength range of light, which has been transmitted through the G color filter, of the green light (see para. [0151] and [0161]), and the fourth wavelength range is a wavelength range of light, which has been transmitted through the R color filter, of the red light (see para. [0151] and [0161]).

As to Claim 9, Shiraishi teaches the following: wherein the blue light transmission range is 380 to 560 nm, the green light transmission range is 450 to 630 nm, and the red light transmission range is 580 to 760 nm (see para. [0151] and [0161]). 
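The rejection repeatedly cites Shiraishi's use of the signal ratios B1/G2 (sensitive to oxygen saturation) and R2/G2 (a blood-volume reference) together with a calculation table. The sketch below only illustrates that kind of ratio-plus-table lookup; the nearest-neighbour search and the three-point demo table are invented for illustration and are not Shiraishi's calibrated data:

```python
# Illustrative ratio-based oxygen saturation lookup (NOT Shiraishi's actual
# table): B1/G2 carries the oxygen-saturation dependence, R2/G2 normalizes
# for blood volume, and the pair indexes a precomputed calibration table.
def oxygen_saturation(b1, g2, r2, table):
    ratio_b = b1 / g2   # oxygen-saturation-sensitive signal ratio
    ratio_r = r2 / g2   # blood-volume reference signal ratio
    # Nearest calibration point in (B1/G2, R2/G2) space (illustrative).
    key = min(table, key=lambda k: (k[0] - ratio_b) ** 2 + (k[1] - ratio_r) ** 2)
    return table[key]

# Hypothetical calibration points: (B1/G2, R2/G2) -> oxygen saturation in %.
demo_table = {(0.5, 1.0): 40.0, (0.8, 1.0): 60.0, (1.1, 1.0): 80.0}
print(oxygen_saturation(b1=80, g2=100, r2=100, table=demo_table))  # -> 60.0
```

A real system would interpolate over a dense calibrated grid rather than snapping to the nearest of three points; the snapping keeps the sketch self-contained.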
As to Claim 10, Shiraishi teaches the following: A method of operating a processor device, the method, executed by one or more processors, comprising: a step of acquiring a first image signal (“G2 image signal”) that corresponds to a first wavelength range (see “the G color filter has a spectral transmittance of 450 nm to 630 nm” in para. [0071]) having a sensitivity to a specific colorant (“green”) concentration of a specific colorant other than blood hemoglobin among colorants included in a subject of observation (see “In the present embodiment, therefore, a component (first blue laser light transmitted through the phosphor 44) that becomes the B1 image signal in the first white light is the first signal light, and a component (green band component of the second fluorescence) that becomes the G2 image signal in the second white light is the second signal light.” in para. [0076]; and see “Therefore, by using not only the B1 image signal but also the signal ratios B1/G2 and R2/G2 obtained from the R2 image signal, which corresponds to light that changes mainly depending on the blood volume, and the G2 image signal, which is a reference signal of the B1 image signal and the R2 image signal, it is possible to accurately calculate the oxygen saturation without there being dependency on the blood volume.” in para. [0103]), a second image signal (“B1 image signal”) that corresponds to a second wavelength range (see “the B color filter has a spectral transmittance of 380 nm to 560 nm” in para. [0071]) having a sensitivity to oxygen saturation of blood hemoglobin (see “For example, as at a center wavelength of 473 nm of the first blue laser light, at a wavelength at which the difference between the absorption coefficient of oxygenated hemoglobin and the absorption coefficient of reduced hemoglobin is large, it is easy to handle the information of the oxygen saturation. 
However, the B1 image signal including a signal corresponding to 473-nm light has a high dependence not only on the oxygen saturation but also on the blood volume.” in para. [0076]), a third image signal (“R2 image signal”) that corresponds to a third wavelength range (see “the R color filter has a spectral transmittance of 580 nm to 760 nm” in para. [0071]) having a sensitivity to a blood volume (see “Therefore, by using not only the B1 image signal but also the signal ratios B1/G2 and R2/G2 obtained from the R2 image signal, which corresponds to light that changes mainly depending on the blood volume, and the G2 image signal, which is a reference signal of the B1 image signal and the R2 image signal, …” in para. [0103]), and a fourth image signal (“narrowband image signal”) that corresponds to a fourth wavelength range (see “The broadband light source 401 is, for example, a xenon lamp or a white LED, and emits white light having a wavelength in a wavelength band ranging from blue to red.” in para. [0159]) having a wavelength longer than wavelengths of the first wavelength range, the second wavelength range, and the third wavelength range (see “The broadband light source 401 is, for example, a xenon lamp or a white LED, and emits white light having a wavelength in a wavelength band ranging from blue to red. The rotary filter 402 includes a normal observation mode filter 410 and a special observation mode filter 411 (refer to FIG. 30), and can move in a radial direction between a first position for normal observation mode to place the normal observation mode filter 410 on the optical path, in which the white light emitted from the broadband light source 401 is incident on the light guide 41, and a second position for special observation mode to place the special observation mode filter 411 on the optical path.” in para. 
[0159]; and see “The special observation mode filter 411 includes an R filter 411a that transmits red light, a G filter 411b that transmits green light, a B filter 411c that transmits blue light, and a narrowband filter 411d that transmits narrowband light of 473±10 nm. Therefore, when the rotary filter 402 is placed at the second position for normal light observation mode, the white light from the broadband light source 401 is incident on one of the R filter 411a, the G filter 411b, the B filter 411c, and the narrowband filter 411d according to the rotation of the rotary filter 402. As a result, red light, green light, blue light, and narrowband light (473 nm) are sequentially emitted to the subject according to the transmitted filter, and the image sensor 405 sequentially outputs an R image signal, a G image signal, a B image signal, and a narrowband image signal by imaging the subject by receiving the reflected light of the red light, the green light, the blue light, and the narrowband light.” in para. [0161]); a step of calculating a calculation value via calculation processing based on the second image signal, the third image signal, and the fourth image signal (see “The R image signal and the G image signal obtained in the special observation mode correspond to the R1 (or R2) image signal and the G1 (or G2) image signal in the first embodiment, respectively. In addition, the B image signal obtained in the special observation mode corresponds to the B2 image signal in the first embodiment, and the narrowband image signal corresponds to the B1 image signal.” in para. [0161]; and see “Then, in the divided regions 501a to 501i, specific frequency Ω.sub.v components are extracted, and the amplitude differences ΔBG, ΔGR, and ΔRB are calculated.” in para. 
[0163]); a step of calculating the oxygen saturation on the basis of the calculation value with reference to an oxygen saturation calculation table (see “By repeatedly performing such measurement while changing the observation distance, measurement data 96 in which the amplitude differences ΔBG, ΔGR, and ΔRB, the observation distance, and the error of the oxygen saturation are associated with each other as shown in FIG. 21 is obtained. In the measurement data 96, a table in which the amplitude differences ΔBG, ΔGR, and ΔRB and the corresponding observation distances are stored is the normal distance table 83a, and a table in which the observation distance and the corresponding error of the oxygen saturation are stored is the normal error table 87a.” in para. [0117]); a step of calculating the specific colorant concentration on the basis of the first image signal and the third image signal (see “In addition, the subject at the time of normal observation is observed mainly in the contrast of the image signal of each color corresponding to the amount of absorption (or the amount of reflection) of hemoglobin contained in the blood. In contrast, all of the above-described colorant, cleaning solution, and residues, are substances that break the balance of the contrasts of the image signals of the respective colors corresponding to the amount of absorption (amount of reflection) of hemoglobin. In this specification, substances other than hemoglobin that breaks the balance of the contrasts of the image signals of the respective colors are referred to collectively as a non-hemoglobin substance.” in para. [0143]); a step of monitoring the specific colorant concentration during the calculation of the oxygen saturation (see “In the embodiment described above, the B1 image signal, the G2 image signal, and the R2 image signal that are used for the calculation of oxygen saturation are used in the measurement of the observation distance in the special observation mode.
However, the measurement of the observation distance may be performed using the B1 image signal, the G1 image signal, and the R1 image signal obtained in the first frame, or the measurement of the observation distance may be performed using the B2 image signal, the G2 image signal, and the R2 image signal obtained in the second frame. In this manner, also in the special observation mode, it is possible to measure the accurate observation distance just with image signals obtained in only one frame as in the measurement of the observation distance in the normal observation mode.” in para. [0146]); and a step of performing a correction notification on the basis of a result of the monitoring of the specific colorant concentration (see “In the endoscope system 10, however, if a setting indicating that a colorant is introduced into the subject is performed in the processor device 16, an observation distance is calculated in the second measurement mode for coloring, and the oxygen saturation is corrected based on the accurate observation distance calculated in the second measurement mode. Therefore, even if the specific tissue 111 is colored, for example, as in an oxygen saturation image 116, a correct low oxygen region 102 is displayed as in the oxygen saturation image 101. In addition, even in a case in which the low oxygen region 102 is magnified by bringing the distal portion 24 close to the subject (or by performing a zooming operation) to shorten the observation distance, the second measurement mode for coloring is calculated, and the oxygen saturation is corrected based on the accurate observation distance calculated in the second measurement mode. Therefore, as in an oxygen saturation image 118, even if the specific tissue 111 is colored, it is possible to correctly magnify and observe a region around the low oxygen region 102 without the artifact 104 appearing unlike in the oxygen saturation image 105.” in para. [0129]). Conclusion 10. 
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NAVIN NATNITHITHADHA whose telephone number is (571)272-4732. The examiner can normally be reached Monday - Friday 8:00 am - 4:00 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason M Sims can be reached at 571-272-7540. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /NAVIN NATNITHITHADHA/Primary Examiner, Art Unit 3791 01/07/2026
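The claim mapping above quotes Shiraishi's core technique: oxygen saturation is read from a precomputed calculation table keyed by the signal ratios B1/G2 (oxygen-dependent) and R2/G2 (blood-volume-dependent), so that the result does not depend on blood volume. As a rough illustrative sketch only, one way such a table lookup could work is a nearest-neighbor search over log-ratio pairs; the function name and all table values below are hypothetical, not Shiraishi's actual algorithm or data:

```python
import math

# Illustrative sketch of the signal-ratio lookup described in the quoted
# passages: saturation is read from a table keyed by the ratios B1/G2
# (varies with oxygen saturation) and R2/G2 (varies mainly with blood
# volume). Function name and table entries are hypothetical placeholders.
def estimate_oxygen_saturation(b1, g2, r2, sat_table):
    """Nearest-neighbor lookup of saturation for a (log B1/G2, log R2/G2) pair."""
    ratio_bg = math.log(b1 / g2)  # oxygen- and blood-volume-dependent ratio
    ratio_rg = math.log(r2 / g2)  # mainly blood-volume-dependent ratio
    return min(
        sat_table,
        key=lambda e: (e[0] - ratio_bg) ** 2 + (e[1] - ratio_rg) ** 2,
    )[2]

# Hypothetical (log B1/G2, log R2/G2, saturation %) table entries:
table = [(-0.5, 0.2, 60.0), (-0.3, 0.2, 75.0), (-0.1, 0.2, 90.0)]
print(estimate_oxygen_saturation(80.0, 100.0, 120.0, table))  # 75.0
```

The real system presumably interpolates a dense calibrated table rather than snapping to the nearest entry; the sketch only shows why normalizing both B1 and R2 by G2 cancels the shared blood-volume dependence.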

Prosecution Timeline

Nov 27, 2023
Application Filed
Jan 07, 2026
Non-Final Rejection — §101, §102
Apr 07, 2026
Applicant Interview (Telephonic)
Apr 08, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12569172
DEVICES, SYSTEMS, AND METHODS ASSOCIATED WITH ANALYTE MONITORING DEVICES AND DEVICES INCORPORATING THE SAME
2y 5m to grant Granted Mar 10, 2026
Patent 12564329
Optical Device for Determining Pulse Rate
2y 5m to grant Granted Mar 03, 2026
Patent 12562273
MEDICAL DEVICES AND METHODS
2y 5m to grant Granted Feb 24, 2026
Patent 12555404
DISPLAY DEVICE HAVING BIOMETRIC FUNCTION AND OPERATION METHOD THEREOF
2y 5m to grant Granted Feb 17, 2026
Patent 12543976
SYSTEM FOR MONITORING BODY CHEMISTRY
2y 5m to grant Granted Feb 10, 2026
Based on the 5 most recent grants by this examiner.


Prosecution Projections

1-2
Expected OA Rounds
71%
Grant Probability
99%
With Interview (+30.9%)
4y 0m
Median Time to Grant
Low
PTA Risk
Based on 963 resolved cases by this examiner. Grant probability derived from career allow rate.
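The headline figures follow directly from the career counts shown in the Examiner Intelligence panel; a quick arithmetic check, assuming the career allow rate is simply granted divided by resolved:

```python
# Sanity check of the dashboard figures, assuming the career allow rate
# is granted / resolved from the Examiner Intelligence panel above.
granted, resolved = 685, 963
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 71.1%, displayed as 71%
```

Note the +30.9% interview lift is reported as the difference in outcomes between resolved cases with and without an interview, so the 99% "with interview" figure is a conditional rate for interviewed cases, not simply 71% plus 30.9 points.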
