Prosecution Insights
Last updated: April 19, 2026
Application No. 18/558,266

IN VIVO DEVICE AND A COMBINED IMAGER THEREFOR

Final Rejection §103
Filed: Oct 31, 2023
Examiner: LEUBECKER, JOHN P
Art Unit: 3795
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Given Imaging Ltd.
OA Round: 2 (Final)
Grant Probability: 75% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 4m
With Interview: 85%

Examiner Intelligence

Career Allow Rate: 75% (613 granted / 820 resolved); above average, +4.8% vs TC avg
Interview Lift: +10.6% (moderate), measured across resolved cases with interview
Typical Timeline: 3y 4m avg prosecution; 31 applications currently pending
Career History: 851 total applications across all art units

Statute-Specific Performance

§101: 1.0% (-39.0% vs TC avg)
§103: 36.9% (-3.1% vs TC avg)
§102: 29.8% (-10.2% vs TC avg)
§112: 26.2% (-13.8% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 820 resolved cases
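The "vs TC avg" deltas in the table above can be cross-checked with a few lines of arithmetic. The sketch below assumes each delta is a simple difference between the examiner's rate and the Tech Center average estimate (the report does not state its exact method); under that assumption, all four rows imply the same TC average of 40.0%, consistent with a single black reference line on the chart.

```python
# Cross-check of the statute table: implied_tc_avg = examiner_rate - delta.
# Rates and deltas are copied from the table; the subtraction model is an
# illustrative assumption, not the tool's documented methodology.

rates  = {"101": 1.0,   "103": 36.9, "102": 29.8, "112": 26.2}
deltas = {"101": -39.0, "103": -3.1, "102": -10.2, "112": -13.8}

for statute, rate in rates.items():
    implied_tc_avg = rate - deltas[statute]
    print(f"§{statute}: implied TC average {implied_tc_avg:.1f}%")
    # each row works out to 40.0%
```

That all four statutes imply the same 40.0% suggests the chart uses one overall TC-average baseline rather than a per-statute one.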

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Drawings

The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the feature a) "the second wavelength range has a portion not overlapping with the first wavelength range" (claims 2, 10) must be shown or the feature(s) canceled from the claim(s). No new matter should be entered.

Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as "amended." If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either "Replacement Sheet" or "New Sheet" pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-6 and 9-14 are rejected under 35 U.S.C. 103 as being unpatentable over Yoshino et al. (US 2010/0245616, hereinafter "Yoshino") in view of Hsieh et al. (US 2006/0164533, hereinafter "Hsieh").

As to claim 1, Yoshino discloses an in-vivo device comprising: a combined sensor array (image sensor 226, Fig. 15) comprising a first sensor array (array of RGB sensors, Fig. 19) sensitive to a first wavelength range (C24, C25, C26 in Fig. 20) and a second sensor array (array of IR sensors, Fig. 19) sensitive to a second wavelength range (C27 in Fig. 20), the second wavelength range having a partial overlap with the first wavelength range (C27 overlaps C24, C25, C26, Fig. 20), the first sensor array configured for collecting light in the first wavelength range and outputting a corresponding first signal (collects and outputs B, G, and R picture signals, [0140]), and the second sensor array configured for collecting light in the second wavelength range and outputting a corresponding second signal (collects and outputs IR picture signals, [0140]); and a processor (image processing device 230) configured for: receiving the first signal and the second signal (processor receives all picture signals for processing, [0143]); manipulating the first signal based on at least a part of the second signal corresponding to the partial overlap, to output a first image (generates a white light image based on the B, G and R signals by removing the IR component (manipulating) from each of these signals, [0145]); and outputting a second image based on the second signal (outputs a uniform image (IR image) based on the IR picture signals, [0143]).

Yoshino fails to disclose that the processor is further configured to access data indicative of degree of turbidity or darkness around the in-vivo device and, based on such data, configure at least one of an imaging modality or a frame rate of the combined sensor array. However, Hsieh teaches that it is known in the in-vivo camera sensor array art (the disclosed camera imaging device can be used in an endoscope, [0181]) to access data indicative of degree of turbidity or darkness (processor, e.g. 140, 144, Fig. 2, obtains data indicative of degree of turbidity or darkness using the low light sensing capability of the sensor, [0011]) and, based on such data, control the imaging modality of the image sensor to, for example, change frame rates to allow for adequate exposure times to produce quality images ([0011]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have provided the in-vivo device of Yoshino with the ability to sense the degree of turbidity/darkness and change the imaging modality, such as a frame rate, as taught by Hsieh, in order to provide adequate exposure times for better quality images (Hsieh: [0011]).
As to claim 2, wherein the partial overlap corresponds to at least one of: the second wavelength range is completely contained within the first wavelength range and overlaps a beginning or an end of the first wavelength range (as shown in Fig. 20); the second wavelength range is completely contained within the first wavelength range and overlaps a middle portion of the first wavelength range (an IR light component exists in all of the B, G, and R ranges, [0140], and thus overlaps the middle portion and the ends of the first wavelength range); or the second wavelength range has a portion not overlapping with the first wavelength range (as shown in Fig. 20).

As to claim 3, wherein a portion of the first wavelength range does not overlap with the second wavelength range (as shown in Fig. 20).

As to claim 4, wherein the first sensor array comprises RGB sensors (array of RGB sensors, Fig. 19) and the second sensor array comprises infrared sensors (array of IR sensors, Fig. 19).

As to claim 5, wherein the second wavelength range includes near infrared (as shown in Fig. 20, the IR wavelength range includes 750+ nm wavelengths).

As to claim 6, wherein the first wavelength range includes the infrared range such that the first sensor array has at least some sensitivity in the infrared range ([0140]).
As to claim 9, Yoshino discloses a method for obtaining images by an in-vivo device having a processor (image processing device 230) and a combined sensor array (image sensor 226, Fig. 15) comprising a first sensor array (array of RGB sensors, Fig. 19) sensitive to a first wavelength range (C24, C25, C26 in Fig. 20) and a second sensor array (array of IR sensors, Fig. 19) sensitive to a second wavelength range (C27 in Fig. 20), the second wavelength range having a partial overlap with the first wavelength range (C27 overlaps C24, C25, C26, Fig. 20), the method comprising: using the combined sensor array to collect light in the first wavelength range and output a corresponding first signal (collects and outputs B, G, R picture signals, [0140]) and to collect light in the second wavelength range and output a corresponding second signal (collects and outputs IR picture signals, [0140]); receiving, by the processor, the first signal and the second signal (processor receives all picture signals for processing, [0143]); manipulating, by the processor, the first signal based on at least a part of the second signal corresponding to the partial overlap, to output a first image (generates a white light image based on the B, G, R signals by removing the IR component (manipulating) from each of these signals, [0145]); and outputting, by the processor, a second image based on the second signal (outputs a uniform image (IR image) based on the IR picture signals, [0143]).

Yoshino fails to disclose accessing data indicative of degree of turbidity or darkness around the in-vivo device and, based on such data, configuring at least one of an imaging modality or a frame rate of the combined sensor array. However, Hsieh teaches that it is known in the in-vivo camera sensor array art (the disclosed camera imaging device can be used in an endoscope, [0181]) to access data indicative of degree of turbidity or darkness (processor, e.g. 140, 144, Fig. 2, obtains data indicative of degree of turbidity or darkness (e.g. low light level) using the low light sensing capability of the sensor, [0011]) and, based on such data, control the imaging modality of the image sensor to, for example, change frame rates to allow for adequate exposure times to produce quality images ([0011]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have provided the in-vivo device of Yoshino with the ability to sense the degree of turbidity/darkness and change the imaging modality, such as a frame rate, as taught by Hsieh, in order to provide adequate exposure times for better quality images (Hsieh: [0011]).

As to claim 10, wherein the partial overlap corresponds to at least one of: the second wavelength range is completely contained within the first wavelength range and overlaps a beginning or an end of the first wavelength range (as shown in Fig. 20); the second wavelength range is completely contained within the first wavelength range and overlaps a middle portion of the first wavelength range (an IR light component exists in all of the B, G, and R ranges, [0140], and thus overlaps the middle portion and the ends of the first wavelength range); or the second wavelength range has a portion not overlapping with the first wavelength range (as shown in Fig. 20).

As to claim 11, wherein a portion of the first wavelength range does not overlap with the second wavelength range (as shown in Fig. 20).

As to claim 12, wherein the first sensor array comprises RGB sensors (array of RGB sensors, Fig. 19) and the second sensor array comprises infrared sensors (array of IR sensors, Fig. 19).

As to claim 13, wherein the second wavelength range includes near infrared (as shown in Fig. 20, the IR wavelength range includes 750+ nm wavelengths).

As to claim 14, wherein the first wavelength range includes the infrared range such that the first sensor array has at least some sensitivity in the infrared range ([0140]).

Claims 7, 8 and 15 are rejected under 35 U.S.C.
103 as being unpatentable over Yoshino et al. (US 2010/0245616, hereinafter "Yoshino") and Hsieh et al. (US 2006/0164533, hereinafter "Hsieh") as set forth above with respect to claims 1 and 9, and further in view of Khait et al. (US 2012/0271104, hereinafter "Khait").

As to claim 7, Yoshino discloses an endoscope as the in-vivo device and thus fails to disclose that the in-vivo device is a swallowable capsule. One of ordinary skill in the art will recognize that in-vivo imaging/illumination systems take on various configurations and that the teachings of such imaging/illumination systems are applicable to all configurations. Khait is just one of numerous references that teaches this ([0016], [0034]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have configured the in-vivo imaging/illumination system of Yoshino as an endoscope (as exemplified) or as a swallowable capsule-type endoscope, as a known obvious alternative equivalent configuration for the in-vivo imaging device.

As to claims 8 and 15, Yoshino, as set forth above with respect to claims 1 and 9, fails to disclose that the processor is further configured to access data indicative of motion of the in-vivo device and, based on the data, configure the imaging modality and the frame rate of the combined sensor array based on motion and degree of turbidity. However, as is known in the in-vivo imaging art, it is known to monitor the speed of motion (data indicative of motion) and change the frame rate accordingly (e.g. decrease the frame rate at slower speeds and increase the frame rate at higher speeds) (Khait: [0024]). By configuring the imaging modality (e.g. exposure time) and frame rate based on motion (Khait) and degree of turbidity/darkness (Hsieh, as set forth above), higher quality images (e.g. with proper exposure times, Hsieh) with reduced redundancy due to motion (Khait) can be obtained.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have processed the images of the Yoshino device by changing the imaging modality (e.g. exposure time) and frame rate according to speed of motion in order to improve imaging results by reducing redundancy during slow/stationary motion and preventing the overlooking of important images during fast/sudden movement.

Allowable Subject Matter

Claims 16-20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Response to Arguments

Rejections and objections from the previous Office Action that have not been repeated in this Office Action should be considered addressed or corrected, and are thus hereby withdrawn. Applicant's arguments filed December 22, 2025 have been fully considered but they are not persuasive.

As for the previous drawing objection, Applicant argues that the "not shown" subject matter is actually shown in Figure 3, citing a wavelength range of about 750nm-1050nm as one of the wavelength ranges (60 in Fig. 3) and a wavelength range of about 400nm-1050nm. Given the processor functioning in claim 1 on the first and second signals (which correspond to the first and second wavelength ranges, respectively), and the assumed validity of the first two conditions of claim 2, wherein the second wavelength range is completely contained within the first wavelength range, it is clear that the claimed first wavelength range refers to the 400nm-1050nm range and the second wavelength range refers to the 750nm-1050nm range (see paragraph [0012] of Applicant's specification). This is further supported by the language of claim 3, which recites that a portion (400nm-750nm) of the first wavelength range (400nm-1050nm) does not overlap with the second wavelength range (750nm-1050nm).
Thus, Figure 3 does not and cannot show the second wavelength range (e.g. 750nm-1050nm) having a portion NOT overlapping with the first wavelength range, based on the example wavelength ranges set forth by Applicant, because the second wavelength range is completely contained within the first wavelength range. Therefore, the drawing objection is maintained.

Regarding the previous rejection of claims 1 and 9 over Yoshino, Applicant amended claims 1 and 9 to include an alternative from claims 8 and 15, respectively, that was not previously required to be addressed. Accordingly, such alternative, in view of its combination with the other elements of claims 1 and 9, respectively, is addressed above in the §103 rejection over Yoshino in view of Hsieh.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See references cited on PTO-892.

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN P LEUBECKER, whose telephone number is (571) 272-4769. The examiner can normally be reached M-F, 5:30-2:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anhtuan T Nguyen, can be reached at 571-272-4963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JOHN P LEUBECKER/
Primary Examiner, Art Unit 3795

Prosecution Timeline

Oct 31, 2023: Application Filed
Sep 28, 2025: Non-Final Rejection (§103)
Dec 22, 2025: Response Filed
Mar 15, 2026: Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599291: OPTICAL INSTRUMENT CONFIGURED TO SWITCH BETWEEN AN INTEGRATED AND EXTERNAL LIGHT SOURCE (2y 5m to grant; granted Apr 14, 2026)
Patent 12593968: SYSTEMS AND METHODS FOR DATA COMMUNICATION VIA A LIGHT CABLE (2y 5m to grant; granted Apr 07, 2026)
Patent 12582509: DEVICE AND METHOD FOR SUBGINGIVAL MEASUREMENT (2y 5m to grant; granted Mar 24, 2026)
Patent 12580077: SYSTEMS AND METHODS FOR IDENTIFYING THE NATURE OF DEFECTS IN MEDICAL SCOPES, AND DETERMINING SERVICING AND/OR FUTURE USE OF THE SCOPES (2y 5m to grant; granted Mar 17, 2026)
Patent 12575717: DEVICE DELIVERY TOOLS AND SYSTEMS (2y 5m to grant; granted Mar 17, 2026)
Based on this examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 75%
With Interview: 85% (+10.6%)
Median Time to Grant: 3y 4m
PTA Risk: Moderate

Based on 820 resolved cases by this examiner. Grant probability derived from career allow rate.
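The projection arithmetic is simple enough to check by hand. The sketch below reproduces the headline figures from the career data above; treating the interview lift as an additive percentage-point adjustment, and the rounding conventions, are assumptions for illustration, since the report does not document its exact formula.

```python
# Sketch of how the headline projections could follow from the examiner's
# career data. Additive interview lift and round-to-integer display are
# assumptions, not the tool's documented method.

granted, resolved = 613, 820          # career outcomes (from the report)
interview_lift = 10.6                 # percentage-point lift with interview

base_rate = 100 * granted / resolved  # 74.756..., displayed as 75%
with_interview = base_rate + interview_lift

print(f"Grant probability: {base_rate:.0f}%")      # prints 75%
print(f"With interview: {with_interview:.0f}%")    # prints 85%
```

Under these assumptions, 613/820 rounds to the reported 75% career allow rate, and adding the 10.6-point lift rounds to the reported 85% with-interview figure.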
