Prosecution Insights
Last updated: April 19, 2026
Application No. 19/053,662

MEDICAL IMAGING DEVICE

Non-Final OA: §101, §103
Filed: Feb 14, 2025
Examiner: STOLTENBERG, DAVID J
Art Unit: 3685
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Welch Allyn Inc.
OA Round: 1 (Non-Final)

Grant Probability: 57% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 7m
With Interview: 82%

Examiner Intelligence

Career Allow Rate: 57% (299 granted / 522 resolved; +5.3% vs TC avg)
Interview Lift: strong, +24.9% higher allowance rate among resolved cases with an interview
Avg Prosecution: 3y 7m typical timeline; 23 applications currently pending
Total Applications: 545 across all art units
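
For readers checking the arithmetic, the headline figures above reduce to two simple calculations: the career allow rate is granted cases divided by resolved cases, and the with-interview probability adds the reported lift to that base rate. A minimal Python sketch, assuming this simple additive model (the variable names are illustrative, not from any published API):

# Arithmetic behind the Examiner Intelligence figures (illustrative sketch).
# Assumes allow rate = granted / resolved, and the "with interview" figure
# is the base rate plus the reported absolute lift, capped at 100%.
granted = 299            # cases granted by this examiner
resolved = 522           # total resolved cases
interview_lift = 0.249   # +24.9% absolute lift with an interview

allow_rate = granted / resolved                          # ~0.573 -> shown as 57%
with_interview = min(allow_rate + interview_lift, 1.0)   # ~0.822 -> shown as 82%

print(f"Career allow rate: {allow_rate:.1%}")
print(f"Grant probability with interview: {with_interview:.1%}")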

Statute-Specific Performance

§101: 31.6% (-8.4% vs TC avg)
§103: 37.0% (-3.0% vs TC avg)
§102: 13.5% (-26.5% vs TC avg)
§112: 10.8% (-29.2% vs TC avg)
Tech Center averages shown for comparison are estimates • Based on career data from 522 resolved cases
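
The "vs TC avg" deltas are internally consistent: subtracting each delta from the examiner's rate implies the same Tech Center baseline of roughly 40% for all four statutes. A short back-calculation sketch, assuming delta = examiner rate minus TC average (purely illustrative):

# Back-calculate the implied Tech Center average for each statute.
# Assumes the dashboard's delta is simply (examiner rate - TC average).
stats = {  # statute: (examiner allowance rate %, delta vs TC avg %)
    "101": (31.6, -8.4),
    "103": (37.0, -3.0),
    "102": (13.5, -26.5),
    "112": (10.8, -29.2),
}

for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta   # works out to 40.0% for every statute listed
    print(f"§{statute}: examiner {rate:.1f}%, implied TC average {tc_avg:.1f}%")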

Office Action

§101 §103
DETAILED CORRESPONDENCE

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This non-final first office action on the merits is in response to the Patent Application filed on 14 February 2025. Claims 1-20 are pending and considered below.

Priority

Applicant's claim for the benefit of a prior-filed application under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, or 365(c) is acknowledged. Applicant's claim of priority to the provisional application filed on 20 February 2024 is acknowledged. Therefore the instant invention is accorded a priority date of 20 February 2024.

Claim Rejections - 35 USC § 101

As a result of evaluation of the instant invention under the requirements of the 2019 PEG Revised Step 2A Prongs One and Two and MPEP 2106, the Examiner determines the instant invention is directed to a judicial exception and is further directed to a practical application and improvement to computer operations. Therefore the instant invention is determined eligible under the requirements of the statute.

As a result of analysis under the Revised Step 2A Prong One and MPEP 2106, the Examiner determines the instant invention is directed to a judicial exception as related to managing personal behavior or relationships or interactions between people, including social activities, teaching, and following rules or instructions, as well as mental processes including concepts performed in the human mind, including observation, evaluation, judgment, and opinion.

As a result of analysis under Revised Step 2A Prong Two and MPEP 2106, the Examiner determines that the instant invention is directed to a practical application and improvement to computer functioning because the instant independent claims, as related to the detailed processing presented in the written description, clearly detail optimized computational processing of data as related to implementing cameras for the capture of images and the further processing of the images as related to a type of optical viewing device, as well as the capture of user interactions with the workflow and the generation of recommendations based upon the processing of captured data. The Examiner's conclusion is guided at least by disclosures of the written description as detailed at paragraphs [48]-[58].

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C.
103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention. Claim(s) 1, 2, 5-8, 11-15, and 18-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Perkins et al. (20240388777) and Small et al. (20230240523). Claims 1 and 14: Perkins discloses an imaging device for capturing images through an eyepiece of an optical viewing device, the imaging device comprising: at least one processing device ([4 “imaging device comprising: a housing for attachment to the optical viewing device; at least one processing device housed inside the housing; and at least one computer-readable data storage device storing software instructions that, when executed by the at least one processing device, cause the at least one processing device to: detect attachment to the optical viewing device; determine a type of the optical viewing device; and adjust at least one aspect of the imaging device based on the type of the optical viewing device,” 6, 89, 90, Fig. 16]); and at least one computer readable data storage device storing software instructions that, when executed by the at least one processing device ([89-101]), cause the imaging device to: determine a type of the optical viewing device ([27 “FIG. 1 shows examples of different types of optical viewing devices 100. For example, the optical viewing devices 100 include a first type of optical viewing device 102 such as an otoscope, a second type of optical viewing device 104 such as an ophthalmoscope, and a third type of optical viewing device 106 such as a dermatoscope,” 28-30]); present a workflow on the display based on the type of the optical viewing device ([66 “a user interface displayed on the display screen 404. In further examples, operation 1106 can include displaying a workflow on the display screen 404 that is specialized for capturing images for an ear exam,” 67 “operation 1106 can include displaying a workflow that can include automatic ear detection based on identifying a location of an ear drum or other anatomy of the ear. In some examples, the workflow automatically captures an image of the ear (without user input) when the workflow detects the ear drum or other anatomy of the ear. 
In some examples, the imaging device 400, 400b labels anatomical structures, such as acute otitis media (AOM) or tympanic perforation, and alert the user when such structure or condition is identified,” 68 “operation 1108 can include displaying a workflow on the display screen 404 that is specialized for capturing images for an eye exam,” 69-74]); and capture metrics related to user interactions with the workflow presented on the display and the images captured by the camera during the workflow ([48 “imaging device 400 includes a display screen 404 for displaying the images captured by the camera 410. In some examples, the display screen 404 is a touchscreen such that it can both display the images, and receive inputs from a user. For example, the display screen 404 can be used by a user of the imaging device to: adjust the settings of the camera 410 (e.g., focus, exposure, white balance, FOV/zoom); tapping the display screen 404 to trigger focus and lock; adjust settings of the display screen 404 such as the screen brightness; provide a virtual keyboard to type in information; display a battery-life indicator; provide video recording controls (e.g., start, stop, save, delete, review, upload; provide a sliding bar to go through video frames, pinch-zoom to enlarge; display arrow(s) to indicate image orientation; and display one or more stamps (e.g., date, time, filter info, etc.) on saved images,” 68 “operation 1108 can include displaying a workflow on the display screen 404 that is specialized for capturing images for an eye exam,” 69 “operation 1108 can include displaying a workflow that can include automatic eye detection based on identifying a location of an optic disc or other anatomy of the eye. In some examples, the workflow automatically captures an image of the eye (without user input) when the workflow detects the optic disc or other anatomy of the eye,” 70 “operation 1104 determines that the instrument head 200 is a dermatoscope, the method 1100 proceeds to an operation 1110 of adjusting at least one feature of the imaging device 400, 400b for optimal use of the imaging device 400, 400b when attached to the dermatoscope. The method 1100 can include additional operations for adjusting the features on the imaging device 400, 400b based on the type of instrument head determined in operation 1104. For example, operation 1110 can include adjusting at least one of an image size displayed on the display screen 404, a magnification of the camera 410, and a user interface displayed on the display screen 404. In further examples, operation 1110 can include displaying a workflow on the display screen 404 that is specialized for capturing images for a dermal exam,” 71-74, 94 “computing device 1600 can include one or more input devices such as the display screen 404 (in examples where the display screen 404 is a touch sensitive touchscreen), one or more physical push buttons on the housing 402 of the imaging device 400, and the camera,” 95, 96 “display screen 404 is touch sensitive and is connected to the system bus 1606 via an interface, such as a video adapter 1642. The display screen 404 includes touch sensors for receiving input from a user when the user touches the display. Such sensors can be capacitive sensors, pressure sensors, or other touch sensors,”]). 
Examiner Note: As cited to above with respect to reference Perkins the instant invention as cited discloses the determination of treatment workflows as well as providing a variety of methods for system operators to determine the workflow metrics and interact with the system to determine and execute the workflows by interacting with system related interfaces and associated images and treatment data and the choosing of particular processes. Perkins does not explicitly disclose however Small discloses: an optical viewing device including an instrument head having an eyepiece ([241 “lens 1001 may be referred to as an objective lens, lens 1002 may be referred to as a field lens, and lens 1003 may be referred to as an eyepiece lens. The field lens 1002 may be in the vicinity of an intermediate image plane. The eyepiece lens 1003 may deliver the image to the cell phone camera at a near-infinity conjugate. The camera's autofocus system may or may not make final adjustments. The objective lens 1001 and the eyepiece lens 1002 are doublets to minimize chromatic aberrations,” 259-264, 287]); a camera configured for alignment with the eyepiece of the optical viewing device for capturing the images viewed through the eyepiece ([39 “user may attach the otoscope clip to the smart device, may align a portion of the otoscope with the camera of the smart device using the alignment tab, and may secure the otoscope clip to the smart device using the screw clamp assembly to close the clamp,” 42 “user may wish to align the otoscope with the camera on the smartphone. Aligning the otoscope with the smartphone camera may provide an image that may not be impeded by a component of the otoscope. Aligning the otoscope with the smartphone camera may help keep stray light out of the image (e.g., seal light out of the otoscope) to improve an image quality,” 43-45, 241 “lens 1001 may be referred to as an objective lens, lens 1002 may be referred to as a field lens, and lens 1003 may be referred to as an eyepiece lens,” Figs. 1A-1C]); a display for displaying the images captured by the camera ([43 “front side of the smart device may comprise a camera, such as the camera 107, which may be directed towards a user, and a display screen, such as the display,” 44, 51 “alignment aperture may cause the viewing portion 108 of the otoscope to be aligned with a smart device camera when the alignment aperture is aligned with a corresponding shape within an image on a display of the smart device,” 62-66]). Therefore it would be obvious for Perkins to implement an optical viewing device including an instrument head having an eyepiece, a camera configured for alignment with the eyepiece of the optical viewing device for capturing the images viewed through the eyepiece, and a display for displaying the images captured by the camera as per the steps of Small in order to collect visible data reflective of conditions as related to patient metrics and processing the data to determine a medical condition of the individual and providing feedback on ways to treat the detected conditions. 
Claims 2 and 15: Perkins in view of Small discloses the imaging device of claims 1 and 14 above and Small further discloses wherein the instructions, when executed by the at least one processing device, further cause the imaging device to: generate a recommendation based on the metrics ([43 “front side of the smart device may comprise a camera, such as the camera 107, which may be directed towards a user, and a display screen, such as the display,” 80 “provide guidance as to how to move the otoscope clip device 100 along the horizontal axis, the horizontal alignment image feature 206 may have one or more reference points. For example, the horizontal alignment image feature 206 may be a number of dots along a vertical axis, a line along the vertical axis, a shape elongated along the vertical axis, an indicator along the vertical axis,” 81 “align the otoscope with the smart device camera along the horizontal axis, a user may attach otoscope clip device 100 on smart device 102 such that the horizontal alignment tab feature 212 may be aligned with horizontal alignment image feature,” 82 “user may use the vertical alignment image feature 208 to determine how to move the otoscope clip device 100 to achieve alignment. The vertical alignment image feature 208 may be a line, an object, a shape, an indicator, an icon, and/or the like. For example, the vertical alignment image feature 208 may be an oval elongated along the horizontal axis,” 83, 97, 102, 163 “ user may turn the knob 114 in a clockwise direction so that the alignment tab 106 may move towards a parallel surface of the clip assembly 122. The knob 114 may cause the alignment tab 106 to move towards the clip engagement member 118 such that alignment tab 106 and the clip engagement member 118 may clamp onto the smart device. A user may turn the knob 114 in a counterclockwise direction so that the alignment tab 106 may move away from a parallel surface of the clip assembly,”]). Examiner Note: Examiner under a broadest reasonable interpretation interprets the disclosures of Small with respect to the implementation of system provided recommendations with respect to collected metrics as derived from information as collected from sensors and associated metrics to disclose the above limitation. Therefore it would be obvious for Perkins to generate a recommendation based on the metrics as per the steps of Small in order to collect visible data reflective of conditions as related to patient metrics and processing the data to determine a medical condition of the individual and providing feedback on ways to treat the detected conditions. 
Claims 5 and 18: Perkins in view of Small discloses the imaging device of claims 2 and 15 above and Perkins further discloses wherein the recommendation is to adjust one or more settings on the imaging device ([66 “When operation 1104 determines the instrument head 200 is an otoscope, the method 1100 proceeds to an operation 1106 of adjusting at least one feature of the imaging device 400, 400b for optimal use of the imaging device 400, 400b when attached to the otoscope,” 67, 68 “When operation 1104 determines the instrument head 200 is an ophthalmoscope, the method 1100 proceeds to operation 1108 of adjusting at least one feature of the imaging device 400, 400b for optimal use of the imaging device 400, 400b when attached to the ophthalmoscope,” 69, 70 “When operation 1104 determines that the instrument head 200 is a dermatoscope, the method 1100 proceeds to an operation 1110 of adjusting at least one feature of the imaging device 400, 400b for optimal use of the imaging device 400, 400b when attached to the dermatoscope,” 71-74]). Claims 6 and 12: Perkins in view of Small discloses the imaging device of claims 2 and 8 above and Perkins further discloses wherein the instructions, when executed by the at least one processing device, further cause the imaging device to: provide a video on the display that is selectable for playback, the video describing the recommendation to improve competency of the imaging device ([29 “imaging device 400 is attached to the instrument head 200 of each type of optical viewing device 100. The imaging device 400 is a portable, battery powered camera that can record high quality image frames and videos from the optical viewing devices 100, providing digital imaging solutions,” 30 “the imaging device 400 transmits images, videos, and other data to an external system 600, which analyzes the images, videos, and other data to generate one or more results for transmission back to the imaging device 400. The external system 600 can be remotely located with respect to the optical viewing device 100 and the imaging device,” 31 “the external system 600 may also host storage of the images, videos, and other data received from the imaging device 400. In further examples, the external system 600 can host the EMR of the patient. In yet further examples, the external system 600 may provide connectivity to other external systems and servers having image storage,” 54 “automatically performed by the imaging device 400, 400b without requiring any input or feedback from a user, thereby improving the usability of the imaging device 400, 400b by having one or more features of the imaging device 400, 400b automatically adjusted based on the type of optical viewing device attached thereto,” 72 “images from the otoscope under higher zoom can move around the display screen 404 in an unstable manner, such that operation 1106 can include automatically centering the images to improve the usability of the imaging device 400, 400b when the imaging device 400, 400b is attached to an otoscope for examining the ears of a patient,” 73 “workflow can be optimized for capturing images of one or more anatomical areas based on the type of instrument head determined in operation,”]). 
Examiner Note: Examiner under a broadest reasonable interpretation interprets the disclosures at least at paragraph [72] and as well the other information included in the other paragraphs to disclose the improvement of the operability of the video imaging device which Examiner interprets to be the improvement of the competency of the video imaging device. Claims 7, 13, and 19: Perkins in view of Small discloses the imaging device of claims 2, 8, and 15 above and Perkins further discloses wherein the instructions, when executed by the at least one processing device, further cause the imaging device to: provide an input on the display that when selected causes the recommendation to be implemented on the imaging device ([29 “the imaging device 400 captures images through an eyepiece of the instrument head 200 for display on a display screen 404 (see FIG. 4) for viewing by a physician. The images captured by the imaging device 400 can be analyzed by algorithms (including artificial intelligence algorithms) for disease screening, and the images can be stored in an electronic medical record (EMR) of a patient,” 30 “imaging device 400 transmits images, videos, and other data to an external system 600, which analyzes the images, videos, and other data to generate one or more results for transmission back to the imaging device,” 31, 48 “imaging device 400 includes a display screen 404 for displaying the images captured by the camera 410. In some examples, the display screen 404 is a touchscreen such that it can both display the images, and receive inputs from a user. For example, the display screen 404 can be used by a user of the imaging device to: adjust the settings of the camera 410 (e.g., focus, exposure, white balance, FOV/zoom); tapping the display screen 404 to trigger focus and lock; adjust settings of the display screen 404 such as the screen brightness; provide a virtual keyboard to type in information; display a battery-life indicator; provide video recording controls (e.g., start, stop, save, delete, review, upload; provide a sliding bar to go through video frames, pinch-zoom to enlarge; display arrow(s) to indicate image orientation; and display one or more stamps (e.g., date, time, filter info, etc.) on saved images,” 49 “display screen,” 50-52]). Examiner Note: Examiner under a broadest reasonable interpretation interprets the disclosure of Perkins with respect to a wide range of the provision of image displays and the provision of a wide range of recommendations with respect to treatment methods and a wide variety of physical conditions. Claim 8: Perkins discloses a method of capturing images through an eyepiece of an optical viewing device, and Perkins further discloses the method comprising: presenting a workflow on the imaging device for capturing the images based on a type of optical viewing device, the workflow including tools for annotating the images on a display ([66 “operation 1106 can include adjusting at least one of an image size displayed on the display screen 404, a magnification of the camera 410, and a user interface displayed on the display screen 404. In further examples, operation 1106 can include displaying a workflow on the display screen 404 that is specialized for capturing images for an ear exam,” 67 “operation 1106 can include displaying a workflow that can include automatic ear detection based on identifying a location of an ear drum or other anatomy of the ear. 
In some examples, the workflow automatically captures an image of the ear (without user input) when the workflow detects the ear drum or other anatomy of the ear. In some examples, the imaging device 400, 400b labels anatomical structures, such as acute otitis media (AOM) or tympanic perforation, and alert the user when such structure or condition is identified,” 68 “When operation 1104 determines the instrument head 200 is an ophthalmoscope, the method 1100 proceeds to operation 1108 of adjusting at least one feature of the imaging device 400, 400b for optimal use of the imaging device 400, 400b when attached to the ophthalmoscope,” 69 “operation 1108 can include displaying a workflow that can include automatic eye detection based on identifying a location of an optic disc or other anatomy of the eye. In some examples, the workflow automatically captures an image of the eye (without user input) when the workflow detects the optic disc or other anatomy of the eye. In some further examples, the imaging device 400, 400b can also label anatomical structures, such as papilledema or glaucomatous disc, and alert the user when such structure is identified,”]) Examiner Note: Examiner under a broadest reasonable interpretation interprets the disclosures of Perkins with respect to the detection and labeling or annotating the collected images with respect to a wide variety of detected parameters to disclose the claimed limitation with respect to the type of device and methods of operation. ; capturing metrics related to user interactions with the workflow, and quality characteristics of the images captured during the workflow ([48 “imaging device 400 includes a display screen 404 for displaying the images captured by the camera 410. In some examples, the display screen 404 is a touchscreen such that it can both display the images, and receive inputs from a user. For example, the display screen 404 can be used by a user of the imaging device to: adjust the settings of the camera 410 (e.g., focus, exposure, white balance, FOV/zoom); tapping the display screen 404 to trigger focus and lock; adjust settings of the display screen 404 such as the screen brightness; provide a virtual keyboard to type in information; display a battery-life indicator; provide video recording controls (e.g., start, stop, save, delete, review, upload; provide a sliding bar to go through video frames, pinch-zoom to enlarge; display arrow(s) to indicate image orientation; and display one or more stamps (e.g., date, time, filter info, etc.) on saved images,” 68 “operation 1108 can include displaying a workflow on the display screen 404 that is specialized for capturing images for an eye exam,” 69 “operation 1108 can include displaying a workflow that can include automatic eye detection based on identifying a location of an optic disc or other anatomy of the eye. In some examples, the workflow automatically captures an image of the eye (without user input) when the workflow detects the optic disc or other anatomy of the eye,” 70 “operation 1104 determines that the instrument head 200 is a dermatoscope, the method 1100 proceeds to an operation 1110 of adjusting at least one feature of the imaging device 400, 400b for optimal use of the imaging device 400, 400b when attached to the dermatoscope. The method 1100 can include additional operations for adjusting the features on the imaging device 400, 400b based on the type of instrument head determined in operation 1104. 
For example, operation 1110 can include adjusting at least one of an image size displayed on the display screen 404, a magnification of the camera 410, and a user interface displayed on the display screen 404. In further examples, operation 1110 can include displaying a workflow on the display screen 404 that is specialized for capturing images for a dermal exam,” 71-74, 94 “computing device 1600 can include one or more input devices such as the display screen 404 (in examples where the display screen 404 is a touch sensitive touchscreen), one or more physical push buttons on the housing 402 of the imaging device 400, and the camera,” 95, 96 “display screen 404 is touch sensitive and is connected to the system bus 1606 via an interface, such as a video adapter 1642. The display screen 404 includes touch sensors for receiving input from a user when the user touches the display. Such sensors can be capacitive sensors, pressure sensors, or other touch sensors,”]). Examiner Note: As cited to above with respect to reference Perkins the instant invention as cited discloses the determination of treatment workflows as well as providing a variety of methods for system operators to determine the workflow metrics and interact with the system to determine and execute the workflows by interacting with system related interfaces and associated images and treatment data and the choosing of particular processes]). Perkins does not explicitly disclose however Small discloses: providing an imaging device for attachment to the optical viewing device ([39 “attach the otoscope clip to the smart device, may align a portion of the otoscope with the camera of the smart device using the alignment tab, and may secure the otoscope clip to the smart device using the screw clamp assembly to close the clamp. The user may then record an image of an outer ear, a middle ear, and/or an inner ear of a patient and may provide that image to a physician for diagnosis,” 41 “smart device may be used to provide a camera for an otoscope. The smart device may be a smartphone, a smart tablet (e.g., an iPad), a computer, and/or the like. The smart device may include a camera, which the otoscope may use to take an image. The camera on the smartphone may provide a cost-effective method of providing the otoscope with the camera,” 42, 43, 44 “smart device 102 may be a smartphone, a smart tablet (e.g., an iPad), a computer, and/or the like. The smart device 102 may comprise a display, such as the display 103. The display 103 may be a liquid crystal display (LCD) located on the front-facing portion of the smart device. The display 103 may show an alignment image 104. The alignment image 104 may assist the user in aligning a viewing portion of an otoscope with a camera of the smart device,” 45-52, Figs. 1A, 1B, 1C]). generating a recommendation based on the metrics ([43 “front side of the smart device may comprise a camera, such as the camera 107, which may be directed towards a user, and a display screen, such as the display,” 80 “provide guidance as to how to move the otoscope clip device 100 along the horizontal axis, the horizontal alignment image feature 206 may have one or more reference points. 
For example, the horizontal alignment image feature 206 may be a number of dots along a vertical axis, a line along the vertical axis, a shape elongated along the vertical axis, an indicator along the vertical axis,” 81 “align the otoscope with the smart device camera along the horizontal axis, a user may attach otoscope clip device 100 on smart device 102 such that the horizontal alignment tab feature 212 may be aligned with horizontal alignment image feature,” 82 “user may use the vertical alignment image feature 208 to determine how to move the otoscope clip device 100 to achieve alignment. The vertical alignment image feature 208 may be a line, an object, a shape, an indicator, an icon, and/or the like. For example, the vertical alignment image feature 208 may be an oval elongated along the horizontal axis,” 83, 97, 102, 163 “ user may turn the knob 114 in a clockwise direction so that the alignment tab 106 may move towards a parallel surface of the clip assembly 122. The knob 114 may cause the alignment tab 106 to move towards the clip engagement member 118 such that alignment tab 106 and the clip engagement member 118 may clamp onto the smart device. A user may turn the knob 114 in a counterclockwise direction so that the alignment tab 106 may move away from a parallel surface of the clip assembly,”]). Examiner Note: Examiner under a broadest reasonable interpretation interprets the disclosures of Small with respect to the implementation of system provided recommendations with respect to collected metrics as derived from information as collected from sensors and associated metrics to disclose the above limitation. Therefore it would be obvious for Perkins to provide an imaging device for attachment to the optical viewing device and generate a recommendation based on the metrics as per the steps of Small in order to collect visible data reflective of conditions as related to patient metrics and processing the data to determine a medical condition of the individual and providing feedback on ways to treat the detected conditions. Claim 11: Perkins in view of Small discloses the imaging device of claim 9 above and Perkins does not explicitly disclose however Small discloses wherein the recommendation is to adjust one or more settings on the imaging device based on the quality characteristics of the images ([42 “Aligning the otoscope with the smartphone camera may help keep stray light out of the image (e.g., seal light out of the otoscope) to improve an image quality,” 236 “camera flash of a smartphone or other fixed or uncontrollable light source, veiling glare may cause poor image quality. This problem may be further extenuated when an otoscope is being used by a consumer or patient, rather than a doctor or medical professional, who is generally unfamiliar with the use of otoscopes and the structure of the outer ear, the middle ear, and/or the ear canal. Good image quality and contrast may help a consumer or patient effectively take an image of the outer ear, the middle ear, and/or the ear canal,” 237 “Example B shows various embodiments of the otoscope design where veiling glare has been reduced and image quality improved,” 254-257, 297 “image contrast quality is improved (e.g., greatly improved) in FIG. 16B through the reduction of veiling glare in the speculum of the inner otoscope device 1000B of FIGS. 13 and 14A-D, as compared to the image contrast quality shown in FIG. 16A as related to the inner otoscope configuration 1000A of FIG. 11. 
This enhancement of image contrast quality may allow a user to easily capture an image or video of the outer ear, the middle ear, and/or the ear canal. Such an image may be sent to a healthcare professional for evaluation and/or diagnosis. Without high image quality, such an application of tele-otoscopy may not be feasible,” 298 “distance and alignment of the various optical components of the otoscope device to the subject (e.g., outer ear, the middle ear, and/or the ear canal of a patient) may be optimized to produce higher image quality,”]). Therefore it would be obvious for Perkins wherein the recommendation is to adjust one or more settings on the imaging device based on the quality characteristics of the images as per the steps of Small in order to collect visible data reflective of conditions as related to patient metrics and processing the data to determine a medical condition of the individual and providing feedback on ways to treat the detected conditions. Claim 20: Perkins in view of Small discloses the system of claim 14 above and Perkins further discloses wherein the optical viewing device is an otoscope, an ophthalmoscope, or a dermatoscope ([27 “optical viewing devices 100 include a first type of optical viewing device 102 such as an otoscope, a second type of optical viewing device 104 such as an ophthalmoscope, and a third type of optical viewing device 106 such as a dermatoscope,”]). Claim(s) 3, 4, 9, 10, 16, and 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Perkins et al. (20240388777) and Small et al. (20230240523) and in further view of Bruce et al. (20220335080). Claims 3, 9, and 16: Perkins in view of Small discloses the imaging device of claims 2, 8, and 15 and Bruce further discloses wherein the recommendation is to upload the images to at least one of a secure network, an electronic health record system, and an overread system ([8 “Upon determining that the patient is a new patient, a new patient record can be generated for the patient, and the medical imaging data can be automatically linked to the new patient in response to receiving the request for the imaging study overread,” 19 “Providers often desire internal interpretations (overread) of submitted outside medical imaging. This facilitates comparison to any existing internal imaging that may exist. Outside interpretations may also not be available with submitted outside imaging. Requests for outside imaging overreads are submitted independently from submission of the imaging itself. This leads to additional manual workflows requiring personnel to manually match submitted imaging with orders for imaging overreads,” 23 “wired or wireless networks 110 (for example, local enterprise network or the Internet). The computer system 102 can be connected to a client device 116 over one or more wired or wireless networks 114. A modality 112 can be connected to client device 116 over one or more wired or wireless networks 115. 
Networks 114 and networks 115 may or may not be the same physical network in whole or in part,” 27 “computer system 102 can present the client device 116 with additional data entry options such as the ability to directly initiate and request overreads, merge outside imaging to an existing internal patient record, obtain outside visit information, or a variety of other custom processing options,” 36 “If an authenticated user indicated that overreads were being requested, the specific exam(s) (if multiple exams were found) to be interpreted can be indicated by the user. The user on client device 116 has the option to finalize the submission,”]). Examiner Note: The above rejection includes references to secure networks, electronic health record systems, and the implementation of overreads. Therefore it would be obvious for Perkins wherein the recommendation is to upload the images to at least one of a secure network, an electronic health record system, and an overread system as per the steps of Bruce in order to collect visible data reflective of conditions as related to patient metrics and processing the data to determine a medical condition of the individual and providing feedback on ways to treat the detected conditions Claims 4, 10, and 17: Perkins in view of Small discloses the imaging device of claims 3, 9, and 16 and Bruce further discloses wherein the instructions, when executed by the at least one processing device, further cause the imaging device to: delete the images after the images are uploaded to the at least one of the secure network, the electronic health record system, and the overread system ([8 “Upon determining that the patient is a new patient, a new patient record can be generated for the patient, and the medical imaging data can be automatically linked to the new patient in response to receiving the request for the imaging study overread,” 19 “Providers often desire internal interpretations (overread) of submitted outside medical imaging. This facilitates comparison to any existing internal imaging that may exist. Outside interpretations may also not be available with submitted outside imaging. Requests for outside imaging overreads are submitted independently from submission of the imaging itself. This leads to additional manual workflows requiring personnel to manually match submitted imaging with orders for imaging overreads,” 23 “wired or wireless networks 110 (for example, local enterprise network or the Internet). The computer system 102 can be connected to a client device 116 over one or more wired or wireless networks 114. A modality 112 can be connected to client device 116 over one or more wired or wireless networks 115. Networks 114 and networks 115 may or may not be the same physical network in whole or in part,” 27 “computer system 102 can present the client device 116 with additional data entry options such as the ability to directly initiate and request overreads, merge outside imaging to an existing internal patient record, obtain outside visit information, or a variety of other custom processing options,” 36 “If an authenticated user indicated that overreads were being requested, the specific exam(s) (if multiple exams were found) to be interpreted can be indicated by the user. The user on client device 116 has the option to finalize the submission, 37 “If data to be sent is not correct (decision branch “No”), then the client device 116 user cancels submission, and the data package is destroyed. Any ongoing data transfer if present is cancelled. 
Limited metadata may be kept on the computer system 102 to facilitate support and troubleshooting, but all data files are deleted,”]). Examiner Note: As disclosed at cited paragraph [37] above, the reference Bruce discloses the deletion of data as related to system evaluation with respect to the evaluation of the correctness of the data, and therefore the rejection is maintained.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See attached References Cited form 892.

See Babson et al. (EP4079214) for disclosures related to the use of a physical assessment device otoscope which collects body-related image information and is used to assess physical parameters of patients. See at least pages 7-14.
See Babson et al. (20220233065) for disclosures related to the implementation of a physical assessment device related to the collection of data for the purpose of detecting a variety of medical conditions. See at least paras. [140]-[183].
See Zhang et al. (20210068646) for disclosures related to the implementation of an otoscope as managed by a smart phone or other handheld devices for the purpose of collecting data from sensors for medical processing. See at least paras. [9]-[25].
See Kojima (20200405135) for disclosures related to the implementation of a medical observation system implemented by generating observation light and collecting the reflected light for processing as related to wavelengths. See at least paras. [43]-[69].
See Lozano-Buhl et al. (20200237310) for disclosures related to the attaching of medical examination devices to a mobile device using a magnetic array and the associated collection and processing of data. See at least paras. [85]-[107].
See Salvati et al. (6,393,431) for disclosures related to the implementation of a hand-held imaging instrument for the purpose of detecting a variety of medically related issues as related to skin, ears, and eyes while interacting with a mobile device. See at least columns 1-3.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to David Stoltenberg whose telephone number is (571) 270-3472. The examiner can normally be reached on Monday-Friday 8:30AM to 5:00PM EST. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kambiz Abdi, can be reached at (571) 272-6702. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300, or the examiner's direct fax phone number is (571) 270-4472.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center at (866) 217-9197 (toll free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call (800) 786-9199 (IN USA OR CANADA) or (571) 272-1000.

/DAVID J STOLTENBERG/
Primary Examiner, Art Unit 3685

Prosecution Timeline

Feb 14, 2025: Application Filed
Apr 01, 2026: Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594431
AED ACTIONS REMOTELY TRIGGERED BY AED MANAGEMENT PLATFORM
2y 5m to grant • Granted Apr 07, 2026
Patent 12580054
COMPUTATIONALLY-EFFICIENT LOAD PLANNING SYSTEMS AND METHODS OF DIAGNOSTIC LABORATORIES
2y 5m to grant • Granted Mar 17, 2026
Patent 12555679
HUMIDIFICATION DEVICE COMMUNICATIONS
2y 5m to grant • Granted Feb 17, 2026
Patent 12548681
METHOD AND DEVICE FOR ADAPTIVELY DISPLAYING AT LEAST ONE POTENTIAL SUBJECT AND A TARGET SUBJECT
2y 5m to grant • Granted Feb 10, 2026
Patent 12525346
VIRTUAL CARE SYSTEMS AND METHODS
2y 5m to grant • Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 57%
With Interview: 82% (+24.9%)
Median Time to Grant: 3y 7m
PTA Risk: Low
Based on 522 resolved cases by this examiner. Grant probability derived from career allow rate.
