Prosecution Insights
Last updated: April 19, 2026
Application No. 18/647,718

SYSTEMS, DEVICES, AND METHODS FOR THERMAL IMAGING

Non-Final OA — §103, §112
Filed: Apr 26, 2024
Examiner: AKAR, SERKAN
Art Unit: 3797
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Moleculight Inc.
OA Round: 1 (Non-Final)
Grant Probability: 65% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 4y 10m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 65% (265 granted / 407 resolved) — above average, though -4.9% vs TC avg
Interview Lift: +31.7% among resolved cases with interview — a strong lift
Typical Timeline: 4y 10m average prosecution; 49 applications currently pending
Career History: 456 total applications across all art units
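The headline examiner figures above follow directly from the raw counts on this page. A minimal sketch, assuming only the counts shown (265 granted, 407 resolved, 456 total); the function names are illustrative, not from any real USPTO API:

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def pending(total: int, resolved: int) -> int:
    """Applications still open: total filed minus resolved."""
    return total - resolved

rate = allow_rate(265, 407)      # ≈ 65.1%, displayed as 65%
open_cases = pending(456, 407)   # 49 currently pending

print(f"Career allow rate: {rate:.1f}%")
print(f"Currently pending: {open_cases}")
```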

Statute-Specific Performance

§101: 11.2% (-28.8% vs TC avg)
§103: 47.3% (+7.3% vs TC avg)
§102: 15.3% (-24.7% vs TC avg)
§112: 22.6% (-17.4% vs TC avg)
Deltas are relative to the Tech Center average estimate • Based on career data from 407 resolved cases
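The per-statute deltas above are relative to a Tech Center baseline, i.e. delta = examiner rate - TC average. A short sketch using only the values from the panel (no external data source is assumed) shows that every statute implies the same TC baseline:

```python
# Examiner's statute-specific overcome/allowance rates and their deltas
# vs the Tech Center average, as shown in the panel above.
rates = {"101": 11.2, "102": 15.3, "103": 47.3, "112": 22.6}
deltas = {"101": -28.8, "102": -24.7, "103": 7.3, "112": -17.4}

# Recover the implied TC average for each statute: tc_avg = rate - delta
tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(tc_avg)  # all four statutes imply the same 40.0% TC baseline
```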

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claims 27 and 32-33 are objected to because of the following informalities: Claims 27 and 32-33 recite the limitation “and/or,” which should instead be either “and” or “or” to prevent any possible ambiguity of interpretation. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 14-20 and 27 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. The term “substantially” in claims 14-16 and 20 is a relative term which renders the claim indefinite. The term “substantially” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The dependent claims are also rejected by virtue of their dependency.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 13, 21-24, and 28-33 are rejected under 35 U.S.C. 103 as being unpatentable over Dacosta et al. (US 20220061671 A1, hereinafter “Dacosta ‘671”) in view of Dacosta et al. (US 20220092770 A1, hereinafter “Dacosta ‘770”).

Regarding claim 13, Dacosta ‘671 teaches a multi-modal imaging device (“multi-modal imaging and analysis is disclosed” [0002]), comprising: a housing (see figs. 1-7, 11, and 12); a fluorescence camera configured to capture fluorescence images; a thermal imaging module (“In particular, the system and method may be suitable for collecting data regarding biochemical, biological and/or non-biological substances. The data may include, for example, one or more of white light data, fluorescent data, thermal data, infrared data, such as in wound care, for both human and animal applications” [0002]); a processor configured to receive fluorescence image data, and thermal image data (“co-register white light images, fluorescent images, thermal images and other images of a target” [0106]); and a display, wherein the display is configured to allow a user to select a reference area and a test area on one or more of a fluorescent image and a thermal image of a target and an area surrounding the target (“create three-dimensional maps of a target. The device may be configured to enhance color distinctions between different tissue types identified in an image. The device may be configured to determine tissue classification of the target based on different colors or image features captured in the fluorescent image. The device may be configured to delineate between diseased and healthy tissues therein providing a map for users to selectively remove diseased tissues while sparing surrounding healthy tissues is a targeted manner” [0106]), and wherein the processor is further configured to: output to the display of the multi-modal imaging device, an indication of the temperature and a fluorescence image containing the reference area and the test area (“A front, or user-facing side 115 of the base body portion 110 includes a display screen 120 for displaying images and videos captured by the device. Although depicted as square or rectangular, the device may take on any shape that will reasonably support a display screen such as a touchscreen display. In addition to disclosing images captured by the imaging device 100, the display screen also operates as a user interface, allowing the user to control functions of the device via touchscreen input” [0048]; “create three-dimensional maps of a target. The device may be configured to enhance color distinctions between different tissue types identified in an image. The device may be configured to determine tissue classification of the target based on different colors or image features captured in the fluorescent image” [0106]).

Dacosta ‘671 does not show the specifics of determining a difference in temperature between the areas, and outputting an indication of the difference in temperature. However, in the same field of endeavor, Dacosta ‘770 teaches that wound imaging can capture, calculate, and/or combine one or more of tissue/bacterial fluorescence, measured wound area, a thermal map of a wound, and infrared imaging of blood flow [0129], and that one or more of the other sensors described therein for three-dimensional imaging can be combined with a thermal sensor. A thermal sensor can be used to provide a thermal map to show areas of high temperature that can correlate with the location of bacteria when looking at a wound. Imaging can be combined: thermal mapping can be done with three-dimensional mapping. For example, thermal imaging can be done simultaneously with three-dimensional imaging by coupling a thermal imaging sensor to the three-dimensional imaging device; the captured thermal images or video can then be superimposed topographically on the [white] visible light/fluorescent light images using one or more fiducials. A camera used for three-dimensional imaging can, for example, capture tissue/bacterial fluorescence, measure wound area, capture a thermal map of a wound, and/or image infrared of blood flow [0141].
It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the combination of references as outlined above with determining a difference in temperature between the areas, and outputting an indication of the difference in temperature, as taught by Dacosta ‘770, because improved characterization of wounds would help a clinician better understand whether a given treatment is working and better identify which treatments would have the greatest efficacy for a particular wound ([0003] of Dacosta ‘770).

Regarding claim 21, Dacosta ‘671 teaches wherein the thermal imaging module is removably attached to the housing (see figs. 1-6 and the associated paragraphs).

Regarding claim 22, Dacosta ‘671 teaches wherein the processor is configured to apply an overlay indicating the difference in temperature to the thermal image of the target and the surrounding area (“co-register white light images, fluorescent images, thermal images and other images of a target. The device may be configured to create three-dimensional maps of a target. The device may be configured to enhance color distinctions between different tissue types identified in an image” [0106]).

Regarding claim 23, Dacosta ‘671 teaches wherein the overlay includes a first mark identifying the reference area and a second mark identifying the test area (“determine tissue classification of the target based on different colors or image features captured in the fluorescent image. The device may be configured to delineate between diseased and healthy tissues therein providing a map for users to selectively remove diseased tissues while sparing surrounding healthy tissues is a targeted manner” [0106]).

Regarding claim 24, Dacosta ‘671 teaches wherein the processor is configured to apply a label indicative of the difference in temperature on the thermal image (“qualitative parameters include wound dimensions, tissue type, and the amount of exudate or discharge, and thermal readings” [0035]).

Regarding claim 28, Dacosta ‘671 teaches wherein the thermal module includes: a thermal sensor configured to receive thermal radiation from the target and the surrounding area and to output an electrical signal representing the received thermal radiation (“optical housings may also include, in any combination, features such as an ambient light sensor, a range finder, thermal imaging sensors, structured light emitters, an infrared radiation source and detector to be used for three-dimensional imaging, lasers for taking measurements, etc” [0044]; “thermal sensor configured to detect thermal information regarding the target surface.” See claim 77); and an accessory housing configured for removable attachment to the housing of the multi-modal imaging device (see, e.g., fig. 12).

Regarding claim 29, Dacosta ‘671 teaches wherein the accessory housing includes a surface feature configured to receive an attachment for a surgical drape (see, e.g., fig. 15).

Regarding claim 30, Dacosta ‘671 teaches wherein the target is a wound in tissue (“Wound progression is currently monitored manually” [0035]).

Regarding claim 31, Dacosta ‘671 teaches wherein the indication of the difference in temperature is a thermal map or a thermal image and the thermal map or thermal image is co-registered and displayed side-by-side with the fluorescence image (see fig. 8).

Regarding claim 32, Dacosta ‘671 teaches wherein the processor is configured to co-register, co-localize, and/or overlay the white light image data, the fluorescence image data, and the thermal image data (see figs. 8-9 and particularly 10, as well as the associated paragraphs).

Regarding claim 33, Dacosta ‘671 teaches wherein the processor is configured to display the co-registered, co-localized, and/or overlaid image data (see figs. 8-9 and particularly 10, as well as the associated paragraphs).

Claims 14-19 and 27 are rejected under 35 U.S.C. 103 as being unpatentable over Dacosta ‘671 in view of Dacosta ‘770 and further in view of Moghadam (US 20160006951 A1).

Regarding claim 14, Dacosta ‘671 teaches further comprising a white light camera configured to capture white light images (“white light cameras” [0073]; “sensors configured for white light imaging” [0054]), wherein the processor is configured to receive white light image data (“white light data, fluorescent data, thermal data, infrared data, such as in wound care, for both human and animal applications” [0002]), wherein the display is configured to allow the user to select (“A front, or user-facing side 115 of the base body portion 110 includes a display screen 120 for displaying images and videos captured by the device. Although depicted as square or rectangular, the device may take on any shape that will reasonably support a display screen such as a touchscreen display. In addition to disclosing images captured by the imaging device 100, the display screen also operates as a user interface, allowing the user to control functions of the device via touchscreen input” [0048]) the reference area and the test area on one or more of a white light image, the fluorescence image, and the thermal image (“co-register white light images, fluorescent images, thermal images and other images of a target” [0106]). The combination noted above does not teach that the field of view of the white light camera is substantially the same as a field of view of the thermal imaging module.
However, in the same field of endeavor, Moghadam teaches a system for generating a three-dimensional model of an object, the system including a portable hand-held imaging device including a housing, a plurality of sensors attached to the housing, and at least one electronic processing device coupled to the plurality of sensors (Abstract), with a thermal infrared image sensor attached to the housing, wherein the range sensor, the visible light image sensor, and the thermal infrared image sensor have overlapping fields of view (claim 20). It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the combination of references as outlined above so that the field of view of the white light camera is substantially the same as a field of view of the thermal imaging module, as taught by Moghadam, because there is a need for an improved three-dimensional imaging method and system ([0014] of Moghadam).

Regarding claim 15, the combination noted above teaches all the claimed limitations except that the field of view of the fluorescence camera is substantially the same as a field of view of the thermal imaging module. However, in the same field of endeavor, Moghadam teaches a system for generating a three-dimensional model of an object, the system including a portable hand-held imaging device including a housing, a plurality of sensors attached to the housing, and at least one electronic processing device coupled to the plurality of sensors (Abstract), with a thermal infrared image sensor attached to the housing, wherein the range sensor, the visible light image sensor, and the thermal infrared image sensor have overlapping fields of view (claim 20). It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the combination of references as outlined above so that the cameras have fields of view substantially the same as each other, as taught by Moghadam, because there is a need for an improved three-dimensional imaging method and system ([0014] of Moghadam).

Regarding claim 16, the combination noted above teaches all the claimed limitations except that the field of view of the white light camera is substantially the same as a field of view of the thermal imaging module. However, in the same field of endeavor, Moghadam teaches a system for generating a three-dimensional model of an object, the system including a portable hand-held imaging device including a housing, a plurality of sensors attached to the housing, and at least one electronic processing device coupled to the plurality of sensors (Abstract), with a thermal infrared image sensor attached to the housing, wherein the range sensor, the visible light image sensor, and the thermal infrared image sensor have overlapping fields of view (claim 20). It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the combination of references as outlined above so that the field of view of the white light camera is substantially the same as a field of view of the thermal imaging module, as taught by Moghadam, because there is a need for an improved three-dimensional imaging method and system ([0014] of Moghadam).

Regarding claim 17, Dacosta ‘671 teaches wherein the white light camera is a stereoscopic camera (“second WL image sensor/camera sensor may be used as part of a stereoscopic” [0047]).

Regarding claim 18, Dacosta ‘671 teaches wherein the white light camera is adjacent to the fluorescence camera on the housing in a first direction (see light module, figs. 1-5).

Regarding claim 19, Dacosta ‘671 teaches wherein the thermal imaging module is adjacent to the white light camera and fluorescence camera on the housing in a second direction orthogonal to the first direction (see light module, figs. 1-5).

Regarding claim 27, Dacosta ‘671 teaches wherein the processor is configured to co-register, co-localize, and/or overlay the white light image data, the fluorescence image data, and the thermal image data (“co-register white light images, fluorescent images, thermal images and other images of a target. The device may be configured to create three-dimensional maps of a target. The device may be configured to enhance color distinctions between different tissue types identified in an image” [0106]).

Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Dacosta ‘671 in view of Dacosta ‘770 and further in view of Schoch et al. (US 20160076936 A1).

Regarding claim 20, the combination noted above teaches all the claimed limitations except wherein the thermal imaging module is movable between a first position and a second position. However, in the same field of endeavor of thermal measurements, Schoch teaches that infrared sensor array 342 can include one or more thermal detectors, and that when the imaging tool is engaged with the first surface of the test and measurement tool, the sensor array of the imaging tool is movable relative to the imaging tool such that the target scene is adjustable without movement of the test and measurement tool [0003]. It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the combination of references as outlined above with a thermal imaging module movable between a first position and a second position, as taught by Schoch, because orienting the test and measurement tool to acquire desired image data may make performing a function of the test and measurement tool difficult or impossible ([0002] of Schoch).
Claims 25 and 26 are rejected under 35 U.S.C. 103 as being unpatentable over Dacosta ‘671 in view of Dacosta ‘770 and further in view of Spahn (US 20190236775).

Regarding claim 25, the combination noted above teaches all the claimed limitations except setting a temperature of the reference area, and wherein outputting the indication of the difference in temperature includes outputting an indication of whether the test area is warmer or cooler than the reference area. However, in the same field of endeavor, Spahn teaches a system for determining a clinically relevant temperature differential between a predetermined area of interest on the body surface of a mammal and a control area on the body surface of said mammal, said system comprising a visual and thermal image capturing device (Abstract). The user can click on the plot and the corresponding location on the image is highlighted by the system; a user can then place the unaffected reference point in that location or choose a different one if the user so desires [0215]. Even though the thermal images provide a more in-depth definition of the area of interest than the digital image, it becomes harder to differentiate between small variations in temperature, as it is difficult to differentiate between shades of gray; the entire thermal image is made up of 254 different shades of gray as shown in FIG. 5 [0216]. To make visual differences between temperature variations greater, the method includes incorporating a unique color for each pixel value to generate a custom color bar as shown in FIG. 6 [0217]. The method for applying a custom color bar to the gray-scale thermal image comprises choosing an unaffected reference area such that the temperature variation within the area is less than 1.5 degrees Celsius; finding the average of all the pixel values that fall within the unaffected reference area, called the reference mean; generating a matrix that holds the R, G, and B values for the new custom colors; assigning each pixel in the image a pixel value; and calculating the difference between the current pixel value and the reference mean [0240]. It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the combination of references as outlined above with setting a temperature of the reference area, and wherein outputting the indication of the difference in temperature includes outputting an indication of whether the test area is warmer or cooler than the reference area, as taught by Spahn, because accurate and repeatable measurement of size is essential for documenting progression or regression of the wound and/or area of interest ([0056] of Spahn).

Regarding claim 26, the combination noted above teaches all the claimed limitations except that a temperature of zero on the relative color scale corresponds to the temperature of a control area. However, in the same field of endeavor, Spahn teaches a system for determining a clinically relevant temperature differential between a predetermined area of interest on the body surface of a mammal and a control area on the body surface of said mammal, said system comprising a visual and thermal image capturing device (Abstract). The method for applying a custom color bar to the gray-scale thermal image comprises choosing an unaffected reference area such that the temperature variation within the area is less than 1.5 degrees Celsius; finding the average of all the pixel values that fall within the unaffected reference area, called the reference mean; generating a matrix that holds the R, G, and B values for the new custom colors; assigning each pixel in the image a pixel value; and calculating the difference between the current pixel value and the reference mean [0240]. It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the combination of references as outlined above with applying a custom color bar for temperatures so that the temperature of zero on the relative color scale corresponds to the temperature of a control area, as taught by Spahn, because accurate and repeatable measurement of size is essential for documenting progression or regression of the wound and/or area of interest ([0056] of Spahn).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SERKAN AKAR, whose telephone number is (571) 270-5338. The examiner can normally be reached 9am-5pm M-F.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Koharski, can be reached at 571-272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SERKAN AKAR/
Primary Examiner, Art Unit 3797
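The Spahn reference-mean technique cited against claims 25-26 (average the pixels of an unaffected reference region, then express every pixel as a difference from that mean) can be sketched in a few lines. This is an illustrative reading of the paragraph [0240] summary above, not Spahn's actual implementation; the array shapes and names are hypothetical:

```python
import numpy as np

def difference_map(thermal: np.ndarray, ref_mask: np.ndarray) -> np.ndarray:
    """Per-pixel difference from the mean of the unaffected reference region.

    Positive values are warmer than the reference; negative values are cooler.
    """
    reference_mean = thermal[ref_mask].mean()
    return thermal - reference_mean

# Toy 2x2 "thermal image" with the left column as the unaffected reference
img = np.array([[30.0, 33.0],
                [30.0, 31.0]])
mask = np.array([[True, False],
                 [True, False]])
print(difference_map(img, mask))  # reference mean is 30.0
```

A color bar would then be built over these signed differences, with zero mapped to the reference (control) temperature.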

Prosecution Timeline

Apr 26, 2024 — Application Filed
Feb 27, 2026 — Non-Final Rejection, §103 and §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594122
CONTEXT AWARE SURGICAL SYSTEMS AND METHODS VIA HYPERSPECTRAL IMAGE ANALYSIS TO CONFIGURE A DEVICE DURING MEDICAL PROCEDURE
Granted Apr 07, 2026 — 2y 5m to grant

Patent 12589263
THERAPEUTIC FOCUSED ULTRASOUND SYSTEMS AND METHODS HAVING TREATMENT BLOCKS THAT ARE ROTATABLE AROUND REFERENCE AXIS FOR INDEPENDENT PHASE AND AMPLITUDE CONTROL
Granted Mar 31, 2026 — 2y 5m to grant

Patent 12576213
Multiple Dosage Injector with Rack and Pinion Dosage System having Ram that includes a Lockout Protrusion
Granted Mar 17, 2026 — 2y 5m to grant

Patent 12575736
METHOD AND SYSTEM FOR ESTIMATING PHYSIOLOGICAL INFORMATION VIA SET OF LEDS AND PHOTODETECTORS BY DETERMINING A CORRECTION PROFILE BASED ON A RATIO
Granted Mar 17, 2026 — 2y 5m to grant

Patent 12564330
MACHINE LEARNING METHOD FOR PREDICTING FRACTIONAL FLOW RESERVE FOR OPTICAL COHERENCE TOMOGRAPHY
Granted Mar 03, 2026 — 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 65%
With Interview: 97% (+31.7%)
Median Time to Grant: 4y 10m
PTA Risk: Low
Based on 407 resolved cases by this examiner. Grant probability derived from career allow rate.
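The "With Interview" projection is consistent with adding the examiner's interview lift to the base grant probability. A minimal sketch under that assumption; the 65.1% base (265/407) and +31.7 point lift are from this page, and the cap at 100% is an assumption of the sketch:

```python
def with_interview(base_pct: float, lift_pct: float) -> int:
    """Projected grant probability with an interview, capped at 100%."""
    return min(round(base_pct + lift_pct), 100)

print(with_interview(65.1, 31.7))  # → 97, matching the projection above
```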
