Prosecution Insights
Last updated: April 19, 2026
Application No. 17/763,506

DISEASED PERSON DISTINGUISHING DEVICE AND DISEASED PERSON DISTINGUISHING SYSTEM

Final Rejection (§101, §103, §112)
Filed: Mar 24, 2022
Examiner: DEUTSCH, TAYLOR M
Art Unit: 3798
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Hyogo College Of Medicine
OA Round: 4 (Final)

Grant Probability: 55% (Moderate)
OA Rounds: 5-6
To Grant: 3y 2m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 55% (53 granted / 97 resolved; -15.4% vs TC avg)
Interview Lift: +37.4% for resolved cases with interview
Avg Prosecution: 3y 2m (typical timeline)
Currently Pending: 36
Total Applications: 133 (across all art units)

Statute-Specific Performance

§101: 8.8% (-31.2% vs TC avg)
§103: 55.1% (+15.1% vs TC avg)
§102: 20.9% (-19.1% vs TC avg)
§112: 14.0% (-26.0% vs TC avg)

Tech Center averages are estimates; based on career data from 97 resolved cases.
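The headline figures above reduce to simple arithmetic. A minimal sketch, assuming the allow rate is granted cases over resolved cases and back-computing the implied Tech Center average (about 70%) from the -15.4 point delta shown; both the rounding and the implied average are inferences, not dashboard definitions:

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

career = allow_rate(53, 97)     # ~54.6%, displayed rounded as 55%
tc_avg = career + 15.4          # implied Tech Center average, ~70%
delta_vs_tc = career - tc_avg   # reproduces the -15.4 point gap
```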

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This Office action is in response to the communications filed on 07/17/2025, concerning Application No. 17/763,506. The amendments to the claims filed on 07/17/2025 are acknowledged. Presently, claims 1-13 remain pending.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Claim Objections

Claims 4 and 10-11 are objected to because of the following informalities: Claim 4, lines 8-9, the limitation “derives the temperature patterns” should be changed to “derives the plurality of temperature patterns” to maintain consistent terminology throughout the claims; Claim 10, lines 6-7, the limitation “the temperature patterns” should be changed to “the plurality of temperature patterns” to maintain consistent terminology throughout the claims; and Claim 11, lines 5-6, the limitation “the temperature patterns” should be changed to “the plurality of temperature patterns” to maintain consistent terminology throughout the claims. Appropriate correction is required.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f): (f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The following is a quotation of pre-AIA 35 U.S.C.
112, sixth paragraph: An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier.
Such claim limitation(s) is/are: “a temperature pattern calculation unit that calculates a plurality of temperature patterns” in claim 1 (also in claims 4-7 and 11-12); “the thermal image being input from a body surface temperature measurement device, the body surface temperature measurement device… measuring a body surface temperature of the subject” in claim 1 (also in claims 4-7 and 12); “a subject distinguishing unit that inputs the plurality of temperature patterns to a learned model” in claim 1 (also in claims 8-10 and 12-13); “a display unit that receives a determination result… and causes the received determination result to be displayed” in claim 1; “a specific site defining unit that defines the plurality of specific sites” in claim 2 (also in claims 4 and 6); “a storage unit that stores the learned model” in claim 8; “an input unit that receives a user distinction result” in claim 10 (also in claim 13); and “a temperature pattern learning unit that generates the learned model learned by means of a machine learning model” in claim 10 (also in claims 12-13).

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

The claim limitation “temperature pattern calculation unit” in claims 1, 4-7, and 11-12 has the corresponding structure described in the original specification that performs the claimed functions: Para. [0042], “In the diseased person distinguishing device 2, the functions of the specific site defining unit 21, the temperature pattern calculation unit 22, the subject distinguishing unit 24, and the temperature pattern learning unit 27 are fulfilled by the CPU 31”.
Therefore, the “temperature pattern calculation unit” has been interpreted as corresponding to a CPU and its corresponding program code of software, and equivalents thereof.

The claim limitation “body surface temperature measurement device” in claims 1, 4-7, and 12 has the corresponding structure described in the original specification that performs the claimed functions: Para. [0015-0024], specifically Para. [0016], “The body surface temperature measurement device 1 includes a lens 10, a detection element 11, an A/D converter 12, a control unit 13, a display unit 14, and an operation unit 15. The body surface temperature measurement device 1 can measure the body surface temperature of the subject at which the lens 10 is aimed.”. Therefore, the “body surface temperature measurement device” has been interpreted as corresponding to the combination of a lens, a detection element/sensor, an A/D converter, a control unit/CPU, a display unit, and an operation unit, and equivalents thereof.

The claim limitation “subject distinguishing unit” in claims 1, 8-10, and 12-13 has the corresponding structure described in the original specification that performs the claimed functions: Para. [0042], “In the diseased person distinguishing device 2, the functions of the specific site defining unit 21, the temperature pattern calculation unit 22, the subject distinguishing unit 24, and the temperature pattern learning unit 27 are fulfilled by the CPU 31”. Therefore, the “subject distinguishing unit” has been interpreted as corresponding to a CPU and its corresponding program code of software, and equivalents thereof.

The claim limitation “display unit” in claim 1 has the corresponding structure described in the original specification that performs the claimed functions: Para. [0043], “The display device 35 is, for example, a liquid crystal display monitor, and displays the result of processing performed in the computing machine 30, a thermal image, and the like to the user.
For example, the display device 35 is used for the display unit 14 of the body surface temperature measurement device 1 and the display unit 25 of the diseased person distinguishing device 2”. Therefore, the “display unit” has been interpreted as corresponding to a liquid crystal display monitor, and equivalents thereof.

The claim limitation “specific site defining unit” in claims 2, 4, and 6 has the corresponding structure described in the original specification that performs the claimed functions: Para. [0042], “In the diseased person distinguishing device 2, the functions of the specific site defining unit 21, the temperature pattern calculation unit 22, the subject distinguishing unit 24, and the temperature pattern learning unit 27 are fulfilled by the CPU 31”. Therefore, the “specific site defining unit” has been interpreted as corresponding to a CPU and its corresponding program code of software, and equivalents thereof.

The claim limitation “storage unit” in claim 8 has the corresponding structure described in the original specification that performs the claimed functions: Para. [0045], “As the nonvolatile storage 37, a hard disk drive (HDD), a solid-state drive (SSD), a flexible disk, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a magnetic tape, or a nonvolatile memory is used, for example. The nonvolatile storage 37 has recorded therein not only an operating system (OS) and various parameters but also a program for causing the computing machine 30 to function. The ROM 32 and the nonvolatile storage 37 permanently record a program, data, and the like required for operation of the CPU 31, and are used as examples of a computer-readable non-transitory recording medium storing a program executed by the computing machine 30. For example, the nonvolatile storage 37 is used for the storage unit 23 of the diseased person distinguishing device 2”.
Therefore, the “storage unit” has been interpreted as corresponding to a hard disk drive (HDD), a solid-state drive (SSD), a flexible disk, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a magnetic tape, or a nonvolatile memory, and equivalents thereof.

The claim limitation “input unit” in claims 10 and 13 has the corresponding structure described in the original specification that performs the claimed functions: Para. [0044], “As the input device 36, an operation key or an operation button is used, for example, with which the user can perform predetermined operation inputs and instructions. For example, the input device 36 is used for the operation unit 15 of the body surface temperature measurement device 1 and the input unit 26 of the diseased person distinguishing device 2”. Therefore, the “input unit” has been interpreted as corresponding to an operation key/button, and equivalents thereof.

The claim limitation “temperature pattern learning unit” in claims 10 and 12-13 has the corresponding structure described in the original specification that performs the claimed functions: Para. [0042], “In the diseased person distinguishing device 2, the functions of the specific site defining unit 21, the temperature pattern calculation unit 22, the subject distinguishing unit 24, and the temperature pattern learning unit 27 are fulfilled by the CPU 31”. Therefore, the “temperature pattern learning unit” has been interpreted as corresponding to a CPU and its corresponding program code of software, and equivalents thereof.

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a): (a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112: The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-13 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Regarding independent claims 1 and 12, the current amendments include the following new limitations that do not appear to have corresponding support in the specification: “calculates a plurality of temperature patterns” (emphasis added); “each of the plurality of temperature patterns is a graph with a horizontal axis… and a vertical axis…” (emphasis added); and “the first, second, and third temperature patterns are three separate graphs” (emphasis added). Applicant pointed to Figs. 8A, 8B, 9A, 9B and Para. [0112] of the publication in support of the current amendments. Based on Figs. 8A, 8B, 9A, and 9B, however, the disclosure appears to acquire a single pattern in one graph, wherein the single pattern covers/represents temperatures for each of the eye inner corners, nose tip, nose side, etc. Examiner further points to Para. [0059] of the publication, which states “The temperature pattern calculation unit 22 calculates a temperature pattern from the temperatures in the exposed region of the subject on the basis of the temperature data input from the body surface temperature measurement device 1 that measures the body surface temperature of the subject. The temperature pattern is expressed as a group of temperature data collected from a plurality of specific sites of the subject. For example, as illustrated in FIGS. 8 and 9 described below, the temperature pattern is a graph expressing normalized temperature values for the respective specific sites included in a region horizontally crossing the face of the subject” (emphasis added). There does not appear to be any support for “three separate graphs”, for acquiring a plurality of temperature patterns, or for “each of the plurality of temperature patterns is a graph”. For examination purposes, Examiner herein interprets these new claim amendments as if being taught in Figs.
8A, 8B, 9A, and/or 9B and the corresponding disclosure, wherein the temperature pattern is one graph expressing normalized temperature values for the respective specific sites included in a region horizontally crossing the face of the subject, such that different/separate portions of the single graph each respectively represent an eye inner corner, a nose tip, a nose side, etc. of the subject’s face. Clarification is required.

Claims 2-11 and 13 are also rejected under 35 U.S.C. 112(a) due to their dependency from claims 1 and 12.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-13 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1: The claims are directed to apparatuses, and therefore satisfy step 1 of the subject matter eligibility test.
Step 2A, Prong 1: The claims recite the following limitations that are directed to judicial exceptions (abstract ideas): “calculates a plurality of temperature patterns, each pattern corresponding to a respective one of a plurality of specific sites of a face of a subject on a thermal image” in claims 1 and 12; “determine whether the subject is a diseased person or a healthy person” in claims 1 and 12; “each of the plurality of temperature patterns is a graph…” in claims 1 and 12; “normalizing a temperature value in each respective part” in claims 1 and 12; “the first, second, and third temperature patterns are three separate graphs” in claims 1 and 12; “determines whether the subject is the diseased person or the healthy person based on all of the inputted plurality of temperature patterns…” in claims 1 and 12; “defines the plurality of specific sites…” in claims 2, 4, and 6; “wherein the plurality of specific sites include…” in claim 3; “locates face parts…” in claims 4-7; “extracts temperature data from the plurality of specific sites defined” in claim 4; “derives the temperature patterns…” in claims 4 and 6; “calculates the plurality of temperature patterns based on respective pixels of the thermal image” in claims 5 and 7; “determines that the face of the subject faces front…” in claims 6-7; “identifies the subject” in claim 8; “generates the learned model” in claims 10 and 12-13; and “calculates the temperature patterns for the plurality of specific sites measured in a predetermined measurement order” in claim 11; etc., which recite either mathematical concepts and/or mental processes that can be performed in the human mind or with the aid of pen and paper. 
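For concreteness, the per-site pattern computation these limitations describe can be sketched in a few lines. The site pixel spans, the sample temperatures, and the min-max normalization scheme below are illustrative assumptions, not taken from the record (the claims say only “normalizing a temperature value in each respective part”):

```python
def temperature_pattern(temps):
    """Min-max normalize one site's temperature values to [0, 1]
    (an assumed normalization scheme)."""
    t_min, t_max = min(temps), max(temps)
    if t_max == t_min:
        return [0.0] * len(temps)
    return [(t - t_min) / (t_max - t_min) for t in temps]

def per_site_patterns(strip, site_slices):
    """Split one horizontal face strip into separate per-site patterns,
    i.e. the claimed 'plurality of temperature patterns'."""
    return {name: temperature_pattern(strip[s]) for name, s in site_slices.items()}

# Hypothetical pixel spans for the three claimed sites along one strip.
sites = {"eye_inner_corner": slice(0, 4), "nose_side": slice(4, 8), "nose_tip": slice(8, 12)}
strip = [34.1, 34.5, 35.0, 34.8, 33.9, 34.0, 34.2, 34.1, 35.6, 35.9, 35.7, 35.4]
patterns = per_site_patterns(strip, sites)
```

Under the examiner's written-description interpretation, the same data would instead yield a single pattern, `temperature_pattern(strip)`, with the three sites as sub-spans of one graph.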
Step 2A, Prong 2: This judicial exception is not integrated into a practical application because the generically recited computer elements do not add a meaningful limitation to the abstract idea (i.e., the mental processes and mathematical concepts), as the generically recited computer elements amount to simply implementing the abstract idea on a machine. Additional elements recited at a high level of generality include processing circuitry/devices/components/tools capable of performing the mere data gathering steps as claimed (i.e., “the thermal image being input” in claims 1 and 12; “measuring a body surface temperature of the subject” in claims 1 and 12; “wherein the thermal image is obtained…” in claims 4-7; “stores the learned model” in claim 8; and “receives a user distinction result…” in claims 10 and 13) and the displaying/outputting steps as claimed (i.e., “receives a determination result… regarding whether the subject is the diseased person or the healthy person, and causes the received determination result to be displayed” in claim 1; and “displays… a possibility that the subject is the diseased person” in claim 9), which are components recited at a high level of generality that merely link the judicial exceptions to a particular technological environment and/or use a computer as a tool to perform the abstract idea.

Step 2B: For similar reasons set forth above, the additional limitations also do not provide an inventive concept that would be significantly more than the judicial exception.

Conclusion: Claims 1-13 are not patent-eligible.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-13 are rejected under 35 U.S.C. 103 as being unpatentable over Lai et al. (US 2019/0216333 A1, of record, cited in the applicant’s IDS filed on 03/24/2022, hereinafter Lai) in view of Hunt et al. (US 3,798,366 A, hereinafter Hunt), and further in view of Alloo et al. (US 2011/0123093 A1, hereinafter Alloo).

Regarding claims 1 and 12, Lai discloses a diseased person distinguishing device of claim 1 (smart device 110) (see, e.g., Para. [0040-0077]); and a diseased person distinguishing system of claim 12 (system 100) comprising: a diseased person distinguishing device (smart device 110); and a learning server (see, e.g., Fig. 1 and Para.
[0040-0077]), the diseased person distinguishing device (110) comprising: a temperature pattern calculation unit that calculates a plurality of temperature patterns, each pattern corresponding to a respective one of a plurality of specific sites of a face of a subject on a thermal image of the subject’s face, the thermal image being input from a body surface temperature measurement device (mechanism 115), the body surface temperature measurement device (115) including a lens and measuring a body surface temperature of the subject at which the lens is aimed (see, e.g., Para. [0040], “FIG. 1 is a processing flow diagram of a system 100 for generating a health index based at least in part on an image of a person that represents the person's thermal condition. The thermal condition may be indicative of blood circulation, referred to as a person's circulatory condition. The system 100 may include a smart device 110, such as a smart phone”, and Para. [0041], “The smart device 110 includes a mechanism 115 to capture a person's profile images. The mechanism 115 may be a camera which includes at least one of RGB and thermal sensors, such as an array of microbolometers to capture infrared radiation from the person. Both types of images, RGB and thermal sensor based images, may be used to detect the status of a person's circulatory condition. The camera may be front facing or rear facing and integrated into the smart device 110. A front facing camera enables easier use by the person to capture their own image, while a rear facing camera enables health care providers, either professional, friends, or family to capture the person's image”, and Para. [0042], “Both types of images, RGB and thermal sensor images, may be used to detect the status of a person's circulatory condition. For example, a higher temperature shown in the image may indicate denser capillaries and better blood circulation. 
Temperature distribution and changes may map to the status of the circulatory system, with different distributions being associated with different health conditions. Thermal sensor based images contain robust pixel information that clearly reflects such temperatures and temperature distributions”, and Para. [0061], “FIG. 3A is a representation of a thermal face image 300 that illustrates a region of interest 310 and a background 320. Note that the image is reproduced herein as a black and white image, but the different shades of black still show different temperatures of the face. Color images show temperature variations as different colors and intensities of color, however, the black and white image still conveys that there are color differences. The pixel data behind the images is what is processed to create the individual health model. The region of interest 310 is detected and may include various sub-regions of interest such as eyes, mouth, nose, forehead, etc”); a subject distinguishing unit (health index analysis module 125) that inputs the plurality of temperature patterns to a learned model (training module 120 with the individual health model, general health model 130) that has been learned in advance (see, e.g., Para. [0045], “The smart device 110 may also include a training module 120 that may include circuitry and instructions for labeling the collected images. Labeling of the images may be performed based on information provided by the person or healthcare provider, such as via a popup window with the ability to input information about how the user is feeling, or to directly label an image from a dropdown menu of predefined labels such as headache, runny nose, sore throat, normal, tired, etc”, and Para. 
[0046], “the training module 120 may use a convolutional neural network (CNN) to train an individual health model using the collected thermal images and corresponding labels”) to determine whether the subject is a diseased person or a healthy person, wherein the plurality of temperature patterns are input to the learned model (120, 130) of the subject distinguishing unit (125) which determines whether the subject is the diseased person or the healthy person based on all of the inputted plurality of temperature patterns without applying a temperature threshold to the inputted plurality of temperature patterns (see, e.g., Para. [0040], “FIG. 1 is a processing flow diagram of a system 100 for generating a health index based at least in part on an image of a person that represents the person's thermal condition”, and Para. [0042], “Both types of images, RGB and thermal sensor images, may be used to detect the status of a person's circulatory condition. For example, a higher temperature shown in the image may indicate denser capillaries and better blood circulation. Temperature distribution and changes may map to the status of the circulatory system, with different distributions being associated with different health conditions. Thermal sensor based images contain robust pixel information that clearly reflects such temperatures and temperature distributions”, and Para. [0047], “The individual health model, which is also represented in FIG. 1 by training module 120, when trained, provides predictions of the thermal condition reflected in the images to a health index analysis module 125, which also receives context information collected by the mechanism 115. The health index module 125 analyzes the thermal condition information and context information using statistically sound methods, which may weight various pieces of information to generate a health index” (emphasis added), and Para. [0055], and Para. [0073], “FIG. 
10 is a block diagram illustrating circuitry for estimating health of a person using thermal face images”); and a display unit that receives a determination result from the subject distinguishing unit (125) regarding whether the subject is the diseased person or the healthy person, and causes the received determination result to be displayed on a display device (see, e.g., Para. [0040], “FIG. 1 is a processing flow diagram of a system 100 for generating a health index based at least in part on an image of a person that represents the person's thermal condition”, and Para. [0042], “Both types of images, RGB and thermal sensor images, may be used to detect the status of a person's circulatory condition. For example, a higher temperature shown in the image may indicate denser capillaries and better blood circulation. Temperature distribution and changes may map to the status of the circulatory system, with different distributions being associated with different health conditions. Thermal sensor based images contain robust pixel information that clearly reflects such temperatures and temperature distributions”, and Para. [0052], “The health index may be provided to a notification module 135, which provides notifications to the person. Example notifications based on the health index may include an indication that the person appears to be coming down with a cold or the flu, and to rest, drink plenty of liquids, and try to consume certain types of nutrition. If the person's temperature is high for example, the notification may also indicate that the person should seek a health advisor indicated by the health service 140, and may even notify the health advisor or service”, and Para. [0054], “As described above and shown in FIG. 1, the smart device 110 may include the mechanism 115, the training module 120, the health index analysis module 125, and the notification module 135, which may be integrated into the smart device 110. 
The smart device may also include a display screen, such as a touch screen for display and input by the person or other user providing services to the person”, and Para. [0069], “FIG. 6 is a representation of example context information shown on a display screen of the smart device at 600. This context information may be obtained from a wearable device that counts steps and can also determine whether the wearer is running, walking, or climbing. Thus the wearable device may include various accelerometers, timers, altimeters, and other sensing devices, such as pulse rate sensors, temperature sensors, etc. The data displayed is displayable by day, week, month, and year. Some or all of the data developed by the wearable device may be provided to the smart device via a short distance wireless protocol, such as a Bluetooth® protocol. In other words the wearable device may be paired with the smart device. Note that selected data may form part of the context associated with a face image obtained by the smart device”, and Para. [0076], “Computer 1000 may include or have access to a computing environment that includes input interface 1006, output interface 1004, and a communication interface 1016. Output interface 1004 may include a display device, such as a touchscreen, that also may serve as an input device”), and wherein the learning server includes a temperature pattern learning unit that learns the temperature pattern by means of a machine learning model and generates the learned model (training module 120 with the individual health model, general health model 130) (see, e.g., Para. [0045], “The smart device 110 may also include a training module 120 that may include circuitry and instructions for labeling the collected images. 
Labeling of the images may be performed based on information provided by the person or healthcare provider, such as via a popup window with the ability to input information about how the user is feeling, or to directly label an image from a dropdown menu of predefined labels such as headache, runny nose, sore throat, normal, tired, etc”, and Para. [0046], “Labeling may further be performed as a function of sound captured by a microphone integrated into or communicatively coupled to the smart device 110, wearable device inputs, and calendar events. For instance, if a person is exercising, the training module 120 may label the image with an indication that the image reflects a context of exercising or physical exertion. The microphone may pick up sounds indicative of coughing or sneezing, sniffling, running nose, nose blowing, moans for pain, or other sounds that may be correlated to certain health conditions, and label the collected images accordingly. Such sounds may be picked up during a phone call made using the smart device 110, or even contemporaneously with collection of the images. In one embodiment, the training module 120 may use a convolutional neural network (CNN) to train an individual health model using the collected thermal images and corresponding labels”, and Para. [0047], “The individual health model, which is also represented in FIG. 1 by training module 120, when trained, provides predictions of the thermal condition reflected in the images to a health index analysis module 125, which also receives context information collected by the mechanism 115. The health index module 125 analyzes the thermal condition information and context information using statistically sound methods, which may weight various pieces of information to generate a health index”). 
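The labeling-then-training workflow Lai describes (user-supplied labels from a dropdown, or labels inferred from context such as exercise) can be illustrated with a short sketch. This is a hypothetical reconstruction for clarity only: the label set is taken from Lai's quoted examples, but the function name, record format, and context rule are invented and do not appear in the reference.

```python
# Hypothetical sketch of Lai's training-module labeling step: pair each
# collected thermal image with a label from user input or from context,
# producing (image, label) pairs suitable for training a CNN.

PREDEFINED_LABELS = {"headache", "runny nose", "sore throat", "normal", "tired"}

def label_image(image_id, user_label=None, context=None):
    """Assign a training label to one collected image (illustrative only)."""
    if user_label in PREDEFINED_LABELS:   # direct label via dropdown menu
        return (image_id, user_label)
    if context == "exercising":           # context from wearable/calendar data
        return (image_id, "physical exertion")
    return (image_id, "unlabeled")

dataset = [
    label_image("img_001", user_label="runny nose"),
    label_image("img_002", context="exercising"),
    label_image("img_003"),
]
print(dataset)
```

The resulting labeled pairs correspond to the "collected thermal images and corresponding labels" that Lai's training module 120 would feed to the convolutional neural network.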
Lai does not specifically disclose wherein: [1] each of the plurality of temperature patterns is a graph with a horizontal axis being a position of each of parts in a respective specific site, among the plurality of specific sites, of the face of the subject in the thermal image and a vertical axis being a value obtained by normalizing a temperature value in each respective part; and [2] the plurality of temperature patterns include at least a first temperature pattern for an eye inner corner of the subject's face, a second temperature pattern for a nose tip of the subject's face, a third temperature pattern for a nose side of the subject's face, and the first, second, and third temperature patterns are three separate graphs [portions]. However, in the same field of endeavor of medical imaging, Hunt discloses wherein: each of the plurality of temperature patterns is a graph with a horizontal axis being a position of each of parts in a respective specific site, among the plurality of specific sites, of the face of the subject in the thermal image and a vertical axis being a temperature value in each respective part; and the plurality of temperature patterns include at least a first temperature pattern for an eye inner corner of the subject's face, a second temperature pattern for a nose tip of the subject's face, a third temperature pattern for a nose side of the subject's face, and the first, second, and third temperature patterns are three separate graphs [portions] (see, e.g., Col. 7, lines 24-45, “Referring to FIG. 3, a general outline of the type of display presented by the circuit of FIG. 1 on the face 59 of the cathode ray tube is provided. A picture 95 of an object whose image is being scanned across the detector 21 is displayed in the top portion of the picture display. This is a visual image of the object as observed by a detector limited to the infrared region (5-13 microns) of the electromagnetic energy spectrum. 
A graticule line 97 is brightly written across the screen at the bottom of the picture 95 by circuits in the video processing block 61. Below the picture 95 is displayed a curve 99 which represents the relative intensity of the picture 95 across a line 101 thereof. This shows the temperature profile of the object at a certain line thereacross. A bright white line is generated across the line 101 as a fiducial mark to show the area of the object where the temperature profile 99 is being taken. In order to permit some quantitative determination of the magnitude of the temperature profile 99, additional bright graticule lines 103 are provided as part of the display and are evenly spaced for comparison with the temperature profile curve 99”, and Fig. 3, where the disclosed graph/curve/profile 99 represents the temperature profile of the face at a certain line across the face, which includes portions of the subject’s face (such as the eyes and/or nose) that are then represented at different/separate portions of the graph/curve/profile 99). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the diseased person distinguishing device of Lai by including wherein: [1] each of the plurality of temperature patterns is a graph with a horizontal axis being a position of each of parts in a respective specific site, among the plurality of specific sites, of the face of the subject in the thermal image and a vertical axis being a temperature value in each respective part; and [2] the plurality of temperature patterns include at least a first temperature pattern for an eye inner corner of the subject's face, a second temperature pattern for a nose tip of the subject's face, a third temperature pattern for a nose side of the subject's face, and the first, second, and third temperature patterns are three separate graphs [portions], as disclosed by Hunt. 
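Hunt's temperature profile (curve 99, taken along line 101 of the displayed image) amounts to reading one scan line of a two-dimensional thermal array: horizontal axis is pixel position across the face, vertical axis is the temperature at that position. A minimal hypothetical sketch, with the array values and row index invented for illustration (they are not taken from Hunt):

```python
import numpy as np

def line_profile(thermal_image: np.ndarray, row: int) -> np.ndarray:
    """Return the temperature profile along a single row of the image."""
    return thermal_image[row, :]

# Toy 4x6 "thermal image" in degrees Celsius (illustrative values only).
img = np.array([
    [34.0, 34.2, 35.1, 35.0, 34.3, 34.1],
    [34.1, 34.8, 36.2, 36.1, 34.9, 34.2],  # line crossing warmer facial sites
    [34.0, 34.5, 35.8, 35.7, 34.6, 34.1],
    [33.9, 34.1, 34.9, 34.8, 34.2, 34.0],
])

profile = line_profile(img, row=1)
print(profile)                 # temperatures across the chosen line
print(int(profile.argmax()))  # position of the warmest part along the line
```

Warmer facial sites (e.g., the eye inner corner or nose region) would appear as peaks at different positions of such a curve, which is the basis for reading separate "portions" of profile 99 as separate temperature patterns.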
One of ordinary skill in the art would have been motivated to make this modification in order to desirably display the relative intensity of an infrared thermal image to a user, as recognized by Hunt (see, e.g., Fig. 3, and Col. 7, lines 24-40). Lai modified by Hunt still does not disclose [1] wherein the vertical axis being a value obtained by normalizing a temperature value in each respective part. However, in the same field of endeavor of thermal imaging, Alloo discloses wherein each of the temperature patterns is a graph with a vertical axis being a value obtained by normalizing a temperature value (from a thermal image) in each respective part of the subject (see, e.g., Fig. 3, where the vertical axis of the graph represents values obtained by normalizing a temperature value to be between 0 and 1, and Para. [0022], “FIG. 1 generally depicts one embodiment of a defect detection system for inspecting a coated substrate to identify defects which may be present in the coating. The defect detection system generally comprises a controller, an object detector coupled to the controller for determining a color of the coated substrate, a temperature manipulation device coupled to the controller for manipulating the temperature of the coating applied to the substrate, and a thermal detector for collecting a thermal image of the coating on the substrate after the temperature of the coating has been manipulated”, and Para. [0032], “FIG. 3 shows the temperature history for each of the four defect-free colored coatings of Table 1 as a result of being heated with the heat source. The y-axis is indicative of the normalized temperature ((T-T.sub.min)/(T.sub.max-T.sub.min)) while the x-axis is indicative of time in seconds”). 
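Alloo's normalization, (T − T_min)/(T_max − T_min), maps every temperature in a series onto the interval [0, 1]. A minimal sketch of that formula applied to a temperature profile (the sample values and function name are invented for illustration):

```python
import numpy as np

def normalize(temps: np.ndarray) -> np.ndarray:
    """Min-max normalize temperatures: (T - T_min) / (T_max - T_min)."""
    t_min, t_max = temps.min(), temps.max()
    return (temps - t_min) / (t_max - t_min)

# Illustrative temperature profile in degrees Celsius.
profile = np.array([34.1, 34.8, 36.2, 36.1, 34.9, 34.2])
norm = normalize(profile)
print(norm)                     # every value now lies in [0, 1]
print(norm.min(), norm.max())   # 0.0 1.0
```

The coldest point maps to 0 and the hottest to 1, so profiles measured under different conditions become directly comparable, which is the motivation the rejection attributes to Alloo.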
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the diseased person distinguishing device of Lai modified by Hunt by including [1] wherein the vertical axis being a value obtained by normalizing a temperature value in each respective part, as disclosed by Alloo. One of ordinary skill in the art would have been motivated to make this modification in order to desirably analyze/compare the temperature values from different types of thermal readings by normalizing the temperature values, as recognized by Alloo (see, e.g., Fig. 3, Para. [0022], and Para. [0032]). Regarding claim 2, Lai modified by Hunt and Alloo discloses the diseased person distinguishing device according to claim 1. Lai further discloses the diseased person distinguishing device further comprising: a specific site defining unit that defines the plurality of specific sites of the face of the subject in the thermal image (see, e.g., FIG. 3A, and Abstract, “A computer implemented method includes capturing, via a camera, one or more digital images of a face of a person representative of blood circulation of the person, collecting context information via one or more processors corresponding to the person contemporaneously with the capturing of the one or more digital images”, and Para. [0041], “The smart device 110 includes a mechanism 115 to capture a person's profile images. The mechanism 115 may be a camera which includes at least one of RGB and thermal sensors, such as an array of microbolometers to capture infrared radiation from the person. Both types of images, RGB and thermal sensor based images, may be used to detect the status of a person's circulatory condition. The camera may be front facing or rear facing and integrated into the smart device 110. 
A front facing camera enables easier use by the person to capture their own image, while a rear facing camera enables health care providers, either professional, friends, or family to capture the person's image”, and Para. [0042], “Both types of images, RGB and thermal sensor images, may be used to detect the status of a person's circulatory condition. For example, a higher temperature shown in the image may indicate denser capillaries and better blood circulation. Temperature distribution and changes may map to the status of the circulatory system, with different distributions being associated with different health conditions. Thermal sensor based images contain robust pixel information that clearly reflects such temperatures and temperature distributions”, and Para. [0061], “FIG. 3A is a representation of a thermal face image 300 that illustrates a region of interest 310 and a background 320. Note that the image is reproduced herein as a black and white image, but the different shades of black still show different temperatures of the face. Color images show temperature variations as different colors and intensities of color, however, the black and white image still conveys that there are color differences. The pixel data behind the images is what is processed to create the individual health model. The region of interest 310 is detected and may include various sub-regions of interest such as eyes, mouth, nose, forehead, etc”). Regarding claim 3, Lai modified by Hunt and Alloo discloses the diseased person distinguishing device according to claim 2. Lai further discloses wherein the plurality of specific sites include the eye inner corner, the nose tip, the nose side, and any one or more of a cheek, a jaw, an ear, a hand, a head excluding a hair portion, a temple, and a neck of the subject (see, e.g., FIG. 3A, and Para. [0061], “FIG. 3A is a representation of a thermal face image 300 that illustrates a region of interest 310 and a background 320. 
[…] The region of interest 310 is detected and may include various sub-regions of interest such as eyes, mouth, nose, forehead, etc”). Regarding claim 4, Lai modified by Hunt and Alloo discloses the diseased person distinguishing device according to claim 3. Lai further discloses wherein the thermal image is obtained as the body surface temperature measurement device photographs the face of the subject, wherein the specific site defining unit locates face parts constituting the face on the thermal image and defines the plurality of specific sites on the thermal image, and wherein the temperature pattern calculation unit extracts temperature data from the plurality of specific sites defined and derives the temperature patterns for each of the plurality of specific sites (see, e.g., FIG. 3A, and Abstract, “A computer implemented method includes capturing, via a camera, one or more digital images of a face of a person representative of blood circulation of the person, collecting context information via one or more processors corresponding to the person contemporaneously with the capturing of the one or more digital images”, and Para. [0041], “The smart device 110 includes a mechanism 115 to capture a person's profile images. The mechanism 115 may be a camera which includes at least one of RGB and thermal sensors, such as an array of microbolometers to capture infrared radiation from the person. Both types of images, RGB and thermal sensor based images, may be used to detect the status of a person's circulatory condition. The camera may be front facing or rear facing and integrated into the smart device 110. A front facing camera enables easier use by the person to capture their own image, while a rear facing camera enables health care providers, either professional, friends, or family to capture the person's image”, and Para. [0042], “Both types of images, RGB and thermal sensor images, may be used to detect the status of a person's circulatory condition. 
For example, a higher temperature shown in the image may indicate denser capillaries and better blood circulation. Temperature distribution and changes may map to the status of the circulatory system, with different distributions being associated with different health conditions. Thermal sensor based images contain robust pixel information that clearly reflects such temperatures and temperature distributions”, and Para. [0061], “FIG. 3A is a representation of a thermal face image 300 that illustrates a region of interest 310 and a background 320. Note that the image is reproduced herein as a black and white image, but the different shades of black still show different temperatures of the face. Color images show temperature variations as different colors and intensities of color, however, the black and white image still conveys that there are color differences. The pixel data behind the images is what is processed to create the individual health model. The region of interest 310 is detected and may include various sub-regions of interest such as eyes, mouth, nose, forehead, etc”). Regarding claim 5, Lai modified by Hunt and Alloo discloses the diseased person distinguishing device according to claim 3. Lai further discloses wherein the thermal image is obtained as the body surface temperature measurement device photographs the face of the subject, and wherein the temperature pattern calculation unit locates face parts constituting the face and calculates the plurality of temperature patterns based on respective pixels of the thermal image (see, e.g., FIG. 3A, and Abstract, “A computer implemented method includes capturing, via a camera, one or more digital images of a face of a person representative of blood circulation of the person, collecting context information via one or more processors corresponding to the person contemporaneously with the capturing of the one or more digital images”, and Para. 
[0041], “The smart device 110 includes a mechanism 115 to capture a person's profile images. The mechanism 115 may be a camera which includes at least one of RGB and thermal sensors, such as an array of microbolometers to capture infrared radiation from the person. Both types of images, RGB and thermal sensor based images, may be used to detect the status of a person's circulatory condition. The camera may be front facing or rear facing and integrated into the smart device 110. A front facing camera enables easier use by the person to capture their own image, while a rear facing camera enables health care providers, either professional, friends, or family to capture the person's image”, and Para. [0042], “Both types of images, RGB and thermal sensor images, may be used to detect the status of a person's circulatory condition. For example, a higher temperature shown in the image may indicate denser capillaries and better blood circulation. Temperature distribution and changes may map to the status of the circulatory system, with different distributions being associated with different health conditions. Thermal sensor based images contain robust pixel information that clearly reflects such temperatures and temperature distributions”, and Para. [0061], “FIG. 3A is a representation of a thermal face image 300 that illustrates a region of interest 310 and a background 320. Note that the image is reproduced herein as a black and white image, but the different shades of black still show different temperatures of the face. Color images show temperature variations as different colors and intensities of color, however, the black and white image still conveys that there are color differences. The pixel data behind the images is what is processed to create the individual health model. The region of interest 310 is detected and may include various sub-regions of interest such as eyes, mouth, nose, forehead, etc”). 
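The claimed pipeline as mapped across claims 1-5 (define specific sites on the thermal image, extract temperature data per site, derive one normalized temperature pattern per site) can be sketched end to end. This is a hypothetical illustration only: the site bounding boxes, temperatures, and function names are invented and are not taken from Lai, Hunt, or Alloo.

```python
import numpy as np

# Toy thermal face image in degrees Celsius (illustrative values only).
img = np.array([
    [34.0, 34.2, 35.1, 35.0, 34.3, 34.1],
    [34.1, 34.8, 36.2, 36.1, 34.9, 34.2],
    [34.0, 34.5, 35.8, 35.7, 34.6, 34.1],
    [33.9, 34.1, 34.9, 34.8, 34.2, 34.0],
])

SITES = {  # (rows, cols) bounding boxes -- purely illustrative coordinates
    "eye_inner_corner": (slice(0, 2), slice(2, 4)),
    "nose_tip":         (slice(2, 4), slice(2, 4)),
    "nose_side":        (slice(2, 4), slice(0, 2)),
}

def site_pattern(image, rows, cols):
    """Derive one normalized temperature pattern for one specific site."""
    temps = image[rows, cols].mean(axis=0)       # one value per position
    t_min, t_max = temps.min(), temps.max()
    return (temps - t_min) / (t_max - t_min)     # normalized to [0, 1]

# Three separate patterns -- one "graph" per specific site, as claimed.
patterns = {name: site_pattern(img, *box) for name, box in SITES.items()}
for name, pattern in patterns.items():
    print(name, pattern)
```

Each site yields its own position-vs-normalized-temperature array, i.e., the three separate graphs (first, second, and third temperature patterns) recited in claim 1.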
Regarding claim 6, Lai modified by Hunt and Alloo discloses the diseased person distinguishing device according to claim 3. Lai further discloses wherein the thermal image is obtained as the body surface temperature measurement device photographs the face and a neck of the subject, wherein the specific site defining unit locates face parts constituting the face on the thermal image, and determines that the face of the subject faces front in an equal direction to a direction in which the body surface temperature measurement device performs the photographing and defines the plurality of specific sites from the thermal image, and wherein the temperature pattern calculation unit derives the plurality of temperature patterns of the plurality of specific sites (see, e.g., FIG. 3A, and Abstract, “A computer implemented method includes capturing, via a camera, one or more digital images of a face o

Prosecution Timeline

Mar 24, 2022
Application Filed
Jun 13, 2024
Non-Final Rejection — §101, §103, §112
Sep 10, 2024
Response Filed
Jan 06, 2025
Final Rejection — §101, §103, §112
Mar 27, 2025
Request for Continued Examination
Mar 29, 2025
Response after Non-Final Action
Apr 11, 2025
Non-Final Rejection — §101, §103, §112
Jun 20, 2025
Interview Requested
Jul 15, 2025
Examiner Interview Summary
Jul 15, 2025
Applicant Interview (Telephonic)
Jul 17, 2025
Response Filed
Oct 03, 2025
Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12564353
ELECTRONIC APPARATUS AND METHOD FOR MEASURING SKIN FLUORESCENCE USING ELECTRONIC APPARATUS
2y 5m to grant Granted Mar 03, 2026
Patent 12527549
COMPOUND METHOD OF SHEAR-WAVE ELASTOGRAPHY AND QUASI-STATIC ELASTOGRAPHY
2y 5m to grant Granted Jan 20, 2026
Patent 12496039
ULTRASONIC ENDOSCOPE
2y 5m to grant Granted Dec 16, 2025
Patent 12484878
ACOUSTIC WINDOW WITH COMPOUND SHAPE FOR ULTRASOUND PROBE
2y 5m to grant Granted Dec 02, 2025
Patent 12376755
INDUCTIVE SENSING SYSTEM AND METHOD
2y 5m to grant Granted Aug 05, 2025
Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
55%
Grant Probability
92%
With Interview (+37.4%)
3y 2m
Median Time to Grant
High
PTA Risk
Based on 97 resolved cases by this examiner. Grant probability derived from career allow rate.
