DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-19 are the present claims under consideration.
Examiner’s Note: All references to Applicant’s specification are made using the paragraph numbers assigned in the US publication of the present application US 20240215861 A1.
Claim Objections
Claims 2-4 are objected to because of the following informalities:
Claim 2, lines 1-2, recites “the host computer comprises receiving” but it would seem this limitation should read “the host computer is further configured to receive” since the receipt of signals is not a physical component of the computer but rather a function of the computer. This objection applies similarly to the language of claims 3-4.
Appropriate correction is required.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
the real-time object detection algorithm configured to … of claim 1
an alarm component of claim 10
a communication component of claim 11
a real-time object detection algorithm of claim 12, which detects face regions in the current visible images
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
A real-time object detection algorithm of claims 1 and 12 is interpreted as the particular algorithm for carrying out the recited functions. The specification does not describe the particular algorithm for carrying out the recited functions; the particular steps taken to transform the input data into the output of abnormal respiratory information are not described. The algorithm is described as a machine learning You Only Look Once (YOLO) algorithm (Paragraph 0032) which undergoes a generic training procedure and is then capable of producing the desired outputs from the recited inputs. In particular, paragraphs 0053-0055 and 0063-0065 recite generic steps of training the algorithm to identify nostrils and foreheads, or face regions. Paragraph 0049 recites generic training steps for identifying the presence of a cover. The function of detecting cycles of inhalation and exhalation is described in largely functional language. The specification discusses identifying changes in brightness to distinguish inhalation from exhalation but does not describe the particular algorithm used to detect the current brightness, compare it to a “normal respiratory standard,” and identify that brightness as relating to inhalation or exhalation. Furthermore, the algorithm for converting the brightness identification into breathing information and determining whether that breathing information is abnormal is not disclosed. Paragraphs 0055-0060 recite the functions of the algorithm but do not appear to describe the algorithm itself.
An alarm component of claim 10 is interpreted as a buzzer or indicator light and their equivalents as described in paragraph 0034.
A communication component of claim 11 is described in purely functional language in paragraphs 0035-0036 as communicating via wires, wireless, network connection, or access point. No particular structure or device has been described.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112(b)
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-19 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
The claims are generally narrative and indefinite, failing to conform with current U.S. practice. They appear to be a literal translation into English from a foreign document and are replete with grammatical and idiomatic errors.
Claim limitations “real-time object detection algorithm” of claims 1 and 12 and “a communication component” of claim 11 invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function, as described in the claim interpretation section presented above. Therefore, the claims are indefinite and are rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.
Applicant may:
(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;
(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).
If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:
(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function. For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.
Claim 1 recites “to notify abnormal respiratory information when the host computer determines a nasal area in the current thermal images is abnormal” but it is unclear what this limitation is meant to convey. In particular, it is unclear what “notify abnormal respiratory information” entails. Furthermore, the recitation of “when the host computer determines a nasal area in the current thermal images is abnormal” appears to convey that the algorithm evaluates the nasal area itself for abnormalities but the following limitation describing the function of the algorithm appears to indicate that it is the cycles of inhalation and exhalation that are evaluated to “generates the abnormal respiratory information” rather than any evaluation of the nasal area itself. For the purposes of this examination, this limitation will be interpreted as generating an alert when abnormal respiratory conditions are detected by evaluating cycles of inhalation and exhalation.
Claim 1 recites “the real-time object detection algorithm, configured to identify a face region in the current visible and thermal images, when the face region in the current visible images is not identifiable, the real-time object detection algorithm identifies the nasal area of the face region in the current thermal images” but it is unclear what metrics are considered in determining that a visible face image is “not identifiable”. It is unclear what this analysis entails and how the determination of identifiable or not identifiable relates to the claimed method. It is unclear how this limitation is intended to be interpreted.
Claim 1 recites “identify a face region in the current visible and thermal images, when the face region in the current visible images is not identifiable, the real-time object detection algorithm identifies the nasal area of the face region in the current thermal images, and the host computer generates the abnormal respiratory information according to the cycles of exhalation and inhalation, as detected through brightness changes in the nostril area” but it is unclear if the limitations of “the real-time object detection algorithm identifies the nasal area of the face region in the current thermal images, and the host computer generates the abnormal respiratory information according to the cycles of exhalation and inhalation, as detected through brightness changes in the nostril area” are intended to only be performed “when the face region in the current visible images is not identifiable” or if the system performs these operations even when the face region is identifiable in the image. It is unclear what the operation of the system entails when the face region is identifiable in the visible images. It is unclear how these limitations are intended to be interpreted.
Claim 1 recites “the host computer generates the abnormal respiratory information according to the cycles of exhalation and inhalation, as detected through brightness changes in the nostril area” but it is unclear how the “cycles of exhalation and inhalation” are detected through “brightness changes in the nostril area”. It is further unclear if the “brightness changes” are in reference to the thermal or visual images. For the purposes of this examination, the limitation will be interpreted as detected brightness changes in the thermal images and comparing the changes to thresholds to identify periods of inhalation and exhalation.
Claims 2-11 are rejected by virtue of their dependence on claim 1.
Claims 2-4 and 8 each refer to “respiratory information” but it is unclear if this limitation is the same as, related to, or different from “abnormal respiratory information” of claim 1. For the purposes of this examination, the system will be interpreted as generating respiratory information which may be labeled as abnormal, with the abnormal information being notified as described in claim 1.
Claim 2 recites “receiving a plurality of temperatures readings detected from the brightness changes in the nostril area” but it is unclear if the received temperatures are received via the “thermal imaging sensor” of claim 1 or an external device. For the purposes of this examination, the limitation is interpreted as the temperature being received from the thermal sensor of claim 1.
Claim 2 recites “comparing the temperature difference in the nostril area with a normal respiratory standard to determine that respiratory information is abnormal” but the limitation “the temperature difference” lacks sufficient antecedent basis. It is further unclear how this limitation relates to the received temperatures. Additionally, it is unclear what the determination that the respiratory information is abnormal entails. The claim appears to indicate that a comparison to normal data is what makes the received information abnormal rather than a deviation from normal data. For the purposes of this examination, the limitation is interpreted as comparing the received temperatures to normal temperatures for normal respiration and determining that the respiration information is abnormal if it falls outside of predetermined thresholds or ranges.
Claim 3 recites “the brightness in the nostrils” but it is unclear how the brightness in the nostril is determined and how it relates to “the brightness in the nostril area” of claim 1. For the purposes of this examination, the limitations will be interpreted as referring to the same brightness in the same area.
Claim 4 recites “converting the number of times that brightness alternates between light and dark to an individual’s breaths per minute (BPM)” but the limitation “the number of times that brightness alternates between light and dark” lacks sufficient antecedent basis. Furthermore, “light and dark” is relative terminology which renders the claim indefinite. The terms “light” and “dark” are not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The particular degree of temperature fluctuation required to qualify as the brightness of the infrared image becoming “light” or “dark” is unclear. For the purposes of this examination, any level of fluctuation used to identify a breathing cycle will be considered to anticipate this limitation.
In claim 4, the limitations “the converted individual’s breaths per minute” and “the criteria of the individual’s breaths per minute” lack sufficient antecedent basis.
Claim 5 recites “before detecting the brightness changes in the nostril area, alternating between light and dark in the current thermal images, the real-time object detection algorithm is configured to classify the nostril during inhalation and the nostril during exhalation” but it is unclear what this limitation is meant to convey. The limitation appears to indicate that inhalation and exhalation are “classified” prior to detecting temperature fluctuations. It is unclear how inhalation and exhalation may be classified without evaluating the thermal imaging, or brightness, information.
Claim 6 recites “when the face region in the current visible images is not identifiable, the host computer is configured to identify brightness changes of the forehead area, transform the brightness changes of the forehead area into a spectrum to extract at least one peak frequency as the heart rate information, and determine that the heart rate information is abnormal if the peak frequency is above or below a normal range” but it is unclear how the “forehead area” is being identified when the “face area” has been considered “not identifiable”. It is unclear if this limitation is directed towards analysis with the thermal or visible light data. For the purposes of this examination, the limitation is interpreted as referring to processing using visible light data.
In claim 6, the limitations “the forehead area” and “the heart rate information” lack sufficient antecedent basis.
Claim 6 recites “extract at least one peak frequency as the heart rate information” but it is unclear if “peak frequency” is meant to refer to the highest frequency, the frequency of maximum power, or a frequency that is a peak in some other metric. For the purposes of this examination, the limitation will be interpreted as the frequency of maximum power.
Claim 7 recites “the brightness changes of pixels within the forehead area” but it is unclear if this limitation is the same as, related to, or different from “brightness changes of the forehead area” of claim 6. For the purposes of this examination, the limitations are interpreted as referring to the same brightness changes.
Claim 7 recites “transforms the pixels from the time domain into the frequency domain to generate the spectrum” but it is unclear how the “pixels” are being transformed and what such a transformation entails. For the purposes of this examination, the limitation is interpreted as the brightness signal in the time domain being transformed into the frequency domain.
Claim 8 recites “when the face region in the current visible images is not identifiable, the host computer is configured to identify both a mouth area and the nasal area in the current visible images” but it is unclear how face regions are being identified when the visible images have been deemed not identifiable. For the purposes of this examination, the limitation will be interpreted as identifying face regions.
Claim 9 recites “wherein the real-time object detection algorithm is configured to identify the forehead area in the current thermal images; the detector is configured to detect a plurality of temperatures at different positions within the forehead area; and then the host computer is configured to receive and average a plurality of temperatures readings detected from the brightness changes in the nostril area and determine that the average temperature is abnormal if the average temperature is above or below a temperature range” but it is unclear if the temperatures of the forehead are related to the average temperatures of the nostril region. It is unclear how the data gathered on the temperature of the forehead relates to the claimed system. For the purposes of this examination, the limitation will be interpreted as detecting an average temperature at either the forehead or nostrils.
Claim 10 recites “an alarm component, coupled to the processor to notify the abnormal respiratory information or the abnormal the heart rate information” but no such abnormal heart rate information has been determined. It is unclear what a “notify” of this information entails. For the purposes of this examination, the limitation will be interpreted as an alarm for notifying of abnormal breathing.
Claim 10 recites “a communication component, coupled to the processor to transmit the physiological information on the internet” but it is unclear if the “physiological information” is the same as, related to, or different from “abnormal respiratory information” of claim 1. For the purposes of this examination, the limitation will be interpreted as any component for transmitting any type of physiological information. This rejection is further applied to claim 11.
Claim 11 recites “transmit the physiological information on the internet” but it is unclear what this limitation is meant to convey. It is unclear if the limitation is meant to convey that the system communicates with an external device over the internet, if the system stores information in “the cloud” or other internet storage, or if it is meant to convey some other form of data communication via the internet. For the purposes of this examination, any communication involving the internet will be considered sufficient to anticipate this limitation.
Claim 12 recites “identifying a nasal area in the current thermal images when a face region in the current visible images is not identified by the real-time object detection algorithm” but it is unclear how the nasal area is identified in the thermal images. It is further unclear how a face region is determined to be not identified in the visible images. It is unclear how these limitations are intended to be interpreted.
Claim 12 recites “identifying a nasal area in the current thermal images when a face region in the current visible images is not identified by the real-time object detection algorithm” but it is unclear if the identification of the nasal area in the thermal region is only performed when the face region of the visible images is not identifiable or if the identification is always performed. It is unclear if the following limitations of determining respiratory frequency are performed only when the face region in the visible images is not identified or if they occur whether or not the face regions are identified from the visible images. It is unclear what the method entails when the face regions can be identified. It is unclear how these limitations are intended to be interpreted.
Claim 12 recites “determining that respiratory information is abnormal according to the cycles of exhalation and inhalation, as detected through brightness changes in the nostril area” but it is unclear if the “brightness changes in the nostril area” are associated with the thermal or visible images. It is further unclear what “changes” in brightness serve to “detect” inhalation and exhalation and how these cycles are used to determine that respiratory information is abnormal. For the purposes of this examination, the limitation will be interpreted as being directed towards analysis of thermal data and any changes in “brightness” may be considered sufficient to identify cycles of inhalation and exhalation which may be considered abnormal through any metric.
In claim 12, the limitation “the cycles of exhalation and inhalation” lacks sufficient antecedent basis.
Claim 12 recites “notifying abnormal respiratory information” but it is unclear what this limitation is meant to convey. For the purposes of this examination, the limitation will be interpreted as generating an alert or alarm in response to determining abnormal respiratory information.
Claims 13-19 recite substantially the same limitations as claims 2-6 and 8-9 and are rejected mutatis mutandis for the same reasons presented above with regard to claims 2-6 and 8-9 where the limitations are similar.
Claim Rejections - 35 USC § 112(a)
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1-2, 5-9, 12-13, and 16-19 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Claim limitations “real-time object detection algorithm” of claims 1 and 12 and “a communication component” of claim 11 invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function, as described in the claim interpretation section presented above. Therefore, the claims lack sufficient written description and are rejected under 35 U.S.C. 112(a) or pre-AIA 35 U.S.C. 112, first paragraph.
Claim 2 recites “comparing the temperature difference in the nostril area with a normal respiratory standard to determine that respiratory information is abnormal” but paragraphs 0056-0057 indicate that it is not the temperature difference itself that is compared to a standard to detect abnormality but rather a duration during which the temperature difference is unchanged which is compared to a standard value. The specification does not appear to describe that the temperature difference can be compared to a threshold temperature difference to determine abnormal respiratory information.
Claim 5 recites “before detecting the brightness changes in the nostril area, alternating between light and dark in the current thermal images, the real-time object detection algorithm is configured to classify the nostril during inhalation and the nostril during exhalation” but the specification does not appear to support the classification of inhalation and exhalation prior to detecting changes in thermal readings. Paragraph 0060 of the specification appears to indicate that the inhalation and exhalation classifications are performed using the brightness information.
Claim 6 recites “when the face region in the current visible images is not identifiable, the host computer is configured to identify brightness changes of the forehead area” but the specification does not appear to describe how the face region is determined to be unidentifiable. Paragraphs 0025-0026 recite that the visible images may be identifiable for given distances and illuminances but do not appear to describe what makes a face region “not identifiable”. Furthermore, Fig. 4B and paragraph 0045 recite that the brightness in the forehead region is only identifiable when the face region is identifiable. This rejection is further applied to claim 7.
Claim 7 recites “transforms the pixels from the time domain into the frequency domain to generate the spectrum” but the specification does not appear to describe how “pixels” can be transformed into the frequency domain. In particular, paragraph 0045 recites that the brightness changes of the signals are used to extract a PPG signal which is then transformed into the frequency domain, rather than the pixels themselves.
Claim 8 recites “determine the respiratory information is abnormal if the mouth area or the nasal area are obscured by at least one cover” but the specification does not appear to describe how the determination that a mouth or nasal area is obscured by at least one cover is made. In particular, the recitations of training in paragraphs 0048-0050 are considered insufficient to support the recited functionality of the algorithm. Additionally, Paragraph 0048 and Fig. 4D appear to contradict the claim: they indicate that abnormal respiratory information is generated when the algorithm detects that the face is not covered or obscured (i.e., the face is visible), and normal respiratory information is generated when the face is covered or obscured.
Claim 9 recites “wherein the real-time object detection algorithm is configured to identify the forehead area in the current thermal images; the detector is configured to detect a plurality of temperatures at different positions within the forehead area; and then the host computer is configured to receive and average a plurality of temperatures readings detected from the brightness changes in the nostril area and determine that the average temperature is abnormal if the average temperature is above or below a temperature range” but paragraphs 0064-0065 appear to indicate that the average temperature metric is generated from the temperature readings at the forehead rather than the nostril.
Claim 12 recites “identifying a nasal area in the current thermal images when a face region in the current visible images is not identified by the real-time object detection algorithm” but the specification does not describe an algorithm for performing the recited functions. In particular, paragraphs 0053-0055 and 0063-0065 describe only generic training for a machine learning algorithm to perform the recited function. The particular steps taken to transform the input to the output are not disclosed; furthermore, particular details in how the machine learning model is structured and trained are not provided. As such, the specification is not considered to provide sufficient written description support for the claimed algorithm performing the recited functions.
Claim 12 recites “determining that respiratory information is abnormal according to the cycles of exhalation and inhalation, as detected through brightness changes in the nostril area” but paragraphs 0059-0061 do not describe how the algorithm detects “cycles of exhalation and inhalation” in order to determine abnormal respiratory information. In particular, paragraphs 0059-0061 recite that during exhalation the temperature of the nostrils is “slightly similar to the normal human body temperature” and that during inhalation the temperature of the nostrils is “slightly cooler than the normal human body temperature”, but the specification does not appear to describe the degree of difference required to separate inhalation from exhalation, the processing method for determining when the pixels are “light” or “dark”, or any other processing method for detecting inhalation and exhalation. The functions of the algorithm are described in purely functional language rather than as a description of how the algorithm performs the functions. The generic recitation of training the algorithm to detect the cycles in paragraph 0062 is not considered sufficient support for an algorithm which detects respiratory cycles because neither the particular steps to transform the inputs to the outputs nor the particular structure and training method of the machine learning model are disclosed.
Claims 13 and 16-19 recite substantially the same limitations as claims 2, 5-6, and 8-9 and are rejected mutatis mutandis for the same reasons presented above with regard to claims 2, 5-6, and 8-9 where the limitations are similar.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-19 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claims 1-19 are directed to a method of processing visible and thermal signals using a computational algorithm, which is an abstract idea. Claims 1-19 do not include additional elements that integrate the exception into a practical application or that are sufficient to amount to significantly more than the judicial exception for the reasons provided below, which are in line with the 2014 Interim Guidance on Patent Subject Matter Eligibility (Federal Register, Vol. 79, No. 241, p. 74618, December 16, 2014), the July 2015 Update on Subject Matter Eligibility (Federal Register, Vol. 80, No. 146, p. 45429, July 30, 2015), the May 2016 Subject Matter Eligibility Update (Federal Register, Vol. 81, No. 88, p. 27381, May 6, 2016), the 2019 Revised Patent Subject Matter Eligibility Guidance (Federal Register, Vol. 84, No. 4, p. 50, January 7, 2019), and the 2024 Update on Subject Matter Eligibility (Federal Register, Vol. 89, No. 137, p. 58128, July 17, 2024).
The analysis of claim 1 is as follows:
Step 1: Claim 1 is drawn to a machine.
Step 2A – Prong One: Claim 1 recites an abstract idea. In particular, claim 1 recites the following limitations:
[A1] notify abnormal respiratory information when the host determines a nasal area in the current thermal images is abnormal
[B1] identify a face region in the current visible and thermal images
[C1] identifies the nasal area of the face region in the current thermal images
[D1] generates the abnormal respiratory information according to the cycles of exhalation and inhalation, as detected through brightness changes in the nostril area
These elements [A1]-[D1] of claim 1 are drawn to an abstract idea since they involve a mental process that can be practically performed in the human mind (including observation, evaluation, judgment, and opinion) or with pen and paper.
Step 2A – Prong Two: Claim 1 recites the following limitations that are beyond the judicial exception:
[A2] a detector, comprising a visible light sensor, configured to capture one or more current visible images within a target area, and a thermal imaging sensor, configured to capture one or more current thermal images within the target area
[B2] a host computer
[C2] a real-time object detection algorithm
These elements [A2]-[C2] of claim 1 do not integrate the exception into a practical application of the exception. In particular, the element [A2] merely adds insignificant extra-solution activity to the judicial exception, i.e., mere data gathering at a higher level of generality - see MPEP 2106.04(d) and MPEP 2106.05(g). Furthermore, the element [B2] is merely an instruction to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.04(d) and MPEP 2106.05(f). Additionally, the element [C2] is nothing more than the computer implementation/automation of an abstract mental process of screening a patient, which is what a physician typically does with a patient in a diagnostic setting.
Step 2B: Claim 1 does not recite additional elements that amount to significantly more than the judicial exception itself. In particular, the recitation “a detector, comprising a visible light sensor, configured to capture one or more current visible images within a target area, and a thermal imaging sensor, configured to capture one or more current thermal images within the target area” is merely insignificant extra-solution activity to the judicial exception, e.g., mere data gathering in conjunction with the abstract idea that uses conventional, routine, and well-known elements, or simply displaying the results of the algorithm using conventional, routine, and well-known elements. In particular, the detector is nothing more than a visible light imaging sensor and a thermal imaging sensor for imaging the patient’s face. Such sensors are conventional, as evidenced by Applicant’s lack of a particular description in the specification. In particular, paragraphs 0022-0024, 0028, and 0036 recite that the visible imaging sensor “can be a Charge-coupled Device (CCD) or a CMOS to constantly detect the plurality of visible images (VSI) for a period of time” and that the thermal imaging sensor “can be thermocouples, thermopiles, optical arrays, and the like”. Applicant’s lack of a particular description of the sensors indicates that they are well-known, routine, and/or conventional sensors.
Further, the elements [B2] and [C2] do not qualify as significantly more because these limitations simply append well-understood, routine, and conventional activities previously known in the industry, specified at a high level of generality, to the judicial exception, e.g., a claim to an abstract idea requiring no more than a generic computer to perform generic computer functions that are well-understood, routine, and conventional activities previously known in the industry (see Electric Power Group, 830 F.3d 1350 (Fed. Cir. 2016); Alice Corp. v. CLS Bank Int’l, 110 USPQ2d 1976 (2014)) and/or a claim to an abstract idea requiring no more than being stored on a computer readable medium, which is a well-understood, routine, and conventional activity previously known in the industry (see Electric Power Group, 830 F.3d 1350 (Fed. Cir. 2016); Alice Corp. v. CLS Bank Int’l, 110 USPQ2d 1976 (2014); SAP Am. v. InvestPic, 890 F.3d 1016 (Fed. Cir. 2018)).
In view of the above, the additional elements individually do not integrate the exception into a practical application and do not amount to significantly more than the above-judicial exception (the abstract idea). Looking at the limitations as an ordered combination (that is, as a whole) adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer, for example, or improves any other technology. There is no indication that the combination of elements permits automation of specific tasks that previously could not be automated. There is no indication that the combination of elements includes a particular solution to a computer-based problem or a particular way to achieve a desired computer-based outcome. Rather, the collective functions of the claimed invention merely provide conventional computer implementation, i.e., the computer is simply a tool to perform the process.
Claims 2-11 depend from claim 1, and recite the same abstract idea as claim 1. Furthermore, these claims only contain recitations that further limit the abstract idea (that is, the claims only recite limitations that further limit the algorithm), with the following exceptions:
Claim 10: an alarm component;
Claim 11: a communication component.
Each of these claim limitations does not integrate the exception into a practical application. In particular, each of these limitations does not recite additional elements that amount to significantly more than the judicial exception itself because they are merely insignificant extra-solution activity to the judicial exception, e.g., mere data gathering in conjunction with the abstract idea that uses conventional, routine, and well-known elements, or simply displaying the results of the algorithm using conventional, routine, and well-known elements. In particular, the alarm component may be a buzzer or indicator light, which uses routine, conventional, and well-known components to output the result of the algorithm.
Also, the limitation from claim 11 simply appends well-understood, routine, and conventional activities previously known in the industry, specified at a high level of generality, to the judicial exception, e.g., a claim to an abstract idea requiring no more than a generic computer to perform generic computer functions (here, internet communication) that are well-understood, routine, and conventional activities previously known in the industry (see Electric Power Group, 830 F.3d 1350 (Fed. Cir. 2016); Alice Corp. v. CLS Bank Int'l, 110 USPQ2d 1976 (2014); SAP Am. v. InvestPic, 890 F.3d 1016 (Fed. Cir. 2018)).
In view of the above, the additional elements individually do not integrate the exception into a practical application and do not amount to significantly more than the above-judicial exception (the abstract idea). Looking at the limitations of each claim as an ordered combination in conjunction with the claims from which they depend (that is, as a whole) adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer, for example, or improves any other technology. There is no indication that the combination of elements permits automation of specific tasks that previously could not be automated. There is no indication that the combination of elements includes a particular solution to a computer-based problem or a particular way to achieve a desired computer-based outcome. Rather, the collective functions of the claimed invention merely provide conventional computer implementation, i.e., the computer is simply a tool to perform the process.
The analysis of claim 12 is performed in light of the above analysis of claim 1 and is abridged where limitations are similar.
The analysis of claim 12 is as follows:
Step 1: Claim 12 is drawn to a process.
Step 2A – Prong One: Claim 12 recites an abstract idea. In particular, claim 12 recites the following limitations:
[A1] identifying a nasal area in the current thermal images when a face region in the current visible images is not identified by the real-time object detection algorithm
[B1] determining that respiratory information is abnormal according to the cycles of exhalation and inhalation, as detected through brightness changes in the nostril area
[C1] notifying abnormal respiratory information
These elements [A1]-[C1] of claim 12 are drawn to an abstract idea since they involve a mental process that can be practically performed in the human mind (including observation, evaluation, judgment, and opinion) or with pen and paper.
Step 2A – Prong Two: Claim 12 recites the following limitations that are beyond the judicial exception:
[A2] a processor
[B2] a facial recognition system
[C2] receiving one or more current visible and thermal images
[D2] a detector in the facial recognition system
These elements [A2]-[D2] of claim 12 do not integrate the exception into a practical application of the exception. In particular, the elements [C2]-[D2] are merely adding insignificant extra-solution activity to the judicial exception, i.e., mere data gathering at a higher level of generality - see MPEP 2106.04(d) and MPEP 2106.05(g). Furthermore, the element [A2] is merely an instruction to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.04(d) and MPEP 2106.05(f). Further, the element [B2] merely generally links the use of the judicial exception to a particular technological environment or field of use – see MPEP 2106.05(h).
Additionally, the element “a real-time object detection algorithm” is nothing more than the computer implementation/automation of an abstract mental process of screening a patient, which is what a physician typically does with a patient in a diagnostic setting.
Each of the above elements has been addressed in the above rejection of claim 1 and does not integrate the abstract idea into a practical application. Additionally, the element [B2] comprises only routine, conventional, and well-known elements, each used for its typical purpose; thus their combination into a system is further considered to be routine and conventional, as the elements are not combined in a particular manner to produce a surprising technical effect.
In view of the above, the additional elements individually do not integrate the exception into a practical application and do not amount to significantly more than the above-judicial exception (the abstract idea). Looking at the limitations as an ordered combination (that is, as a whole) adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer, for example, or improves any other technology. There is no indication that the combination of elements permits automation of specific tasks that previously could not be automated. There is no indication that the combination of elements includes a particular solution to a computer-based problem or a particular way to achieve a desired computer-based outcome. Rather, the collective functions of the claimed invention merely provide conventional computer implementation, i.e., the computer is simply a tool to perform the process.
Claims 13-19 depend from claim 12, and recite the same abstract idea as claim 12. Furthermore, these claims only contain recitations that further limit the abstract idea (that is, the claims only recite limitations that further limit the algorithm).
In view of the above, the additional elements individually do not integrate the exception into a practical application and do not amount to significantly more than the above-judicial exception (the abstract idea). Looking at the limitations of each claim as an ordered combination in conjunction with the claims from which they depend (that is, as a whole) adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer, for example, or improves any other technology. There is no indication that the combination of elements permits automation of specific tasks that previously could not be automated. There is no indication that the combination of elements includes a particular solution to a computer-based problem or a particular way to achieve a desired computer-based outcome. Rather, the collective functions of the claimed invention merely provide conventional computer implementation, i.e., the computer is simply a tool to perform the process.
Prior Art
In light of the above-presented clarity rejections, a prior art mapping cannot be performed because the intended scope of the claims is unclear. As stated in In re Steele, 305 F.2d 859, 134 USPQ 292 (CCPA 1962), a rejection under 35 U.S.C. 103 should not be based on considerable speculation regarding the meaning of terms employed in a claim or assumptions that must be made as to the scope of the claim. New grounds of rejection under 35 U.S.C. 102 and/or 103 may be necessitated based on the scope of the clarified claims.
The closest prior art of record is presently considered to be:
US Patent Application Publication Number US 2015/0124067 A1, hereinafter Bala, which teaches a detector which may capture both visible and thermal images (Paragraph 0020). The detector may be coupled to a processor (Paragraph 0034). The system may evaluate respiratory and cardiac function and produce an alert in response to respiratory and/or cardiac failure (Paragraphs 0025-0028 and 0034).
US Patent Application Publication Number US 2012/0289850 A1, hereinafter Xu, teaches a thermal imaging system which may identify face regions including the nostrils (Paragraph 0031). The thermal imaging camera is used to generate values to track a user’s breathing patterns over time by tracking the pixel values of one or more facial features associated with respiration over time to detect periodic peaks and valleys in the thermal imaging data (Paragraphs 0032-0034; Fig. 5A).
International Application Publication Number WO 2014012070 A1, hereinafter Lewis, teaches a monitoring system including a visible and infrared imaging camera and a processor (Page 29, paragraphs 2-5). The system identifies facial regions including the oronasal region and may track these regions through the thermal and/or visual camera system. The visual camera system may be used when light conditions are favorable. Combining both thermal and visual tracking may result in improved tracking accuracy (Page 33, paragraph 3 – page 34, paragraph 1). The system detects whether or not the patient is breathing by detecting temperature changes caused by inhalation and exhalation. The system may generate an alarm based on a time between exhalations being outside of a predetermined range to detect events such as apnea and hyperventilation (Page 34, paragraph 2). The system may further be configured to detect abnormal body temperature by producing an average temperature associated with thermal pixels corresponding to the face of the patient (Page 35, paragraphs 2-3).
US Patent Number US 9750420 B1, hereinafter Agrawal, teaches a computing device which can utilize detectors and image classifiers to detect features of a user’s face including the eyes, nose, and mouth. The detected features may be used for detecting the heart rate of the user by analyzing color and other image data measurements (Col 2 lines 21-40). The detectors may include cameras (Col 2 lines 56-67). The device may select a region of interest, such as the forehead of the user, for image analysis to determine heart rate (Col 11 lines 25-33). The region of interest is analyzed for factors such as intensity or brightness for a period of time. The average intensity for each color channel may be determined to produce a time-based pattern. A Fourier transform may be applied to the time signal to produce a frequency graph. The frequency with the highest amplitude may be determined as the heart rate (Col 12 lines 12-52).
US Patent Application Publication Number US 2018/0035082 A1, hereinafter Patil, teaches a monitoring system with an alert generator (Abstract). The system includes a camera (Paragraph 0027). The system may detect when a subject’s face is covered or partially covered by an object such as a blanket and determine whether the subject is in a safe or unsafe condition based on whether the blanket covers their nose (Paragraphs 0096-0094; Fig. 4).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW ERIC OGLES whose telephone number is (571)272-7313. The examiner can normally be reached M-F 8:00AM - 5:30PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Sims can be reached on Monday-Friday from 9:00AM – 4:00PM at (571) 272 – 7540. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MATTHEW ERIC OGLES/Examiner, Art Unit 3791
/JASON M SIMS/Supervisory Patent Examiner, Art Unit 3791