Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Response to Amendment
In the Amendment dated 07 January 2026, the following occurred:
Claims 1, 9, 10, and 12-19 were amended.
Claim 20 is new.
Claims 1, 2, 4-10, and 12-20 are pending.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 2, 4-10, and 12-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Claims 1, 9, and 17 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1
The claims recite a system and method for determining a classification of a lymphedema induced fluorescence pattern, and therefore fall within the statutory categories of a machine and a process, meeting Step 1.
Step 2A1
The limitations of (Claim 17 being representative) receiving… a fluorescence image…, the fluorescence image being generated from measurement of a fluorescence signal in a tissue of the body part to which the fluorescence agent has been administered; and a corresponding visible light image… of the tissue of the body part, wherein capturing of the fluorescence image and capturing of the visible light image are performed simultaneously; providing… the fluorescence image and the corresponding visible light image as an input feature […for analysis…]; deriving… information from the corresponding visible light image; performing an inference operation… by applying the fluorescence image and the information derived from the corresponding visible light image to […the analysis…] to generate a classification of a lymphedema induced fluorescence pattern; deriving… a diagnostic result from the classification of the lymphedema induced fluorescence pattern; and outputting the classification of the lymphedema induced fluorescence pattern and the diagnostic result to a user…, as drafted, constitute a process that, under the broadest reasonable interpretation, falls within the grouping of certain methods of organizing human activity (i.e., managing personal behavior including following rules or instructions).
That is, other than reciting a system and methods implemented by one or more processors (a general-purpose computing device), the claimed invention amounts to managing personal behavior or interaction between people. If a claim limitation, under its broadest reasonable interpretation, covers managing personal behavior or interactions between people but for the recitation of generic computer components, then it falls within the “certain methods of organizing human activity” grouping of abstract ideas. The Examiner notes that the artificial intelligence (AI) model is described in the Specification at Page 3, Line 19 as encompassing a classification model, which is simple enough that it is considered part of the abstract idea. Accordingly, the claim recites an abstract idea.
Step 2A2
This judicial exception is not integrated into a practical application. In particular, the claims recite the additional element of one or more processors (claims 1, 9, and 17) that implement the identified abstract idea. The computing elements are not described in any detail by the applicant and are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using a generic computer component. See MPEP 2106.05(f). Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
The claims further recite the additional elements of an input interface, a user interface, a fluorescence image sensor, and a visible light image sensor. The input interface, user interface, fluorescence image sensor, and visible light image sensor merely generally link the abstract idea to a particular technological environment or field of use. MPEP 2106.04(d)(I) indicates that generally linking an abstract idea to a particular technological environment or field of use is insufficient to provide a practical application. Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application.
Claim 17 recites the additional element of (a) “administering a fluorescent agent to a body part of a patient” while Claim 19 recites the additional element of (b) “performing a therapy on the patient, the therapy being customized to the diagnostic result relative to the severity of lymphedema.” Regarding (a), the administration of a fluorescent agent to the patient represents extra-solution data gathering. MPEP 2106.04(d)(I) indicates that extra-solution data gathering activity cannot provide a practical application. Regarding (b), the performance of a therapy represents an “apply it” step. MPEP 2106.04(d)(I) indicates that merely saying “apply it” or equivalent to the abstract idea cannot provide a practical application. Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application.
Step 2B
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using a general-purpose computer to perform the noted steps amounts to no more than mere instructions to apply the exception using a generic computer component, which cannot provide an inventive concept (“significantly more”).
As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of an input interface, a user interface, a fluorescence image sensor, and a visible light image sensor were considered to generally link the abstract idea to a particular technological environment or field of use. This has been re-evaluated under the “significantly more” analysis and has also been found insufficient to provide significantly more. MPEP 2106.05(h) indicates that generally linking an abstract idea to a particular technological environment or field of use cannot provide significantly more.
Also, as discussed above with respect to integration of the abstract idea into a practical application, the additional elements of (a) “administering a fluorescent agent to a body part of a patient” and (b) “performing a therapy on the patient, the therapy being customized to the diagnostic result relative to the severity of lymphedema,” were found to be extra-solution data gathering activity and “apply it,” respectively. Regarding (a), this has been re-evaluated under the “significantly more” analysis and determined to be well-understood, routine, conventional activity in the field. The prior art of record indicates that administering a fluorescent dye for diagnostic purposes to a patient is well-understood, routine, conventional activity in the field (see Gurevich, US 2018/0028079 at Para. 0092; Shabanpoor, US 2023/0399378 at Para. 0115; Xu, US 2021/0038065 at Para. 0131; Erturk, US 2020/0209118 at Para. 0196). Regarding (b), the additional element of performing a therapy was determined to be “apply it.” This has been re-evaluated under the “significantly more” analysis and has also been found insufficient to provide significantly more. MPEP 2106.05(I)(A) indicates that merely saying “apply it” or equivalent to the abstract idea cannot provide an inventive concept (“significantly more”). Accordingly, even in combination, these additional elements do not provide significantly more. As such, the claims are not patent eligible.
Claims 2, 4-8, 10, 12-16, and 18-20 are similarly rejected because they either further define/narrow the abstract idea and/or fail to integrate the abstract idea into a practical application or provide an inventive concept, whether considered individually or as an ordered combination.
Claims 2, 10, and 18 merely describe the classification of lymphedema induced fluorescence pattern, which further defines the abstract idea.
Claims 4 and 12 merely describe the input interface and the tissue, which further defines the abstract idea. Claims 5 and 13 merely describe a large fluorescence image, a corresponding large visible light image, and analysis techniques applied to the data, which further defines the abstract idea. Claim 6 merely describes the dichroic prism assembly, which further defines the abstract idea. Claim 20 merely describes receiving pairs of the fluorescence image and the corresponding visible light image, providing input features, and performing the inference operation, which further defines the abstract idea.
Claims 5, 13, and 20 include the additional element of a stitching algorithm applied to images. Under the practical application analysis, this generally links the abstract idea to a particular technological environment. Under the significantly more analysis, the prior art of record indicates that applying a stitching algorithm (which necessarily includes parameters) to images is well-understood, routine, and conventional (see US 2023/0385984 to Cecen et al. at Abstract; US 2023/0326074 to Strandborg et al. at Para. 0002, 0062; US 2021/0168284 to Sjolund et al. at Para. 0002).
Claims 4-6, 12, and 13 further recite an image capturing and processing device comprising an image capturing device, the image capturing device comprising a dichroic prism assembly, an illumination light source, a fluorescence image sensor, and a visible light image sensor; these elements are considered to “generally link” under both the practical application and significantly more analyses.
Claim 7 merely describes the input interface and patient related data, which further defines the abstract idea.
Claims 8, 15, and 16 merely describe patient related data, which further defines the abstract idea.
Claim 14 merely describes the measurement of the fluorescence signal and the capturing of the fluorescence image, which further defines the abstract idea.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 4, 9, 12, 14, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Gurevich et al. (U.S. 2018/0028079) in view of Bradbury et al. (U.S. 2015/0182118) and McDowall et al. (U.S. 2022/0095903), referred to hereinafter as Gurevich, Bradbury, and McDowall, respectively.
REGARDING CLAIM 1
Gurevich teaches the claimed computer-based clinical decision support system (CDSS) comprising: one or more processors comprising hardware, the one or more processors being configured to: [Para. 0090 teaches a system configured to characterize tissue and present image data to enable clinical decision making. Para. 0010 teaches one or more processors.]
perform an inference operation by applying the fluorescence image and the information derived […] to the AI model to generate a classification of a lymphedema induced fluorescence pattern; [Para. 0159 teaches a wound is caused by lymphedema. Para. 0154 teaches using trained neural networks to detect patterns. Various learning models are used, such as error-based learning (logistic regression, support vector machines, and neural networks). Para. 0157 teaches using data derived from a spatial map, which is derived from fluorescence images, as input to the classification neural network model.]
and output the classification of the lymphedema induced fluorescence pattern to a user interface. [Para. 0150 teaches a user interface through which the spatial map image is displayed.]
Gurevich may not explicitly teach
receive, through an input interface: a fluorescence image captured by a fluorescence image sensor, the fluorescence image being generated from measurement of a fluorescence signal in a tissue of a body part to which a fluorescence agent has been added; and
provide the fluorescence image and the corresponding visible light image […] as input features to an artificial intelligence (AI) model;
derive information from the corresponding visible light image;
…from the corresponding visible light image…
However, Bradbury teaches the following:
receive, through an input interface, a fluorescence image captured by a fluorescence image sensor, the fluorescence image being generated from measurement of a fluorescence signal in a tissue of a body part to which a fluorescence agent has been added; and [Para. 0008 teaches a fluorescence imaging camera. Para. 0189 teaches an input and display device (input interface).]
provide the fluorescence image and the corresponding visible light image […] as input features to an artificial intelligence (AI) model; [Para. 0188 teaches broad spectrum light and the emitted fluorescence light are collected in the camera by one or more sensors.]
derive information from the corresponding visible light image; [Para. 0264 teaches extrapolation is based on an analysis of the full spectrum, including light image data.]
…from the corresponding visible light image… [Para. 0264 teaches extrapolation is based on an analysis of the full spectrum, including light image data.]
Therefore, it would have been prima facie obvious to one of ordinary skill in the art of computerized healthcare, before the effective filing date of the invention, to modify the computer-implemented system of Gurevich to provide a visible light image as taught by Bradbury, with the motivation of reducing the risk of lymphedema (see Bradbury at Para. 0004).
Gurevich in view of Bradbury may not explicitly teach
a corresponding visible light image, captured by a visible light image sensor, of the tissue of the body part, wherein capturing of the fluorescence image and capturing of the visible light image are performed simultaneously;
that were captured simultaneously
However, McDowall teaches the following:
a corresponding visible light image, captured by a visible light image sensor, of the tissue of the body part, wherein capturing of the fluorescence image and capturing of the visible light image are performed simultaneously; [Para. 0072 teaches capturing visible light and fluorescence illumination simultaneously (interpreted as the images of Gurevich/Bradbury).]
that were captured simultaneously [Para. 0072 teaches capturing visible light and fluorescence illumination simultaneously (interpreted as the images of Gurevich/Bradbury).]
Therefore, it would have been prima facie obvious to one of ordinary skill in the art of computerized healthcare, before the effective filing date of the invention, to modify the computer-implemented system of Gurevich in view of Bradbury to capture the fluorescence image and the visible light image simultaneously as taught by McDowall, with the motivation of improving a surgeon’s efficiency (see McDowall at Para. 0027).
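For illustration only (and not as a characterization of any cited reference or of Applicant’s actual model), the inference operation recited in Claim 1 — applying a fluorescence image together with information derived from a corresponding visible light image to an AI model to generate a pattern classification — can be sketched as follows. All names, feature choices, and the linear classifier stand-in are hypothetical assumptions.

```python
import numpy as np

# Illustrative pattern labels (hypothetical; not taken from the claims or references).
PATTERN_CLASSES = ["linear", "splash", "stardust", "diffuse"]

def derive_visible_info(visible_img: np.ndarray) -> np.ndarray:
    """Stand-in for 'derive information from the corresponding visible
    light image': mean intensity per color channel."""
    return visible_img.reshape(-1, visible_img.shape[-1]).mean(axis=0)

def infer_pattern(fluor_img: np.ndarray, visible_img: np.ndarray,
                  weights: np.ndarray, bias: np.ndarray) -> str:
    """Apply the fluorescence image plus visible-light-derived information
    to a linear classifier (a stand-in for the AI model) and return the
    classification of the fluorescence pattern."""
    features = np.concatenate([fluor_img.ravel(),
                               derive_visible_info(visible_img)])
    logits = weights @ features + bias
    return PATTERN_CLASSES[int(np.argmax(logits))]

# Toy usage: a 4x4 fluorescence image and a 4x4x3 visible light image.
rng = np.random.default_rng(0)
fluor = rng.random((4, 4))
visible = rng.random((4, 4, 3))
w = rng.random((len(PATTERN_CLASSES), 16 + 3))  # 16 fluor pixels + 3 channel means
b = np.zeros(len(PATTERN_CLASSES))
print(infer_pattern(fluor, visible, w, b) in PATTERN_CLASSES)  # True
```

The sketch only fixes the claimed data flow (two co-registered inputs, one combined feature vector, one classification output); any real system would substitute a trained model for the random weights.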
REGARDING CLAIM 4
Gurevich in view of Bradbury and McDowall teaches the claimed system comprising: the computer-based CDSS according to claim 1.
Bradbury further teaches
an image capturing device comprising: [Para. 0008 teaches an imaging device.]
an illumination light source configured to illuminate the tissue with excitation light having a wavelength suitable to generate emitted light by excited emission of the fluorescent agent; [Para. 0188 teaches a light engine configured to generate excitation light at target wavelengths and to excite each of the fluorescent agents.]
the fluorescence image sensor configured to capture the fluorescence image by spatially resolved measurement of the emitted light so as to provide the fluorescence image; and [Para. 0247 teaches a fluorescence image sensor.]
the visible light image sensor configured to capture the corresponding visible light image, the corresponding visible light image being a visible light image of a section of a surface of the body part, [Para. 0247 teaches a visible light image sensor.]
wherein the fluorescence image sensor and the visible light image sensor are configured in that one or more of a viewing direction and a perspective of the fluorescence image and the corresponding visible light image are linked via a known relationship. [Para. 0214 teaches measuring a spatial coordinate. Para. 0215 teaches the imaging system is calibrated via a known relationship.]
REGARDING CLAIMS 9 AND 17
Claims 9 and 17 are analogous to Claim 1, thus Claims 9 and 17 are similarly analyzed and rejected in a manner consistent with the rejection of Claim 1.
REGARDING CLAIM 12
Claim 12 is analogous to Claim 4, thus Claim 12 is similarly analyzed and rejected in a manner consistent with the rejection of Claim 4.
REGARDING CLAIM 14
Gurevich in view of Bradbury and McDowall teaches the claimed computer-implemented method of claim 9.
Gurevich further teaches
wherein the first fluorescence image […] [is] provided through the input interface as [an] input feature(s) to the AI model, and [Para. 0037, 0090 teaches a system configured to characterize tissue using fluorescence imaging agents. Para. 0009 teaches using images as input features of a machine learning algorithm.]
Bradbury further teaches
…the second fluorescence image… [Para. 0027, 0065 teaches providing the first and second fluorescence images.]
wherein the measurement of the fluorescence signal is performed on the tissue, to which at least a first fluorescent agent and a second fluorescent agent have been added, wherein the capturing of the fluorescence image comprises: [Para. 0270 teaches a tissue absorbing a fluorescent species (agent). Para. 0027 teaches a first fluorescent reporter (agent) and a second fluorescent reporter. Para. 0065 teaches capturing an image of the fluorescence signal.]
capturing a first fluorescence image in a first wavelength range, which is generated by illuminating the tissue with first excitation light having a first wavelength suitable to generate emitted light by a first excited emission of the first fluorescent agent, and capturing a second fluorescence image in a second wavelength range, which is generated by illuminating the tissue with second excitation light having a second wavelength suitable to generate emitted light by a second excited emission of the second fluorescent agent, [Para. 0228 teaches various wavelength ranges. It is clear to a skilled person that a dichroic prism assembly is a means of light separation which can be configured to separate light into arbitrary wavelength ranges as required by a desired application.]
wherein the computer-implemented method comprises performing the inference operation by applying the first fluorescence image, […] and the information derived from the corresponding visible light image to the AI model to generate the classification of the lymphedema induced fluorescence pattern. [Para. 0159 teaches a wound is caused by lymphedema. Para. 0154 teaches using trained neural networks to detect patterns. Various learning models are used, such as error-based learning (logistic regression, support vector machines, and neural networks). Para. 0157 teaches using data derived from a spatial map, which is derived from fluorescence images, as input to the classification neural network model.]
Claims 2, 10, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Gurevich in view of Bradbury, McDowall, Miyashita et al. (U.S. 2018/0306717), and Jae Yong Jeon et al. (KR 2022/0049060), referred to hereinafter as Miyashita and Jae, respectively.
REGARDING CLAIM 2
Gurevich in view of Bradbury and McDowall teaches the claimed computer-based CDSS according to claim 1.
Gurevich in view of Bradbury and McDowall may not explicitly teach
wherein the classification of the lymphedema induced fluorescence pattern...
However, Miyashita teaches the following:
wherein the classification of the lymphedema induced fluorescence pattern... [Para. 0008 teaches the fluorescence pattern indicates a lymphatic drainage path.]
Therefore, it would have been prima facie obvious to one of ordinary skill in the art of computerized healthcare, before the effective filing date of the invention, to modify the computer-implemented system of Gurevich in view of Bradbury and McDowall to classify a lymphedema induced fluorescence pattern as taught by Miyashita, with the motivation of precisely recognizing a lymphatic drainage path (see Miyashita at Para. 0005).
Gurevich in view of Bradbury, McDowall, and Miyashita may not explicitly teach
…is one or more of a stage of severity of lymphedema and a clinical type of the fluorescence pattern.
However, Jae teaches the following:
…is one or more of a stage of severity of lymphedema and a clinical type of the fluorescence pattern. [Para. 0120 teaches stages of lymphedema.]
Therefore, it would have been prima facie obvious to one of ordinary skill in the art of computerized healthcare, before the effective filing date of the invention, to modify the computer-implemented system of Gurevich in view of Bradbury, McDowall, and Miyashita to include a stage of lymphedema as taught by Jae, with the motivation of decreasing subjectivity in diagnosis (see Jae at Para. 0004).
REGARDING CLAIMS 10 AND 18
Claims 10 and 18 are analogous to Claim 2, thus Claims 10 and 18 are similarly analyzed and rejected in a manner consistent with the rejection of Claim 2.
Claims 5 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Gurevich in view of Bradbury, McDowall, Miyashita, and Wong et al. (U.S. 2023/0000600), referred to hereinafter as Wong.
REGARDING CLAIM 5
Gurevich in view of Bradbury and McDowall teaches the claimed system of claim 4.
Gurevich further teaches
provide the large fluorescence image […] as the input features to the AI model; [Para. 0157 teaches using data derived from a spatial map, which is derived from fluorescence images, as input to the classification neural network model.]
Bradbury further teaches
…and the corresponding large visible light image…; [Para. 0188 teaches broad spectrum light and the emitted fluorescence light are collected in the camera by one or more sensors.]
derive information from the corresponding large visible light image; and [Para. 0264 teaches extrapolation is based on an analysis of the full spectrum, including light image data.]
perform the inference operation by applying the large fluorescence image and the information derived […] to the AI model to generate the classification of the lymphedema induced fluorescence pattern. [Para. 0159 teaches a wound is caused by lymphedema. Para. 0154 teaches using trained neural networks to detect patterns. Various learning models are used, such as error-based learning (logistic regression, support vector machines, and neural networks). Para. 0157 teaches using data derived from a spatial map, which is derived from fluorescence images, as input to the classification neural network model.]
…from the large corresponding visible light image… [Para. 0264 teaches extrapolation is based on an analysis of the full spectrum, including light image data.]
Gurevich in view of Bradbury and McDowall may not explicitly teach
wherein the fluorescence image sensor and the visible light image sensor are further configured to repeat capturing of the fluorescence image and the corresponding visible light image to provide a series of fluorescence images and a series of corresponding visible light images,
However, Miyashita teaches the following:
wherein the fluorescence image sensor and the visible light image sensor are further configured to repeat capturing of the fluorescence image and the corresponding visible light image to provide a series of fluorescence images and a series of corresponding visible light images, [Para. 0052 teaches an image acquisition unit configured to acquire fluorescence images in series.]
Motivation to combine the teachings of Miyashita with the teachings of Gurevich, Bradbury, and McDowall is the same as that presented with respect to Claim 2 and is incorporated herein.
Gurevich in view of Bradbury, McDowall, and Miyashita may not explicitly teach
and wherein the one or more processors are configured to:
apply a stitching algorithm on the series of corresponding visible light images to generate a corresponding large visible light image of the body part, wherein the stitching algorithm determines and applies a set of stitching parameters;
apply the stitching algorithm on the series of fluorescence images to generate a large fluorescence image, wherein the stitching algorithm applies the set of stitching parameters determined when performing the stitching of the corresponding visible light images;
However, Wong teaches the following:
and wherein the one or more processors are configured to: [Para. 0148 teaches one or more processors.]
apply a stitching algorithm on the series of corresponding visible light images to generate a corresponding large visible light image of the body part, wherein the stitching algorithm determines and applies a set of stitching parameters; [Para. 0093 teaches using visible light to capture images. Para. 0048 teaches applying a stitching algorithm to generate a composite image.]
apply the stitching algorithm on the series of fluorescence images to generate a large fluorescence image, wherein the stitching algorithm applies the set of stitching parameters determined when performing the stitching of the corresponding visible light images; [Para. 0064 teaches an imaging apparatus that acquires fluorescence images. Para. 0048 teaches applying a stitching algorithm to generate a composite image (large fluorescence image).]
Therefore, it would have been prima facie obvious to one of ordinary skill in the art of computerized healthcare, before the effective filing date of the invention, to modify the computer-implemented system of Gurevich in view of Bradbury, McDowall, and Miyashita to apply a stitching algorithm as taught by Wong, with the motivation of organizing and displaying the information obtained from multiple imaging modes (see Wong at Para. 0107).
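For illustration only (and not as a characterization of Wong’s algorithm), the two-pass stitching recited in Claims 5 and 13 — determining stitching parameters on the visible light series and reusing those same parameters, unchanged, on the fluorescence series — can be sketched as follows. The fixed-overlap estimation and all names are hypothetical assumptions; a real implementation would estimate the parameters by image registration.

```python
import numpy as np

def estimate_offsets(visible_series):
    """Stand-in parameter estimation: horizontal offsets assuming a known,
    fixed 2-pixel overlap between consecutive frames."""
    overlap = 2  # assumed overlap (hypothetical)
    w = visible_series[0].shape[1]
    return [(w - overlap) * i for i in range(len(visible_series))]

def stitch(series, offsets):
    """Paste each frame onto a canvas at its precomputed horizontal offset."""
    h = series[0].shape[0]
    width = offsets[-1] + series[0].shape[1]
    canvas = np.zeros((h, width))
    for frame, off in zip(series, offsets):
        canvas[:, off:off + frame.shape[1]] = frame
    return canvas

# Toy series: three 3x4 visible frames and three co-registered fluorescence frames.
visible = [np.full((3, 4), i, float) for i in range(3)]
fluor = [np.full((3, 4), 10 + i, float) for i in range(3)]

params = estimate_offsets(visible)       # parameters determined on visible images
large_visible = stitch(visible, params)  # corresponding large visible light image
large_fluor = stitch(fluor, params)      # same parameters reused on fluorescence images
print(large_fluor.shape)  # (3, 8)
```

The key claimed feature this sketch captures is parameter reuse: `params` is computed once from the visible light series and applied verbatim to the fluorescence series, so the two large images remain spatially aligned.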
REGARDING CLAIM 13
Claim 13 is analogous to Claim 5, thus Claim 13 is similarly analyzed and rejected in a manner consistent with the rejection of Claim 5.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Gurevich in view of Bradbury, McDowall, and Meester et al. (WO 2018/164579), referred to hereinafter as Meester.
REGARDING CLAIM 6
Gurevich in view of Bradbury and McDowall teaches the claimed system of claim 4.
Gurevich in view of Bradbury and McDowall may not explicitly teach
wherein the image capturing device comprises a dichroic prism assembly configured to receive fluorescent light and visible light through an entrance face, the dichroic prism assembly comprising:
a first prism, a second prism,
a first compensator prism located between the first prism and the second prism,
a second dichroic prism assembly configured to split the visible light in three light components;
and a second compensator prism located between the second prism and the second dichroic prism assembly,
wherein the first prism and the second prism each have a cross section with at least five corners, each corner having an inside angle of at least 90 degrees,
wherein the corners of the first prism and the second prism each have a respective entrance face and a respective exit face,
and are each configured so that an incoming beam which enters the entrance face of the respective first and second prisms in a direction parallel to a normal of said entrance face is reflected twice inside the respective first and second prisms and exits the respective first and second prisms through their exit face parallel to a normal of said exit face,
wherein the normal of the entrance face and the normal of the exit face of the respective first and second prisms are perpendicular to each other, and
wherein, when light enters the first prism through the entrance face, the light is partially reflected towards the exit face of the first prism thereby traveling a first path length from the entrance face of the first prism to the exit face of the first prism,
and the light partially enters the second prism via the first compensator prism and is partially reflected towards the exit face of the second prism, thereby traveling a second path length from the entrance face of the first prism to the exit face of the second prism,
and wherein the first prism is larger than the second prism so that the first and the second path lengths are the same.
However, Meester teaches the following:
[Fig. 5 of Meester — media_image1.png, greyscale]
Fig. 5 of Meester, reproduced above, teaches a five-channel dichroic prism assembly that is the same as the dichroic prism assembly described in Fig. 5 and the associated text of Applicant’s present disclosure.
Therefore, it would have been prima facie obvious to one of ordinary skill in the art of computerized healthcare, before the effective filing date of the invention, to modify the computer-implemented system of Gurevich in view of Bradbury and McDowall to use a dichroic prism assembly as taught by Meester, with the motivation of avoiding time-switching of the received signals (see Meester at Page 3, Line 32).
Claims 7, 8, 15, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Gurevich in view of Bradbury, McDowall, and Barnard et al. (CN 116830207), referred to hereinafter as Barnard.
REGARDING CLAIM 7
Gurevich in view of Bradbury and McDowall teaches the claimed computer-based CDSS of claim 1.
Gurevich further teaches
perform the inference operation by applying the fluorescence image [and] the information derived from the corresponding visible light image […] to the AI model to generate the classification of the lymphedema induced fluorescence pattern. [Para. 0159 teaches a wound is caused by lymphedema. Para. 0154 teaches using trained neural networks to detect patterns. Various learning models are used, such as error-based learning (logistic regression, support vector machines, and neural networks). Para. 0157 teaches using data derived from a spatial map, which is derived from fluorescence images, as input to the classification neural network model.]
Gurevich in view of Bradbury and McDowall may not explicitly teach
…the patient related data…
wherein the one or more processors are configured to: receive, through the input interface as a direct link, patient related data recorded on an electronic patient record; provide the patient related data as further input features to the AI model; and
However, Barnard teaches the following:
…the patient related data… [Para. 0083 teaches a data collection module receiving medical data directly from a portal, which allows users to import document files from a patient data source. Patient data sources include EMR systems.]
wherein the input interface further is a direct link to an electronic patient record, wherein patient related data are provided through the input interface as further input features to the AI model. [Para. 0083 teaches a data collection module receiving medical data directly from a portal, which allows users to import document files from a patient data source. Patient data sources include EMR systems. Para. 0027 teaches the portal includes an input interface. Para. 0010 teaches applying a machine learning model to the medical record.]
Therefore, it would have been prima facie obvious to one of ordinary skill in the art of computerized healthcare, before the effective filing date of the invention, to modify the computer-implemented system of Gurevich in view of Bradbury and McDowall to link to electronic medical records via an input interface as taught by Barnard, with the motivation of improving access to a patient’s medical data (see Barnard at Para. 0044).
REGARDING CLAIM 8
Gurevich in view of Bradbury, McDowall, and Barnard teaches the claimed computer-based CDSS of claim 7.
Gurevich further teaches
wherein the patient related data comprises one or more of data relative to: age, gender, height, weight, Body Mass Index, fat mass, muscle mass, daily exercise mass, presence or absence of work, skin color, medication status, presence or absence of vascular disease, presence or absence of disease, dialysis or diabetes, amount of albumin in blood, kidney function, liver function, heart function, Hemoglobin concentration in the blood, blood estimate, lipid metabolism, blood glucose concentration in the blood, urea or nitrogen, ankle/humeral index value; lymphatic function measurement data at the same location before the occurrence of lymphedema, endocrine information and hormone level of the patient. [Para. 0156 teaches non-clinical data comprises a subject’s age.]
REGARDING CLAIMS 15 AND 16
Claims 15 and 16 are analogous to Claims 7 and 8, thus Claims 15 and 16 are similarly analyzed and rejected in a manner consistent with the rejections of Claims 7 and 8.
Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Gurevich in view of Bradbury, McDowall, Miyashita, Jae, and Weiler et al. (U.S. 2016/0235354), referred to hereinafter as Weiler.
REGARDING CLAIM 19
Gurevich in view of Bradbury, McDowall, Miyashita, and Jae teaches the claimed computer-implemented method of long-term therapy of lymphedema, comprising:
diagnosing a severity of lymphedema by performing the computer-implemented method of diagnosing lymphedema according to claim 18; [Gurevich in view of Bradbury, McDowall, Miyashita, and Jae teaches claim 18.]
Gurevich in view of Bradbury and McDowall may not explicitly teach
performing a therapy on the patient, the therapy being customized to the diagnostic result relative to the severity of lymphedema; and
repeating the diagnosing of the severity of lymphedema and the performing of the therapy on the patient, wherein in each iteration of the repeating, the therapy is adjusted to the stage of lymphedema detected.
However, Weiler teaches the following:
performing a therapy on the patient, the therapy being customized to the diagnostic result relative to the severity of lymphedema; and [Para. 0002 teaches treating a patient with lymphedema. The severity of lymphedema is categorized by percent increase in extremity volume. Limb volume measurements can be made using a tape measure. Para. 0074 teaches taking serial tape measurements along the limb(s) in question and creating custom garments (therapy) to the specifications.]
repeating the diagnosing of the severity of lymphedema and the performing of the therapy on the patient, wherein in each iteration of the repeating, the therapy is adjusted to the stage of lymphedema detected. [Para. 0072 teaches repeatedly monitoring lymphedema progression (severity) to determine whether the patient needs an adjustment to their compression garment (therapy).]
Therefore, it would have been prima facie obvious to one of ordinary skill in the art of computerized healthcare, before the effective filing date of the invention, to modify the computer-implemented system of Gurevich in view of Bradbury and McDowall to repeat the diagnosing and performing of a therapy as taught by Weiler, with the motivation of improving treatment outcomes and reducing medical costs (see Weiler at Para. 0044).
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Gurevich in view of Bradbury, McDowall, and Wong.
REGARDING CLAIM 20
Gurevich in view of Bradbury and McDowall teaches the claimed computer-based CDSS according to claim 1.
McDowall further teaches
wherein the one or more processors are configured to: [Para. 0004 teaches a processor.]
receive, through the input interface, a plurality of simultaneously captured pairs of the fluorescence image and the corresponding visible light image, of different sections of the tissue of the body, wherein each of the pairs of the fluorescence image and the corresponding visible light image are linked by known and constant relationship with respect to viewing direction and/or perspective in which the each of the pair of the fluorescence image and the corresponding visible light image is captured; [Para. 0004 teaches capturing visible light and fluorescence illumination from a surgical area, and generating a visible light image stream based on the captured visible light and a fluorescence image stream based on the captured fluorescence illumination.]
Gurevich further teaches
provide the large fluorescence image […] as the input features to the AI model; and [Para. 0157 teaches using data derived from a spatial map, which is derived from fluorescence images, as input to the classification neural network model.]
perform the inference operation by applying the large fluorescence image and the corresponding large visible light image provided to the AI model to generate the classification of the lymphedema induced fluorescence pattern. [Para. 0159 teaches a wound is caused by lymphedema. Para. 0154 teaches using trained neural networks to detect patterns. Various learning models are used, such as error-based learning (logistic regression, support vector machines, and neural networks). Para. 0157 teaches using data derived from a spatial map, which is derived from fluorescence images, as input to the classification neural network model.]
Bradbury further teaches
…and the corresponding large visible light image…; [Para. 0188 teaches broad spectrum light and the emitted fluorescence light are collected in the camera by one or more sensors.]
Gurevich in view of Bradbury and McDowall may not explicitly teach
apply a stitching algorithm on the fluorescence images from the plurality of simultaneously captured pairs of the fluorescence image and the corresponding visible light image, to generate a large fluorescence image, wherein the stitching algorithm applies a set of stitching parameters;
apply the same stitching algorithm on the visible light images from the plurality of simultaneously captured pairs of the fluorescence image and the corresponding visible light image, to generate a corresponding large visible light image, wherein the same stitching algorithm applies a same set of stitching parameters;
However, Wong teaches the following:
apply a stitching algorithm on the fluorescence images from the plurality of simultaneously captured pairs of the fluorescence image and the corresponding visible light image, to generate a large fluorescence image, wherein the stitching algorithm applies a set of stitching parameters; [Para. 0064 teaches an imaging apparatus that acquires fluorescence images. Para. 0048 teaches applying a stitching algorithm to generate a composite image (large fluorescence image).]
apply the same stitching algorithm on the visible light images from the plurality of simultaneously captured pairs of the fluorescence image and the corresponding visible light image, to generate a corresponding large visible light image, wherein the same stitching algorithm applies a same set of stitching parameters; [Para. 0093 teaches using visible light to capture images. Para. 0048 teaches applying a stitching algorithm to generate a composite image.]
Therefore, it would have been prima facie obvious to one of ordinary skill in the art of computerized healthcare, before the effective filing date of the invention, to modify the computer-implemented system of Gurevich in view of Bradbury and McDowall to apply a stitching algorithm as taught by Wong, with the motivation of organizing and displaying the information obtained from multiple imaging modes (see Wong at Para. 0107).
Response to Arguments
Claim Objections
Regarding the objection to Claims 14 and 17, the Applicant has amended the claims such that the prior objection is no longer required. The prior objection has been withdrawn.
Rejections under 35 U.S.C. § 103
Regarding the rejection of Claims 1, 2, 4-10, and 12-19, the Examiner has considered Applicant’s arguments; however, the arguments are moot given the new grounds of rejection as necessitated by amendment.
Rejection under 35 U.S.C. § 101
Regarding the rejection of Claims 1, 2, 4-10, and 12-19, the Examiner has considered the Applicant’s arguments; however, the arguments are not persuasive. Applicant argues:
…Applicant respectfully submits that claim 1 recites additional limitations that provide an improvement to the technical field of computer-based classification of a lymphedema induced fluorescence pattern.
Regarding (a), the Examiner respectfully disagrees. MPEP 2106.04(d)(1) states “the word ‘improvements’ in the context of this consideration is limited to improvements to the functioning of a computer or any other technology/technical field, whether in Step 2A Prong Two or in Step 2B.” Here, there is no improvement to the functioning of a computer, nor is there an improvement to another technology or technical field. Because neither type of improvement is present, the claims do not integrate the judicial exception into a practical application.
Applicant’s argument that the field of computer-based classification of a lymphedema induced fluorescence pattern is a technology and that the claimed invention improves this field is not reflected in the claimed invention. The claims are confined to a general-purpose computer. Moreover, the entire field of computer-based classification of a lymphedema induced fluorescence pattern is not reasonably understood to be a problem arising in technology, as it is instead a problem arising in healthcare. The claimed invention is using a computer as a tool, and any improvement present is an improvement to the abstract idea of, to paraphrase, classifying a lymphedema induced fluorescence pattern. Finally, were Applicant’s line of reasoning correct, the invention in Alice Corp. would have been subject matter eligible because it was an improvement to the technology of settlement risk mitigation.
The specifically recited input of higher quality inputs into the AI model is not implementation of well-understood, routine, conventional activity.
Regarding (b), the Examiner respectfully disagrees. MPEP 2106.05(d) states: “Another consideration when determining whether a claim recites significantly more than a judicial exception is whether the additional element(s) are well-understood, routine, conventional activities previously known to the industry (emphasis added).” Further, MPEP 2106.05(I) states: “As made clear by the courts, the novelty of any element or steps in a process, or even of the process itself, is of no relevance in determining whether the subject matter of a claim falls within the § 101 categories of possibly patentable subject matter (internal quotations omitted, emphasis original).” As such, it is only the additional elements identified by the Examiner to not be part of the abstract idea that are analyzed to determine whether they represent well-understood, routine, conventional activities in the field of the invention.
In that regard, MPEP 2106.05(d)(I) indicates that in determining whether the additional elements represent well-understood, routine, conventional activities, the Examiner should consider whether the additional elements (1) provide an improvement to the technological environment to which the claim is confined, (2) are mere instructions to apply the judicial exception, or (3) represent insignificant extra-solution activity. The additional elements of the claims do not provide significantly more based on this inquiry.
Taking these in turn, whether the additional elements of the claim provide an improvement was analyzed/addressed in the 2A2 analysis. The technological environment to which the claims are confined (a general-purpose computer performing generic computer functions) is recited at a high level of generality and has been found by the courts to be insufficient to provide a practical application (see MPEP 2106.05(d)(II); Alice Corp.). The additional elements of (a) “administering a fluorescent agent to a body part of a patient” and (b) “performing a therapy on the patient, the therapy being customized to the diagnostic result relative to the severity of lymphedema,” that were found to represent extra-solution activity were analyzed and determined to represent well-understood, routine, conventional activities in the field. As such, when viewed either individually or as an ordered combination, the additional elements do not provide significantly more to the abstract idea and the claims are not subject matter eligible.
Conclusion
Prior art made of record though not relied upon in the present basis of rejection is noted in the attached PTO-892 and includes:
Schroeter et al. (U.S. 2024/0111276) which discloses methods and systems for verification of machine learning-based varnish analysis.
Yang et al. (U.S. 2024/0280489) which discloses a method for estimating mass of microplastics by using fluorescent staining.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CAMRYN B LEWIS whose telephone number is (703)756-1807. The examiner can normally be reached Monday - Friday, 11:00 am - 8:00 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Robert W Morgan can be reached on (571)272-6773. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CAMRYN B LEWIS/
Examiner, Art Unit 3683
/JASON S TIEDEMAN/Primary Examiner, Art Unit 3683