DETAILED ACTION
This action is made in response to the request for continued examination filed on January 27, 2026. This action is made non-final.
Claims 1, 9, and 10 are pending. Claims 2-8 are presently cancelled. Claims 1, 9, and 10 have been amended and are independent claims.
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on January 27, 2026 has been entered.
Response to Arguments
Applicant’s arguments filed on January 27, 2026 with respect to the prior art rejection have been fully considered but are moot in light of the new grounds of rejection. Applicant’s arguments with respect to the 101 rejection have been fully considered but are not persuasive.
As to the previous 101 rejection, Applicant argues (on page 9 of the Remarks) that the portable device configured with a trained neural-network model and the privacy-preserving ingestion and reacquisition protocol are limitations that improve the technical operation and reliability of the portable diagnostic device and integrate the abstract idea into a practical application. However, the examiner respectfully disagrees.
As a first matter, MPEP 2106.04(d)(1) and MPEP 2106.05(a) indicate that a practical application may be present where the claimed invention provides a technical solution to a technical problem. See, e.g., DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245, 1259 (Fed. Cir. 2014) (finding that claiming a website that retained the “look and feel” of a host webpage provided a technological solution to the problem of retention of website visitors by utilizing a website descriptor that emulated the “look and feel” of the host webpage, where the problem arose out of the internet and was thus a technical problem). Here, the Applicant’s argued problem is not a technological problem caused by the technological environment to which the claims are confined. Assuming, arguendo, that the claims are directed to “preventing leakage of unnecessary personal information,” the problem of data security was not a problem caused by the computer, but rather is a problem that existed and/or exists regardless of whether a computer is involved in the process. At best, Applicant’s identified problem is a data management problem. Because no technological problem is present, the claims do not provide a practical application.
Furthermore, MPEP 2106.04(d)(1) states “the word ‘improvements’ in the context of this consideration is limited to improvements to the functioning of a computer or any other technology/technical field, whether in Step 2A Prong Two or in Step 2B.” Here, there is no improvement to the computer nor is there an improvement to another technology. As stated above, the concept of data security is not a technology or technical field. Because neither type of improvement is present in the claims, an improvement to technology is not present and there is no practical application.
Accordingly, for at least the reasons stated above and reiterated below, the previous 101 rejection is maintained.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 9, and 10 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Claim 1 recites a diagnosis support device for preventing leakage of personal information, which is within the statutory class of a machine. Claim 9 recites a method of preventing leakage of personal information, which is within the statutory category of a process. Claim 10 recites a non-transitory computer-readable medium storing instructions for preventing leakage of personal information, which is within the statutory class of a manufacture.
Claims are eligible for patent protection under § 101 if they are in one of the four statutory categories and not directed to a judicial exception to patentability. Alice Corp. v. CLS Bank Int'l, 573 U.S. ___ (2014). Claims 1, 9, and 10, each considered as a whole and as an ordered combination, are directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
MPEP 2106 Step 2A – Prong 1:
The bolded limitations of:
Claims 1, 9, and 10 (claim 9 being representative)
receiving, from an external device, an image file including a medical image and accessory information, and storing the image file in the memory; determining whether the accessory information includes personal information unnecessary for computer-aided diagnostic processing; in a case where it is determined that the accessory information includes the unnecessary personal information, discarding the received image file stored in the memory and instructing the external device to transmit a new image file in which the unnecessary personal information has been deleted from the accessory information; receiving the new image file transmitted from the external device and store the new image file in the memory; executing the computer-aided diagnostic processing by inputting the medical image included in the new image file stored in the memory into the trained model and obtaining a detection result of a region including an abnormal shadow; and transmitting, to the external device, an image including the detection result.
as presently drafted, under the broadest reasonable interpretation, cover a method of organizing human activity (i.e., managing personal behavior, including following rules or instructions) but for the recitation of generic computer components. For example, but for the noted computer elements, the claim encompasses a person following rules or instructions to analyze medical images for personal information, instructing an external person/source not to send medical images containing personal information, and analyzing the medical images for diagnostic purposes in the manner described in the abstract idea. The examiner further notes that “methods of organizing human activity” includes a person’s interaction with a computer (see October 2019 Update: Subject Matter Eligibility at Pg. 5). If a claim limitation, under its broadest reasonable interpretation, covers managing personal behavior or interactions between people but for the recitation of generic computer components, then it falls within the “methods of organizing human activity” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.
MPEP 2106 Step 2A – Prong 2:
This judicial exception is not integrated into a practical application because there are no meaningful limitations that transform the exception into a patent-eligible application. The additional elements merely amount to instructions to apply the exception using generic computer components (“a processor”; “a memory”; “built-in battery”; “computer-aided”; “device”; “non-transitory computer-readable storage medium”—all recited at a high level of generality). Although these components store and execute instructions to perform the abstract idea itself, this does not serve to integrate the abstract idea into a practical application, as it merely amounts to instructions to “apply it.” (See MPEP 2106.04(d), indicating that mere instructions to apply an abstract idea do not amount to integrating the abstract idea into a practical application.) Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose meaningful limits on practicing the abstract idea. Therefore, the claims are directed to an abstract idea.
The “diagnosis support device” is not a generic computer component; however, it is recited at a high level of generality and similarly amounts to generally linking the abstract idea to a particular technological environment. (See MPEP 2106.04(d)(1), indicating that generally linking an abstract idea to a particular technological environment does not amount to integrating the abstract idea into a practical application.)
The claim further recites the additional elements of (1) a trained model configured using a neural network and (2) executing the computer-aided diagnostic processing by inputting the medical image into the trained model. Regarding (1), when given the broadest reasonable interpretation in light of the nonexistent description of model training in the disclosure, training of a model configured using a neural network with the noted data amounts to a mathematical concept that creates data associations. As such, the training of the model is interpreted to be subsumed within the identified abstract idea. See July 2024 Subject Matter Eligibility Examples, Example 47, Claim 2, discussion of item (c) at Pgs. 7-9. Regarding (2), the use of the trained model provides nothing more than mere instructions to implement the abstract idea on a generic computer (“apply it”). See MPEP 2106.05(f); July 2024 Subject Matter Eligibility Examples, Example 47, Claim 2, discussion of items (d) and (e) at Pgs. 8-9.
The claims only manipulate abstract data elements into another form. They do not set forth improvements to another technological field or the functioning of the computer itself and instead use computer elements as tools in a conventional way to improve the functioning of the abstract idea identified above. Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation. None of the additional elements recited "offers a meaningful limitation beyond generally linking 'the use of the [method] to a particular technological environment,' that is, implementation via computers." Alice Corp., slip op. at 16 (citing Bilski v. Kappos, 561 U.S. 610, 611 (U.S. 2010)).
At the levels of abstraction described above, the claims do not readily lend themselves to a finding that they are directed to a nonabstract idea. Therefore, the analysis proceeds to step 2B. See BASCOM Global Internet v. AT&T Mobility LLC, 827 F.3d 1341, 1349 (Fed. Cir. 2016) ("The Enfish claims, understood in light of their specific limitations, were unambiguously directed to an improvement in computer capabilities. Here, in contrast, the claims and their specific limitations do not readily lend themselves to a step-one finding that they are directed to a nonabstract idea. We therefore defer our consideration of the specific claim limitations’ narrowing effect for step two.") (citations omitted).
MPEP 2106 Step 2B:
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception for the same reasons as presented in Step 2A Prong 2. Moreover, the additional elements recited are known and conventional generic computing elements (“a processor”; “a memory”; “built-in battery”; “computer-aided”; “device”; “non-transitory computer-readable storage medium”—see Specification [0050], [0120]-[0122] describing the various components as general purpose, common, standard, known to one of ordinary skill, and at a high level of generality, and in a manner that indicates that the additional elements are sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy the statutory disclosure requirements). Therefore, these additional elements amount to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept that amounts to significantly more. See MPEP 2106.05(f).
The Federal Circuit has recognized that "an invocation of already-available computers that are not themselves plausibly asserted to be an advance, for use in carrying out improved mathematical calculations, amounts to a recitation of what is 'well-understood, routine, [and] conventional.'" SAP Am., Inc. v. InvestPic, LLC, 890 F.3d 1016, 1023 (Fed. Cir. 2018) (alteration in original) (citing Mayo v. Prometheus, 566 U.S. 66, 73 (2012)). Apart from the instructions to implement the abstract idea, they only serve to perform well-understood functions (e.g., receiving, translating, and displaying data—see Specification above as well as Alice Corp.; Intellectual Ventures I LLC v. Symantec Corp., 838 F.3d 1307 (Fed. Cir. 2016); and Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334 (Fed. Cir. 2015) covering the well-known nature of these computer functions).
Furthermore, as discussed above, the additional element of a “diagnosis support device” is recited at a high level of generality and was determined to generally link the abstract idea to a particular technological environment or field of use. This additional element has been re-evaluated under Step 2B and has also been found insufficient to provide significantly more. (See MPEP 2106.05(h), indicating that generally linking an abstract idea to a particular technological environment does not amount to significantly more.)
Also, as discussed above with respect to integration of the abstract idea into a practical application, the additional elements of (1) a trained model configured using a neural network and (2) executing the computer-aided diagnostic processing by inputting the medical image into the trained model were considered to be part of the abstract idea and mere instructions to “apply it,” respectively. These elements have been re-evaluated under the “significantly more” analysis and have also been found insufficient to provide significantly more. Regarding (1), the training of the model is considered part of the abstract idea and thus cannot provide significantly more. Regarding (2), the use of the trained model amounts to mere instructions to “apply it,” and MPEP 2106.05(f) indicates that merely adding the words “apply it” (or an equivalent) cannot provide significantly more. Accordingly, even in combination, these additional elements do not provide significantly more. As such, the claims are not patent eligible.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 9, and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Teeuwen et al. (USPPN 2021/0090719; hereinafter Teeuwen) in view of Bender et al. (USPPN 2016/0241766; hereinafter Bender), Mostofi (USPPN 2019/0341150; hereinafter Mostofi), and Takeo et al. (USPPN 2003/0081822; hereinafter Takeo).
As to claim 1, Teeuwen teaches An image diagnosis support device that is portable by a user (e.g., see Abstract, [0013]), the image diagnosis support device comprising:
a processor (e.g., see [0019]); and
a memory that stores a trained model configured using a neural network (e.g., see [0019]),
wherein the processor is configured to:
receive, from an external device, an image file including a medical image and accessory information, and store the image file in the memory (e.g., see Abstract, [0015], [0020], [0055], [0064], [0073] teaching image file transmission of medical images having various metadata, including personal information, to a remote device by a user for diagnostic purposes and further storing the images);
determine whether the accessory information includes personal information unnecessary for computer-aided diagnostic processing (e.g., see [0017], [0055], [0062], [0063] wherein the image is analyzed in accordance to a relevance index, identifying relevant portions);
in a case where it is determined that the accessory information includes the unnecessary personal information, instruct the external device to transmit a new image file in which the unnecessary personal information has been deleted from the accessory information (e.g., see [0017], [0055] wherein data, including personal information, irrelevant for the purposes of the file transfer are not sent in the image file to a requesting source and can be erased or masked, wherein deletion of data in the image is interpreted as reading upon the claimed “new image file” as supported by the originally filed specification, see [0095]);
receive the new image file transmitted from the external device and store the new image file in the memory (e.g., see Fig. 2, [0064] wherein the images are stored in an image storage);
execute the computer-aided diagnostic processing by inputting the medical image included in the new image file stored in the memory into the trained model (e.g., see [0013], [0069] teaching the use of a machine learning model to make the diagnosis); and
transmit, to the external device, an image including the detection result (e.g., see Fig. 1, [0073], [0076], [0080] wherein the image and/or report are transferred to a destination).
While Teeuwen teaches erasing content of the image, Teeuwen fails to teach discard the received image file stored in the memory. However, in the same field of endeavor of image transfer, Bender teaches discard the received image file stored in the memory (e.g., [0035], [0040], [0045] of Bender wherein the rules for identifying non-compliant images are sent to the device and non-compliant images are not saved (i.e., discarded)).
Accordingly, it would have been obvious to modify Teeuwen in view of Bender before the effective filing date of the claimed invention with a reasonable expectation of success. One would have been motivated to make the modification in order to prevent inappropriate or non-compliant content from leaving an origination point, thereby increasing data security (e.g., see [0003] of Bender).
While Teeuwen teaches the system can be location independent and on any mobile device having a memory and further teaches the use of machine learning for the diagnosis, Teeuwen fails to explicitly teach the device comprising a memory that stores a trained model configured using a neural network and a built-in battery.
However, in the same field of endeavor of diagnosing patients, Mostofi teaches a portable device comprising a memory that stores a trained model configured using a neural network and a built-in battery (e.g., see [0034], [0046] teaching the use of a locally stored lightweight deep learning model on a smartphone; while Mostofi fails to explicitly teach a built-in battery, Mostofi teaches a portable device, such as a smartphone, which would obviously, if not necessarily, include a built-in battery). Accordingly, it would have been obvious to modify Teeuwen-Bender in view of Mostofi before the effective filing date of the claimed invention with a reasonable expectation of success. One would have been motivated to make the modification in order to generate image diagnoses locally and without the need for cell service or a back-end server connection (e.g., see [0034] of Mostofi).
While Teeuwen and Mostofi teach diagnosing images using machine learning models, and additionally teach identifying regions of interest (e.g., see [0067]-[0068] of Teeuwen and [0039] of Mostofi), the above cited references fail to explicitly teach obtaining a detection result of a region including an abnormal shadow. However, in the same field of endeavor of diagnosing patients, Takeo teaches obtaining a detection result of a region including an abnormal shadow (e.g., see [0005]-[0006], [0034], [0123] teaching detecting an abnormal shadow and obtaining a result of the detection). Accordingly, it would have been obvious to modify Teeuwen-Bender-Mostofi in view of Takeo before the effective filing date of the claimed invention with a reasonable expectation of success. One would have been motivated to make the modification in order to properly detect abnormal shadows in images (e.g., see [0005] of Takeo).
As to claim 9, the claim is directed to the method implemented on the device of claim 1 and is similarly rejected.
As to claim 10, the claim is directed to the non-transitory computer readable medium implementing instructions on the device of claim 1 and is similarly rejected.
It is noted that any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. “The use of patents as references is not limited to what the patentees describe as their own inventions or to the problems with which they are concerned. They are part of the literature of the art, relevant for all they contain.” In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)). Further, a reference may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art, including nonpreferred embodiments. Merck & Co. v. Biocraft Laboratories, 874 F.2d 804, 10 USPQ2d 1843 (Fed. Cir.), cert. denied, 493 U.S. 975 (1989). See also Upsher-Smith Labs. v. Pamlab, LLC, 412 F.3d 1319, 1323, 75 USPQ2d 1213, 1215 (Fed. Cir. 2005); Celeritas Technologies Ltd. v. Rockwell International Corp., 150 F.3d 1354, 1361, 47 USPQ2d 1516, 1522-23 (Fed. Cir. 1998).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to STELLA HIGGS whose telephone number is (571) 270-5891. The examiner can normally be reached Monday-Friday, 9AM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Peter Choi can be reached at (469) 295-9171. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/STELLA HIGGS/Primary Examiner, Art Unit 3681