Prosecution Insights
Last updated: April 19, 2026
Application No. 18/709,535

PROCESSING IMAGE DATA FOR ASSESSING A CLINICAL QUESTION

Non-Final OA (§101, §102, §103)
Filed: May 13, 2024
Examiner: MA, MICHELLE HAU
Art Unit: 2617
Tech Center: 2600 — Communications
Assignee: Koninklijke Philips N.V.
OA Round: 1 (Non-Final)

Grant Probability: 81% (Favorable)
OA Rounds: 1-2
To Grant: 2y 7m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 81% (17 granted / 21 resolved), +19.0% vs TC avg (above average)
Interview Lift: +36.4% (resolved cases with vs. without an interview)
Avg Prosecution: 2y 7m (typical timeline); 35 applications currently pending
Total Applications: 56 across all art units (career history)
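As a sanity check, the headline examiner metrics above are internally consistent. A minimal sketch (the Tech Center baseline is back-calculated from the stated delta, and all names are illustrative, not from any real analytics API):

```python
# Derive the headline metrics from the raw counts shown above:
# 17 allowances out of 21 resolved cases.

granted = 17
resolved = 21

career_allow_rate = granted / resolved        # ~0.81 -> shown as "81%"
tc_average = career_allow_rate - 0.19         # implied TC baseline (assumption)

print(f"Career allow rate: {career_allow_rate:.0%}")
print(f"Delta vs. TC avg:  {career_allow_rate - tc_average:+.1%}")
```

Rounded to whole percent, 17/21 lands exactly on the displayed 81%.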

Statute-Specific Performance

§101: 3.0% (-37.0% vs TC avg)
§102: 6.4% (-33.6% vs TC avg)
§103: 84.2% (+44.2% vs TC avg)
§112: 5.5% (-34.5% vs TC avg)

Black line = Tech Center average estimate. Based on career data from 21 resolved cases.
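If each delta is read as a simple difference against the Tech Center baseline (an assumption; the dashboard does not say how the deltas are computed), the table above is internally consistent: every row implies the same reference value. A quick sketch:

```python
# Recover the implied Tech Center baseline from each statute row above
# (rate minus stated delta). Figures are taken from the table.

rows = {
    "§101": (3.0, -37.0),
    "§102": (6.4, -33.6),
    "§103": (84.2, +44.2),
    "§112": (5.5, -34.5),
}

for statute, (rate, delta) in rows.items():
    print(f"{statute}: implied TC average = {rate - delta:.1f}%")
```

Every row lands on ~40.0%, which suggests the black line in the chart marks one common reference value rather than per-statute averages.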

Office Action

§101 §102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Drawings

The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the “natural language processing subsystem” in claim 8 must be shown or the feature(s) canceled from the claim(s). No new matter should be entered.

Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Specification

Applicant is reminded of the proper language and format for an abstract of the disclosure. The abstract should be in narrative form and generally limited to a single paragraph on a separate sheet within the range of 50 to 150 words in length.
The abstract should describe the disclosure sufficiently to assist readers in deciding whether there is a need for consulting the full patent text for details. The language should be clear and concise and should not repeat information given in the title. It should avoid using phrases which can be implied, such as “The disclosure concerns,” “The disclosure defined by this invention,” “The disclosure describes,” etc. In addition, the form and legal phraseology often used in patent claims, such as “means” and “said,” should be avoided.

The disclosure is objected to because of the following informalities: On page 7 line 9, “based on the latest evidence based clinical guidelines” should be revised for clarity. On page 23 line 19, “method 700 of Fig. 8” should read “method 700 of Fig. 9”. Appropriate correction is required.

Claim Objections

Claims 4, 9-11, and 15-19 are objected to because of the following informalities:

Claim 4 recites the limitations “the codified clinical guidelines”, “the metrics”, “the criteria”, and “the guidance” in lines 6-9. There is insufficient antecedent basis for these limitations in the claim. Perhaps claim 4 should depend on claim 3, instead of claim 1.

Claim 9 recites the limitation “the codified clinical guideline” in lines 8-9. There is insufficient antecedent basis for this limitation in the claim.

In claim 10 lines 2-3, “the use case is treatability based on” should read “the use case treatability is based on” or “the use case is treatable based on”.

Claim 11 recites the limitation “the display data” in lines 7-8. There is insufficient antecedent basis for this limitation in the claim.

Claim 15 recites the limitation “the display” in line 6. There is insufficient antecedent basis for this limitation in the claim. Claims 16-18 are objected to because of their dependency on claim 15.

Claim 19 recites the limitation “the display” in line 2. There is insufficient antecedent basis for this limitation in the claim.
Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-6, 9-14, and 20-21 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The limitations, under their broadest reasonable interpretation, cover a mental process (a concept performed in the human mind, including an observation, evaluation, judgment, or opinion), as well as organizing human activity and mathematical concepts and calculations. The claim(s) recite(s) a system and method for processing medical image data regarding a clinical question. This judicial exception is not integrated into a practical application because the steps do not add meaningful limitations to be considered specifically applied to a particular technological problem to be solved. The claim(s) does/do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the steps of the claimed invention can be done mentally, and no additional features in the claims would preclude them from being performed as such, except for the generic computer elements recited at a high level of generality (i.e., processor, memory).

According to the USPTO guidelines, a claim is directed to non-statutory subject matter if:

STEP 1: the claim does not fall within one of the four statutory categories of invention (process, machine, manufacture or composition of matter), or

STEP 2: the claim recites a judicial exception, e.g. an abstract idea, without reciting additional elements that amount to significantly more than the judicial exception, as determined using the following analysis:

STEP 2A (PRONG 1): Does the claim recite an abstract idea, law of nature, or natural phenomenon?
STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application?

STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?

Using the two-step inquiry, it is clear that claims 1 and 20 are directed to an abstract idea, as shown below:

STEP 1: Do the claims fall within one of the statutory categories? YES; claim(s) 1 and 20 is/are directed to a system and a method, respectively.

STEP 2A (PRONG 1): Is the claim directed to a law of nature, a natural phenomenon or an abstract idea? YES; the claims are directed toward a mental process (i.e. an abstract idea).

With regard to STEP 2A (PRONG 1), the guidelines provide three groupings of subject matter that are considered abstract ideas: mathematical concepts – mathematical relationships, mathematical formulas or equations, mathematical calculations; certain methods of organizing human activity – fundamental economic principles or practices (including hedging, insurance, mitigating risk), commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations), and managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions); and mental processes – concepts that are practicably performed in the human mind (including an observation, evaluation, judgment, opinion). The system in claim 1 (and the method in claim 20) comprise a mental process that can be practicably performed in the human mind (or by generic computers or components configured to perform the method) and is, therefore, an abstract idea.
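The step structure laid out above can be written down as a plain decision function. This is only a sketch of the order of the inquiry: the boolean inputs stand in for the examiner's legal findings, and the function name is mine, not USPTO terminology.

```python
# Sketch of the eligibility inquiry as ordered above: Step 1, then
# Step 2A (Prongs 1 and 2), then Step 2B. Only the control flow comes
# from the text; the booleans stand in for the examiner's findings.

def eligible_under_101(statutory_category: bool,
                       recites_judicial_exception: bool,
                       practical_application: bool,
                       significantly_more: bool) -> bool:
    if not statutory_category:              # STEP 1
        return False
    if not recites_judicial_exception:      # STEP 2A, PRONG 1
        return True
    if practical_application:               # STEP 2A, PRONG 2
        return True
    return significantly_more               # STEP 2B

# The findings this Office action applies to claims 1 and 20:
print(eligible_under_101(True, True, False, False))  # False
```

Note the short-circuit structure: a claim that recites no judicial exception is eligible at Prong 1, and Step 2B is reached only when Prong 2 fails.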
Regarding Claim(s) 1 and 20: the method recites the steps (functions) of:

- to determine a use case regarding the clinical question for a patient based on a user input (a mental process including observation and evaluation, which can be done mentally in the human mind);
- to retrieve the patient data of the patient regarding the use case (a mental process including observation and evaluation, which can be done mentally in the human mind);
- to determine an anatomical object based on the use case and the patient data (a mental process including observation and evaluation, which can be done mentally in the human mind);
- to select an algorithm from a repository based on the determined anatomical object (a mental process including observation and evaluation, which can be done mentally in the human mind); and
- to execute the algorithm on the image data to provide the output data for assessing the clinical question (a mental process including observation and evaluation, which can be done mentally in the human mind).

These limitations, as drafted, constitute a simple process that, under the broadest reasonable interpretation, covers performance of the limitations in the mind or by a human. The Examiner notes that under MPEP 2106.04(a)(2)(III), the courts consider a mental process (thinking) that “can be performed in the human mind, or by a human using a pen and paper” to be an abstract idea. CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1372, 99 USPQ2d 1690, 1695 (Fed. Cir. 2011). As the Federal Circuit explained, “methods which can be performed mentally, or which are the equivalent of human mental work, are unpatentable abstract ideas, the ‘basic tools of scientific and technological work’ that are open to all.” 654 F.3d at 1371, 99 USPQ2d at 1694 (citing Gottschalk v. Benson, 409 U.S. 63, 175 USPQ 673 (1972)). See also Mayo Collaborative Servs. v. Prometheus Labs. Inc., 566 U.S. 66, 71, 101 USPQ2d 1961, 1965 (“‘[M]ental processes[] and abstract intellectual concepts are not patentable, as they are the basic tools of scientific and technological work’” (quoting Benson, 409 U.S. at 67, 175 USPQ at 675)); Parker v. Flook, 437 U.S. 584, 589, 198 USPQ 193, 197 (1978) (same).

STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application? NO; the claims do not recite additional elements that integrate the judicial exception into a practical application.

With regard to STEP 2A (Prong 2), whether the claim recites additional elements that integrate the judicial exception into a practical application, the guidelines provide the following exemplary considerations that are indicative that an additional element (or combination of elements) may have integrated the judicial exception into a practical application:

- an additional element reflects an improvement in the functioning of a computer, or an improvement to other technology or technical field;
- an additional element applies or uses a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition;
- an additional element implements a judicial exception with, or uses a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim;
- an additional element effects a transformation or reduction of a particular article to a different state or thing; and
- an additional element applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.
While the guidelines further state that the exemplary considerations are not an exhaustive list and that there may be other examples of integrating the exception into a practical application, the guidelines also list examples in which a judicial exception has not been integrated into a practical application:

- an additional element merely recites the words “apply it” (or an equivalent) with the judicial exception, merely includes instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea;
- an additional element adds insignificant extra-solution activity to the judicial exception; and
- an additional element does no more than generally link the use of a judicial exception to a particular technological environment or field of use.

Claim(s) 1 and 20 does/do not recite any of the exemplary considerations that are indicative of an abstract idea having been integrated into a practical application. Claim(s) 1 and 20 recite(s) the further limitations of: a processor subsystem (generic computer(s) or component(s) configured to perform the method); computer-implemented (generic computer(s) or component(s) configured to perform the method); a repository (generic component); and retrieving the image data of the patient (insignificant pre/post-solution extra activity of gathering data). These limitations are recited at a high level of generality (i.e., as a general action or change being taken based on the results of the acquiring step) and amount to mere pre/post-solution actions, which are a form of insignificant extra-solution activity. Further, the additional elements are claimed generically and operate in their ordinary capacity, such that they do not use the judicial exception in a manner that imposes a meaningful limit on the judicial exception.
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.

STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? NO; the claims do not recite additional elements that amount to significantly more than the judicial exception.

With regard to STEP 2B, whether the claims recite additional elements that provide significantly more than the recited judicial exception, the guidelines specify that the pre-guideline procedure is still in effect. Specifically, examiners should continue to consider whether an additional element or combination of elements:

- adds a specific limitation or combination of limitations that are not well-understood, routine, conventional activity in the field, which is indicative that an inventive concept may be present; or
- simply appends well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, which is indicative that an inventive concept may not be present.

Claim(s) 1 and 20 does/do not recite any additional elements that are not well-understood, routine or conventional. The use of a computer to “determine”, “retrieve”, “select”, and “execute”, as claimed in Claim(s) 1 and 20, is a routine, well-understood and conventional process performed by computers. Thus, since Claim(s) 1 and 20: (a) is/are directed toward an abstract idea, (b) do not recite additional elements that integrate the judicial exception into a practical application, and (c) do not recite additional elements that amount to significantly more than the judicial exception, it is clear that Claim(s) 1 and 20 is/are not eligible subject matter under 35 U.S.C. 101.
Regarding claims 2-6, 9-14, and 21: the additional limitations do not integrate the mental process into a practical application or add significantly more to the mental process. The limitation(s) recite either:

(1) a mental process including observation and evaluation, which can be done mentally in the human mind (i.e., claims 2, 3, 4, 5, 6, 9, 10, 12, 13, and 14 recite the mental processes: determine said anatomical object, and to select said algorithm, also based on the codified clinical guidelines; wherein the codified clinical guidelines for a use case define at least one of…; an algorithm matching lookup table for selecting the algorithm from the repository; select the algorithm from the repository containing analytic algorithms; select the algorithm from the repository containing visualization algorithms for visualization of the output data; detect an anomaly in the medical image data of an area of the body of the patient and select said algorithm from the repository based on the detected anomaly and/or affected anatomical object or affected structure; wherein the use case is treatability based on the relation of a lesion with surrounding anatomical structures or resectability of tumors in the vicinity of blood vessels; select positions on the anatomically relevant path and extract 3D coordinates of the positions; select a 2D algorithm from the repository for processing medical image data; and wherein the anatomically relevant path via the 3D object is determined along an interface of the 3D object and a further anatomical structure, respectively);

(2) mathematical concepts – mathematical relationships, mathematical formulas or equations, mathematical calculations (i.e., claims 11, 12, and 13 recite the mathematical concepts: execute the slicing algorithm on the image data to provide, as the display data, slices of the 3D image data that are perpendicular to the anatomically relevant path for assessing the clinical question; computing a slice passing through the first position at an angle perpendicular to the vector; and execute the 2D algorithm on a slice to calculate clinically relevant information, respectively);

(3) insignificant pre/post-solution extra activity of generating data (i.e., claim 2 recites the activity of gathering: retrieve codified clinical guidelines regarding the use case); and/or

(4) generic computers or components configured to perform the method (i.e., claim 21 recites the generic computer(s) or component(s) of: a non-transitory computer readable storage medium).

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 5-6, 8, and 20-21 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Weese et al. (WO 2009060355 A1), hereinafter Weese.

Regarding claim 1, Weese teaches a system for processing medical image data regarding a clinical question (Page 5 lines 26-29 – “a system 10 for facilitating diagnosis is provided. The system 10 may comprise a medical database 11 comprising pictorial and/or textual information regarding pathological or healthy states, such as sample images or guidance texts for facilitating diagnosis of a disease or disorder of a patient”; Note: diagnosis is a type of clinical question), comprising: a processor subsystem (Page 5 line 30 – “the system comprises a first processor unit 12 configured to…”) configured: to determine a use case regarding the clinical question for a patient based on a user input (Page 7 lines 23-30 – “The system further comprises a query unit 14 configured to prepare or retrieve a search query, such as a search mask, that may be used to search the medical database. In Fig. 3e a search query is defined. The query unit may be a part of graphical user interface allowing the user to indicate or refine the search space, such as for example, search diagnosis exhibiting hypointense signals in the left cerebral hemisphere. The query may e.g. from the user's point of view, look like "search cases that contain hypointense to white matter in the left cerebral hemisphere". In an embodiment the search query may comprise textual input from the user”; Note: the user’s input and search query are the use case. The user is the patient); to retrieve patient data of the patient regarding the use case (Page 8 lines 4-5 – “The client workstation may also be configured to access a patient image e.g. from a scanner or via Picture archiving and communication system (PACS).”; Note: patient images are a type of patient data); to determine an anatomical object based on the use case and the patient data (Page 10 lines 18-20 – “the graphical user interface is configured to enable the user to indicate which anatomy in the acquired patient image that is of interest, e.g. "left cerebral hemisphere", "midbrain", etc. versus "cerebral hemisphere" only”; Note: the anatomical structure is determined based on patient image data and user input/selection, which is part of the use case); to select an algorithm from a repository based on the determined anatomical object (Page 8 lines 17-25 and 29-31 – “The system may further comprise a second processor unit 15 configured to retrieve (15a) a set of selected image-processing algorithms that are associated with the found set of diagnoses and sample cases, and optionally located on a memory 16 or a selection unit 20. A selected image-processing algorithm may be an algorithm that may be customized from a common IP algorithm and may be adapted to the modality, to the organ and so on. By customized is here meant that the algorithm may be modified particularly for being used on a certain sample image, and the acquired image resulting in parameters that may be used for facilitating diagnosis, or for discrimination between different diagnoses…The selected image-processing algorithms associated with the set of diagnoses and sample cases, which are located in the medical encyclopedia, may be stored either in the medical encyclopedia or in a memory connected to the system”; Note: an algorithm is selected from a memory that stores algorithms, which is equivalent to the repository); to retrieve image data of the patient (Page 8 lines 4-5 – “The client workstation may also be configured to access a patient image e.g. from a scanner or via Picture archiving and communication system (PACS)”); and to execute the algorithm on the image data to provide output data for assessing the clinical question (Page 9 lines 6-9 and 19-21, Page 11 lines 1-2 – “the second processor 15 may further be configured to process (15b) the region of interest in the image using the retrieved selected image-processing algorithms to determine at least one diagnosis of the image…the selected image-processing algorithms is configured to calculate e.g. characteristics of lesion properties such as image appearance, or border characteristics in the image to facilitate subsequent diagnosis of the image…the graphical user interface may be visualized on a display for presenting processed information, such as determined diagnoses, etc. to a user”; Note: the selected algorithm is used on the image to output a diagnosis).

Regarding claim 5, Weese teaches the system according to claim 1. Weese further teaches wherein the processor subsystem is configured to select the algorithm from the repository containing analytic algorithms for calculating metrics, detection, characterization or segmentation of an anomaly, a tumor and/or an anatomical structure (Page 7 lines 15-17, Page 9 lines 19-21 – “to detect anatomies in the image, a mesh based image segmentation method may be used. Each organ may be modeled by a mesh model depicting the surface model of the organ…the selected image-processing algorithms is configured to calculate e.g. characteristics of lesion properties such as image appearance, or border characteristics in the image to facilitate subsequent diagnosis of the image”; Note: selected algorithms can include analytic algorithms that segment organs or calculate metrics for anomalies like lesions).

Regarding claim 6, Weese teaches the system according to claim 1. Weese further teaches wherein the processor subsystem is configured to select the algorithm from the repository containing visualization algorithms for visualization of the output data (Page 7 lines 10-11 and 15-20, Page 17 lines 8-9 – “The first, second or third image-processing algorithm may be model-based image segmentation techniques…to detect anatomies in the image, a mesh based image segmentation method may be used. Each organ may be modeled by a mesh model depicting the surface model of the organ. In case of brain, such mesh model may contain parts like left/right cerebral hemispheres, cerebellum and so on. The mesh model may then be adapted to the volumetric image, after which the anatomical regions in the image are identified…a display for displaying the sample image, acquired image, or results from image processing”; Note: a mesh/model-based algorithm can be selected, which is a type of visualization algorithm since it helps show the anatomical structure and is later displayed to the user).

Regarding claim 8, Weese teaches the system according to claim 1. Weese further teaches wherein the processor subsystem comprises a natural language processing subsystem for determining the anatomical object and/or metrics related to the object based on the use case and the patient data (Page 7 lines 23-27 and 30, Page 8 lines 8-11 – “The system further comprises a query unit 14 configured to prepare or retrieve a search query… The query unit may be a part of graphical user interface allowing the user to indicate or refine the search space, such as for example, search diagnosis exhibiting hypointense signals in the left cerebral hemisphere… the search query may comprise textual input from the user…the search query may be encoded in standard Unified Medical Language System (UMLS), Foundational Model of Anatomy (FMA), or International Classification of Diseases (ICD-10) terms and may be used in search for relevant cases in the medical encyclopedia”; Note: the query unit is a natural language processing subsystem, as it processes text in the natural language into a medical language format. The query unit is used to determine the anatomical object, which in the example is the left cerebral hemisphere), wherein the natural language processing subsystem is configured to process at least one of: clinical guidelines regarding the use case; documents regarding the use case; or documents regarding the clinical question; for identifying anatomical objects or structures, metrics, criteria or guidance relevant for the clinical question (Page 7 lines 23-24 and 30, Page 8 lines 14-15 – “The system further comprises a query unit 14 configured to prepare or retrieve a search query… the search query may comprise textual input from the user…the query unit 14 may be configured to process the search query in the medical encyclopedia, resulting in a set of possible diagnoses and sample cases”; Note: the search query is a document regarding the use case, as it contains text, and it is used to identify diagnoses or sample cases, which are guidance for the clinical question).

Regarding claim 20, Weese teaches a computer-implemented method for processing medical image data regarding a clinical question (Page 12 lines 29-30, Page 13 lines 8-9 – “a computer-readable medium having embodied thereon a computer program for processing by a computer is provided…method or computer-readable medium is provided for facilitating diagnosis of a region of interest in an image dataset”), comprising: determining a use case regarding the clinical question for a patient based on a user input (Page 7 lines 23-30 – “The system further comprises a query unit 14 configured to prepare or retrieve a search query, such as a search mask, that may be used to search the medical database. In Fig. 3e a search query is defined. The query unit may be a part of graphical user interface allowing the user to indicate or refine the search space, such as for example, search diagnosis exhibiting hypointense signals in the left cerebral hemisphere. The query may e.g. from the user's point of view, look like "search cases that contain hypointense to white matter in the left cerebral hemisphere". In an embodiment the search query may comprise textual input from the user”; Note: the user’s input and search query are the use case. The user is the patient); retrieving patient data of the patient regarding the use case (Page 8 lines 4-5 – “The client workstation may also be configured to access a patient image e.g. from a scanner or via Picture archiving and communication system (PACS).”; Note: patient images are a type of patient data); determining an anatomical object based on the use case and the patient data (Page 10 lines 18-20 – “the graphical user interface is configured to enable the user to indicate which anatomy in the acquired patient image that is of interest, e.g. "left cerebral hemisphere", "midbrain", etc. versus "cerebral hemisphere" only”; Note: the anatomical structure is determined based on patient image data and user input/selection, which is part of the use case); selecting an algorithm from a repository based on the determined anatomical object (Page 8 lines 17-25 and 29-31 – “The system may further comprise a second processor unit 15 configured to retrieve (15a) a set of selected image-processing algorithms that are associated with the found set of diagnoses and sample cases, and optionally located on a memory 16 or a selection unit 20. A selected image-processing algorithm may be an algorithm that may be customized from a common IP algorithm and may be adapted to the modality, to the organ and so on. By customized is here meant that the algorithm may be modified particularly for being used on a certain sample image, and the acquired image resulting in parameters that may be used for facilitating diagnosis, or for discrimination between different diagnoses…The selected image-processing algorithms associated with the set of diagnoses and sample cases, which are located in the medical encyclopedia, may be stored either in the medical encyclopedia or in a memory connected to the system”; Note: an algorithm is selected from a memory that stores algorithms, which is equivalent to the repository); retrieving image data of the patient (Page 8 lines 4-5 – “The client workstation may also be configured to access a patient image e.g. from a scanner or via Picture archiving and communication system (PACS)”); executing the algorithm on the image data to provide output data for assessing the clinical question (Page 9 lines 6-9 and 19-21, Page 11 lines 1-2 – “the second processor 15 may further be configured to process (15b) the region of interest in the image using the retrieved selected image-processing algorithms to determine at least one diagnosis of the image…the selected image-processing algorithms is configured to calculate e.g. characteristics of lesion properties such as image appearance, or border characteristics in the image to facilitate subsequent diagnosis of the image…the graphical user interface may be visualized on a display for presenting processed information, such as determined diagnoses, etc. to a user”; Note: the selected algorithm is used on the image to output a diagnosis).

Regarding claim 21, Weese teaches the method according to claim 20.
Weese further teaches a non-transitory computer-readable medium comprising instructions for causing a processor system to perform the method (Page 3 lines 6-7, Page 13 lines 16-17 and 24-26 – “a computer-readable medium having embodied thereon a computer program for processing by a processor is provided…The processor unit may be any unit normally used for performing the involved tasks, e.g. a hardware, such as a processor with a memory…The memory may also be a FLASH memory such as a USB, Compact Flash, SmartMedia, MMC memory, MemoryStick, SD Card, MiniSD, MicroSD, xD Card, TransFlash, and MicroDrive memory etc.”; Note: flash memory is a type of non-transitory computer-readable medium). Claim Rejections - 35 USC § 103 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 2-3 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Weese in view of Samset et al. (US 20190392944 A1), hereinafter Samset.
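As an editorial aside, the claim 20 method steps mapped to Weese above (use case from a textual search query, patient data, anatomical object, algorithm selection, execution on the image data) can be sketched as a pipeline. The sketch below is purely illustrative and not part of the prosecution record; every function name, the toy repository, and the stub algorithm are hypothetical and are not drawn from the application or the cited art.

```python
# Purely illustrative sketch of the claim 20 method steps as mapped above.
# All names, the toy repository, and the stub algorithm are hypothetical.

def determine_use_case(search_query: str) -> str:
    """Step: the user's textual search query defines the use case."""
    return search_query.lower()

def determine_anatomical_object(use_case: str, patient_data: dict) -> str:
    """Step: pick the anatomy of interest from the use case and patient data."""
    for anatomy in patient_data.get("segmented_anatomies", []):
        if anatomy in use_case:
            return anatomy
    return "unknown"

# Hypothetical repository keyed by anatomical object.
ALGORITHM_REPOSITORY = {
    "left cerebral hemisphere": lambda image: {"lesion_count": sum(v > 0.5 for v in image)},
}

def assess_clinical_question(search_query: str, patient_data: dict, image_data: list) -> dict:
    use_case = determine_use_case(search_query)
    anatomy = determine_anatomical_object(use_case, patient_data)
    algorithm = ALGORITHM_REPOSITORY[anatomy]   # select algorithm from the repository
    return algorithm(image_data)                # execute on the retrieved image data

output = assess_clinical_question(
    "search cases that contain hypointense to white matter in the left cerebral hemisphere",
    {"segmented_anatomies": ["left cerebral hemisphere", "midbrain"]},
    [0.2, 0.7, 0.9],  # toy stand-in for retrieved image data
)
```

The point of the sketch is only the ordering of steps the rejection relies on; Weese's actual system operates on volumetric images and clinical databases, not on toy lists.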
Regarding claim 2, Weese teaches the system according to claim 1. Weese further teaches wherein the processor subsystem is configured: to determine said anatomical object (Page 6 lines 2-4 – “In Fig. 3d the first processor unit has identified the left and right cerebral hemispheres as an anatomical structure. In Fig. 3e the first processor unit has identified the left cerebral hemisphere as an anatomical structure to be queried”; Note: an anatomical object is determined. It is implied that the anatomical object is determined based on medical information because it could not be identified without knowledge of what the anatomical object looks like), and to select said algorithm, also based on medical information (Page 8 lines 17-25 – “The system may further comprise a second processor unit 15 configured to retrieve (15a) a set of selected image-processing algorithms that are associated with the found set of diagnoses and sample cases, and optionally located on a memory 16 or a selection unit 20. A selected image-processing algorithm may be an algorithm that may be customized from a common IP algorithm and may be adapted to the modality, to the organ and so on. By customized is here meant that the algorithm may be modified particularly for being used on a certain sample image, and the acquired image resulting in parameters that may be used for facilitating diagnosis, or for discrimination between different diagnoses”; Note: an algorithm is selected based on medical information, including a set of diagnoses and sample cases). Weese does not teach retrieving codified clinical guidelines regarding the use case; nor the “codified clinical guidelines” in the limitation: “to determine said anatomical object, and to select said algorithm, also based on the codified clinical guidelines”. 
However, Samset teaches retrieving codified clinical guidelines regarding the use case (Paragraph 0035 – “the clinical workflow may inform virtual diagnostic assistant 139 of the potential clinical findings to generate, and virtual diagnostic assistant 139 may identify clinical findings by comparing the clinical parameters generated by virtual parameter assistant 138 to various guidelines, which may include normal ranges of the clinical parameters obtained from published guidelines, research studies, etc. In some examples, the normal ranges for the clinical parameters may be adjusted based on patient information (e.g., patient gender, patient age). In other examples, virtual diagnostic assistant 139 may be trained to generate only specific findings for specific patients (e.g., virtual diagnostic assistant 139 may be trained to only generate clinical findings for echocardiograms of adult men)”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Samset to retrieve clinical guidelines for the use case because it can assist in quickly determining a diagnosis. “By starting with a set of automatically-generated clinical findings and working backwards to the medical images, the process of reviewing a diagnostic imaging exam may be expedited and aspects of the exam prone to error or inconsistencies (e.g., human-to-human variability in taking measurements of features of the images) may be performed in a uniform manner” (Samset: Paragraph 0004). Additionally, a person of ordinary skill in the art before the effective filing date of the claimed invention would have recognized that the medical information of Weese could have been substituted for the clinical guidelines of Samset because both the medical information and clinical guidelines serve the purpose of informing and contributing to medical decisions. 
Furthermore, a person of ordinary skill in the art would have been able to carry out the substitution. Finally, the substitution achieves the predictable result of determining an anatomical object and algorithm based on the clinical guidelines. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the medical information of Weese for the clinical guidelines of Samset according to known methods to yield the predictable result of determining an anatomical object and algorithm based on the clinical guidelines. Regarding claim 3, Weese in view of Samset teaches the system according to claim 2. Weese does not teach wherein the codified clinical guidelines for a use case define at least one of: anatomical object; metrics regarding relationships between anatomical objects; criteria regarding boundary values of the metrics; or guidance regarding clinical recommendations based on evaluation of the criteria. 
However, Samset teaches wherein the codified clinical guidelines for a use case define at least one of: anatomical object; metrics regarding relationships between anatomical objects; criteria regarding boundary values of the metrics; or guidance regarding clinical recommendations based on evaluation of the criteria (Paragraph 0035-0036 – “the clinical workflow may inform virtual diagnostic assistant 139 of the potential clinical findings to generate, and virtual diagnostic assistant 139 may identify clinical findings by comparing the clinical parameters generated by virtual parameter assistant 138 to various guidelines, which may include normal ranges of the clinical parameters obtained from published guidelines, research studies, etc… For example, referring to the left ventricle diastolic diameter, virtual diagnostic assistant 139 may compare the measurement of the left ventricle diameter during diastole generated by virtual parameter assistant 138 (e.g., 6.3 cm) to a normal range of left ventricle diastolic diameters for men (e.g., 4.2-5.9 cm) and generate a clinical finding that the left ventricle diastolic diameter is larger than normal”; Note: the clinical guidelines define an anatomical object, which in this case is the ventricle). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Samset to have the clinical guidelines define an anatomical object because it can be used to identify abnormalities in the anatomical object. “By starting with a set of automatically-generated clinical findings and working backwards to the medical images, the process of reviewing a diagnostic imaging exam may be expedited and aspects of the exam prone to error or inconsistencies (e.g., human-to-human variability in taking measurements of features of the images) may be performed in a uniform manner” (Samset: Paragraph 0004). 
Regarding claim 9, Weese teaches the system according to claim 1. Weese further teaches wherein the processor subsystem is configured to determine the use case; based on detecting an anomaly in the medical image data of an area of the body of the patient; and/or based on detecting an anatomical object or structure affected by an anomaly in the medical image data of an area of the body of the patient (Page 5 lines 30-34, Page 6 lines 8-10 – “the system comprises a first processor unit 12 configured to retrieve 12a a user defined region of interest (ROI) on an image of a patient to be analyzed, as is indicated in Fig. 3b. The first processor unit 12 may also be configured to identify 12b a lesion area in the region of interest using a first image-processing algorithm… Knowing the anatomical structure and location will certainly refine the search space of DDx. In this way, the user may indicate the exact anatomy of interest, e.g. "find diagnosis that exhibit lesions in the midbrain."”; Note: a lesion, which is an anomaly, is detected in the medical patient image data. Detecting the lesion helps with determining the use case, as it refines the search space); for enabling selecting said algorithm from the repository based on the detected anomaly and/or affected anatomical object or affected structure (Page 3 lines 1-5, Page 5 lines 32-33, Page 6 lines 8-10, Page 9 lines 17-19 – “The method comprises selecting (51) an acquired image, identifying (52) an anatomical region in the acquired image, selecting (53) a record in a database corresponding to the anatomical region in the acquired image, executing (54) at least one image-processing algorithm associated with the record on the acquired image… The first processor unit 12 may also be configured to identify 12b a lesion area in the region of interest using a first image-processing algorithm…Knowing the anatomical structure and location will certainly refine the search space of DDx.
In this way, the user may indicate the exact anatomy of interest… The system may further comprise a second processor unit 15 configured to retrieve (15a) a set of selected image-processing algorithms that are associated with the found set of diagnoses and sample cases”; Note: Detecting the lesion allows for finding medical records related to the lesion, and the medical records are associated with algorithms that can be selected and used on the image). Weese does not teach retrieving the codified clinical guidelines. However, Samset teaches retrieving the codified clinical guidelines (Paragraph 0035 – “the clinical workflow may inform virtual diagnostic assistant 139 of the potential clinical findings to generate, and virtual diagnostic assistant 139 may identify clinical findings by comparing the clinical parameters generated by virtual parameter assistant 138 to various guidelines, which may include normal ranges of the clinical parameters obtained from published guidelines, research studies, etc. In some examples, the normal ranges for the clinical parameters may be adjusted based on patient information (e.g., patient gender, patient age). In other examples, virtual diagnostic assistant 139 may be trained to generate only specific findings for specific patients (e.g., virtual diagnostic assistant 139 may be trained to only generate clinical findings for echocardiograms of adult men)”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Samset to retrieve clinical guidelines for the use case because it can assist in quickly determining a diagnosis.
“By starting with a set of automatically-generated clinical findings and working backwards to the medical images, the process of reviewing a diagnostic imaging exam may be expedited and aspects of the exam prone to error or inconsistencies (e.g., human-to-human variability in taking measurements of features of the images) may be performed in a uniform manner” (Samset: Paragraph 0004). Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Weese. Regarding claim 4, Weese teaches the system according to claim 1. Weese further teaches wherein the processor subsystem comprises an algorithm matching lookup table for selecting the algorithm from the repository based on at least one of: the determined anatomical object; the use case; the codified clinical guidelines; the metrics; the criteria; or the guidance (Page 3 lines 1-5, page 8 lines 29-31 – “The method comprises selecting (51) an acquired image, identifying (52) an anatomical region in the acquired image, selecting (53) a record in a database corresponding to the anatomical region in the acquired image, executing (54) at least one image-processing algorithm associated with the record on the acquired image…The selected image-processing algorithms associated with the set of diagnoses and sample cases, which are located in the medical encyclopedia, may be stored either in the medical encyclopedia or in a memory connected to the system”; Note: while it is not explicitly stated that there is a lookup table, it is implied. There is a database comprising medical records, where the medical records are associated with respective algorithms. In order for the selection unit to pick an algorithm corresponding to a medical record, it is obvious there would be a table to represent the correspondence between the medical record and algorithms). Claims 7, 15-16, and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Weese in view of Shukla (US 20110161854 A1), hereinafter Shukla. 
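The correspondence the claim 4 rejection infers from Weese — database records associated with image-processing algorithms, resolved by the selection unit — behaves like an ordinary keyed lookup table. Below is a minimal, purely illustrative sketch; the keys and algorithm identifiers are hypothetical and are not taken from the record or the cited art.

```python
# Minimal sketch of an algorithm-matching lookup table, as inferred for claim 4.
# Keys and algorithm identifiers are hypothetical illustrations only.

ALGORITHM_LOOKUP = {
    # (anatomical object, use case) -> algorithm identifier
    ("brain", "lesion characterization"): "hemisphere_segmentation",
    ("brain", "atrophy assessment"): "volumetry",
    ("liver", "lesion characterization"): "mesh_segmentation",
}

def select_algorithm(anatomical_object: str, use_case: str) -> str:
    """Resolve the table; fail loudly when no correspondence is registered."""
    try:
        return ALGORITHM_LOOKUP[(anatomical_object, use_case)]
    except KeyError:
        raise LookupError(
            f"no algorithm registered for {anatomical_object!r} / {use_case!r}"
        ) from None

selected = select_algorithm("brain", "atrophy assessment")
```

Any structure supporting this key-to-value resolution would serve the examiner's reasoning; the "table" need not be a literal database table.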
Regarding claim 7, Weese teaches the system according to claim 5. Weese does not teach wherein the processor subsystem comprises a layout rendering and interaction manager for selecting the visualization algorithm matching the output data as provided by the analytic algorithm for assessing the clinical question. However, Shukla teaches a layout rendering and interaction manager for selecting the visualization algorithm matching the output data as provided by the analytic algorithm for assessing the clinical question (Paragraph 0027, 0043, 0046 – “By selecting a view and/or indicator of data within a view, a detailed view is generated to convey information regarding diagnosis, treatment, medication, physician specialist, time frame, etc…Each presentation pane can have its different context sensitive graphical user interface ("GUI") controls…Graphical and/or overlay elements of each of presentation panes can be clickable and/or otherwise selectable resulting in a certain action happening upon clicking or selecting an element, thus being a special sort of interactive controls…A user can select a tab 120 for review and access to included information via the manager interface 100.”; Note: the manager interface is equivalent to the layout rendering and interaction manager. A user can select a view/visualization, which will then render/display the corresponding layout of clinical information. The output data provided by the analytic algorithm was previously taught by Weese in the rejection of claim 5). Regarding claim 15, Weese teaches the system according to claim 1. Weese further teaches wherein the selected algorithm is an analytical algorithm arranged to analyse the image data for generating a 3D anatomical model, anatomical annotations or metrics regarding the anatomical object (Page 7 lines 15-20 – “to detect anatomies in the image, a mesh based image segmentation method may be used. Each organ may be modeled by a mesh model depicting the surface model of the organ. 
In case of brain, such mesh model may contain parts like left/right cerebral hemispheres, cerebellum and so on. The mesh model may then be adapted to the volumetric image, after which the anatomical regions in the image are identified”; Note: a mesh model of the organ, which is equivalent to a 3D anatomical model, is generated using a mesh-based image segmentation algorithm; the algorithm is analytical). Weese does not teach that the display comprises a 2D view for displaying a guidance panel showing the anatomical annotations, metrics, codified clinical guidelines and/or a slice of the medical image data, and a 3D view for displaying the 3D anatomical model; the processor subsystem is configured to synchronize the 3D view upon user input regarding the 2D view; or to synchronize the 2D view upon user input regarding the 3D view. However, Shukla teaches a display comprising a 2D view for displaying a guidance panel showing the anatomical annotations, metrics, codified clinical guidelines and/or a slice of the medical image data, and a 3D view for displaying the 3D anatomical model (Fig. 1, Paragraph 0047-0048, 0051 – “Views can provide 2D and/or 3D anatomical views based on actual image(s) and/or idealized anatomical representations, for example. Within each view 131-134 (and/or a composite view), one or more indicators 135-139 can be shown indicating medical data associated with the patient. For example, the anatomy representation 131-134 can include a graphical indication of findings and/or other events/conditions for the patient, areas of image data for the patient, and/or other information… The manager 100 also includes one or more additional expandable windows including, for example, one or more of allergies 160, lab results 161, radiology 162, demographics 163, medications 164, problems 165, orders 166, alert review 167, etc.”; Note: Fig. 
1 shows a 3D view of a 3D anatomical model of the patient on the left side and a 2D view of textual information guiding the patient on their metrics on the right side; see modified screenshot of Fig. 1 below); the processor subsystem is configured to synchronize the 3D view upon user input regarding the 2D view; or to synchronize the 2D view upon user input regarding the 3D view (Paragraph 0042, 0054 – “Content of the presentation panes can be synchronized between any two or more panes as part of a customization pattern, and/or by explicit choice of an operator, for example. For purposes of example only, selection of an anatomical region (e.g., an abdominal region) on anatomical presentation pane automatically reduces a list of historical exams to only those prior exams targeted to the selected anatomical part… the representation 131-134 can be a 3D representation of a body, certain body part”; Note: the 2D view of the historical exams are synchronized to the selection of an anatomical region on the 3D anatomical structure of the patient. The historical exams are implied to be part of the 2D view, as they are textual information). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Shukla to have a 2D view, in order “to help various end users of the system to explain diagnosis and/or treatment (e.g., between a patient who is visiting a second time and a nurse, between a physician/surgeon and a patient, etc.), explore a solution for a medical procedure (e.g., between two surgeons deciding on a medical procedure), maintain a visual history of a patient's health (e.g., hospital systems), etc.” (Shukla: Paragraph 0028). In other words, a 2D view provides text information to help users understand a medical condition. 
It also would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Shukla to have a 3D view because “The anatomical system view(s) can be used as a solution for various end users, such as between a patient and a triage nurse to understand the patient's present health condition, past health history, and pending a health condition which was diagnosed but not treated; for a surgeon to explore multiple medical procedures on a patient 3D visualization; between a surgeon and a patient to explain the related medical procedure which will be conducted on patient; etc. The view(s) can form a visual part of an enterprise solution system, for example” (Shukla: Paragraph 0024). In other words, a 3D view provides a clear visualization of a patient’s health. Finally, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Shukla to synchronize the 2D and 3D view for the benefit of being able to use the views together, so that they can supplement each other. This would enhance the user experience and increase their understanding of the condition. [Screenshot of Fig. 1 (taken from Shukla)] Regarding claim 16, Weese in view of Shukla teaches the system according to claim 15. Weese does not teach wherein the processor subsystem is configured to synchronize a 3D viewpoint with respect to the 3D model upon user interaction in the 2D view comprising one of manipulating a cursor position in a slice; selecting anatomical annotations, metrics or codified clinical guidelines in the guidance panel; or to synchronize the 2D view upon change of a 3D cursor position in the 3D view by displaying a slice, anatomical annotations or metrics corresponding to the 3D cursor position.
However, Shukla teaches synchronizing a 3D viewpoint with respect to the 3D model upon user interaction in the 2D view comprising one of manipulating a cursor position in a slice; selecting anatomical annotations, metrics or codified clinical guidelines in the guidance panel (Fig. 1, Paragraph 0031, 0049 – “Information can be input to update the views and associated data. For example, 3D skeletal analysis information from a 3D digitizer can be input to display a 3D view of the patient's body with different systems identified…For example, the musculature view 131 can include a cataract indicator 135 and a shoulder muscle indicator 136. The skeletal representation 132 can include a collar bone fracture 137 and an osteoarthritis indicator 138. The circulatory view 133 can include a coronary artery blockage indicator 139, for example. By selecting a system view 131-134 and/or an indicator 135-139, a user can drill down or retrieve addition information and/or views”; Note: a user selects an indicator, which is an anatomical annotation as shown in Fig. 1, which retrieves additional 3D views; see screenshot of Fig. 1 above. In other words, the user interaction in the 2D view with the indicator causes a corresponding change in the 3D portion of the display, which is interpreted as synchronization); or synchronizing the 2D view upon change of a 3D cursor position in the 3D view by displaying a slice, anatomical annotations or metrics corresponding to the 3D cursor position (Paragraph 0042, 0054 – “Content of the presentation panes can be synchronized between any two or more panes as part of a customization pattern, and/or by explicit choice of an operator, for example. 
For purposes of example only, selection of an anatomical region (e.g., an abdominal region) on anatomical presentation pane automatically reduces a list of historical exams to only those prior exams targeted to the selected anatomical part… the representation 131-134 can be a 3D representation of a body, certain body part”; Note: the 2D view of the historical exams is synchronized to the selection of an anatomical region on the 3D anatomical structure of the patient. The historical exams, which are metrics, are implied to be part of the 2D view, as they are textual information). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Shukla to synchronize the 2D and 3D view for the benefit of being able to use the views together, so that they can supplement each other. This would enhance the user experience and increase their understanding of the condition. Regarding claim 18, Weese in view of Shukla teaches the system according to claim 15. Weese does not teach wherein the processor subsystem is configured to display the 3D anatomical model while at least one of: removing non-relevant structures from the 3D model based on the use case; making the anatomical object transparent; choosing a viewpoint that shows a suitable angle to evaluate a tumor-vessel contact; making a footprint visible to clarify the area of contact and trajectory length of a tumor-vessel contact; or adapting lighting settings for the 3D model visualization to highlight the tumor-vessel contact.
However, Shukla teaches displaying the 3D anatomical model while at least one of: removing non-relevant structures from the 3D model based on the use case; making the anatomical object transparent; choosing a viewpoint that shows a suitable angle to evaluate a tumor-vessel contact; making a footprint visible to clarify the area of contact and trajectory length of a tumor-vessel contact; or adapting lighting settings for the 3D model visualization to highlight the tumor-vessel contact (Fig. 1, Paragraph 0024, 0047 – “Certain examples provide systems and methods to display a three-dimensional (3D) view of the body with different systems shadowed, emphasized, or highlighted. The view can provide a composite shadowed view of all body systems together for a patient and can be used across a treatment timeline, such as a complete duration of a patient's stay in a hospital through various stages. The composite view can be separated into its component anatomical system views (e.g., circulatory, skeletal, organ, etc.), for example. The anatomical system view(s) can be used as a solution for various end users, such as between a patient and a triage nurse to understand the patient's present health condition…the anatomical views 131-134 can be combined into a single composite view from which the individual body system views 131-134 can be separated or isolated for viewing, for example. The anatomical views include a musculature system view 131, a skeletal system view 132, a circulatory system view 133, and an organ system view 134, for example. Views can provide 2D and/or 3D anatomical views based on actual image(s) and/or idealized anatomical representations”; Note: a 3D anatomical model is displayed and is isolated by the anatomical system based on the patient’s health condition. In other words, only the relevant anatomical system is shown).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Shukla to display the 3D anatomical model with irrelevant structures removed for the benefit of user-friendly viewing options and allowing the user to focus on a specific health problem, which may help with better understanding the condition. Regarding claim 19, Weese teaches the system according to claim 1. Weese does not teach the system comprising the display having a 2D display for displaying a guidance panel showing the anatomical annotations, metrics, codified clinical guidelines and/or a slice of the medical image data, and a 3D display for displaying a 3D anatomical model of the anatomical object. However, Shukla teaches the display having a 2D display for displaying a guidance panel showing the anatomical annotations, metrics, codified clinical guidelines and/or a slice of the medical image data (Fig. 1, Paragraph 0051 – “The manager 100 also includes one or more additional expandable windows including, for example, one or more of allergies 160, lab results 161, radiology 162, demographics 163, medications 164, problems 165, orders 166, alert review 167, etc.”; Note: Fig. 1 shows a 2D view, including 160-169, of textual information guiding the patient on their metrics; see modified screenshot of Fig. 1 above), and a 3D display for displaying a 3D anatomical model of the anatomical object (Fig. 1, Paragraph 0054 – “The anatomic representation(s) 131-134…can be a 3D representation of a body, certain body part, etc.”; Note: Fig. 1 shows a 3D view of a 3D anatomical model of the patient on the left side; see modified screenshot of Fig. 1 above).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Shukla to have a 2D view, in order “to help various end users of the system to explain diagnosis and/or treatment (e.g., between a patient who is visiting a second time and a nurse, between a physician/surgeon and a patient, etc.), explore a solution for a medical procedure (e.g., between two surgeons deciding on a medical procedure), maintain a visual history of a patient's health (e.g., hospital systems), etc.” (Shukla: Paragraph 0028). In other words, a 2D view provides text information to help users understand a medical condition. It also would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Shukla to have a 3D view because “The anatomical system view(s) can be used as a solution for various end users, such as between a patient and a triage nurse to understand the patient's present health condition, past health history, and pending a health condition which was diagnosed but not treated; for a surgeon to explore multiple medical procedures on a patient 3D visualization; between a surgeon and a patient to explain the related medical procedure which will be conducted on patient; etc. The view(s) can form a visual part of an enterprise solution system, for example” (Shukla: Paragraph 0024). In other words, a 3D view provides a clear visualization of a patient’s health. Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Weese in view of Liang et al. (US 20100316268 A1), hereinafter Liang. Regarding claim 10, Weese teaches the system according to claim 1. Weese does not teach wherein the use case is treatability based on the relation of a lesion with surrounding anatomical structures; or resectability of tumors in the vicinity of blood vessels.
However, Liang teaches wherein the use case is treatability based on the relation of a lesion with surrounding anatomical structures; or resectability of tumors in the vicinity of blood vessels (Paragraph 0005-0006 – “Liver tumor resection can be an efficient treatment method for addressing liver cancer. Before surgery, physicians need to carefully evaluate a hepatic lesion or tumor to be re-sectioned, the volume of the expected remaining liver segments, how a proposed resection is going to affect nearby vascular structures and corresponding blood supply/drainage regions, and how the resection will affect biliary systems… During the planning stage, a surgeon has to understand the spatial relationships between tumors and surrounding vessel structures”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Liang to have the use case treatability be based on the relation of a lesion with surrounding anatomical structures because the relation of a lesion with surrounding anatomical structures can determine if the lesion can be removed or not (Liang: Paragraph 0005-0006), and removing lesions is one of the treatments for cancer. If the use case is related to cancer, then it is important to consider how close the lesion is with nearby structures to avoid removing or damaging healthy cells during treatment. Claims 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Weese in view of Verdonck (US 7636460 B2), hereinafter Verdonck. Regarding claim 11, Weese teaches the system according to claim 1. Weese further teaches wherein the image data comprises three-dimensional (3D) image data (Page 8 lines 4-5, Page 9 lines 18-20, Page 13 lines 27-29 – “The client workstation may also be configured to access a patient image e.g.
from a scanner or via Picture archiving and communication system (PACS)… The mesh model may then be adapted to the volumetric image, after which the anatomical regions in the image are identified… the system is comprised in a medical workstation or medical system, such as a Computed Tomography (CT) system, Magnetic Resonance Imaging (MRI) System or Ultrasound Imaging (US) system”; Note: the image data is volumetric image from a CT, MRI, or ultrasound system, which produce 3D images); and the anatomical object is a 3D object in the 3D image data (Page 6 line 35, Page 7 lines 1-2, Page 9 lines 18-20, Page 13 lines 27-29 – “the first processor unit may be configured to identify 12c an anatomical structure in the image…The mesh model may then be adapted to the volumetric image, after which the anatomical regions in the image are identified… the system is comprised in a medical workstation or medical system, such as a Computed Tomography (CT) system, Magnetic Resonance Imaging (MRI) System or Ultrasound Imaging (US) system”; Note: it is implied that the anatomical object is a 3D object since it is part of the 3D image data). Weese does not teach wherein the selected algorithm is a slicing algorithm; and the processor subsystem is configured: to determine an anatomically relevant path via the 3D object; to execute the slicing algorithm on the image data to provide, as the display data, slices of the 3D image data that are perpendicular to the anatomically relevant path for assessing the clinical question. However, Verdonck teaches wherein the selected algorithm is a slicing algorithm (Col. 
2 lines 33-38 – “according to the method of our invention which is characterized in that, a reference direction is determined in each cross sectional slice, and the object data set is created by concatenating the cross sectional slices, each cross sectional slice orientated so that the reference directions in the cross sectional slices are aligned”; Note: this method is a slicing algorithm. The selection of the algorithm was previously taught by Weese in the rejection of claim 1); and the processor subsystem is configured: to determine an anatomically relevant path via the 3D object (Col. 1 lines 13-17, Col. 4 lines 24-27 – “Medical imaging in particular contains many examples of tortuous objects. Arteries, veins, nerves and the lower digestive tract are all examples of structures which present with a large degree of tortuosity in relation to the surrounding tissue…The curve which constitutes the axis of the tortuous structure is described at all points by a tangent vector which identifies the direction in which the curve moves”; Note: the curve of the tortuous structure is determined. The curve is equivalent to the anatomically relevant path. The tortuous structure, like an artery, is a 3D object); to execute the slicing algorithm on the image data to provide, as the display data, slices of the 3D image data that are perpendicular to the anatomically relevant path for assessing the clinical question (Col. 2 lines 34-38, Col. 4 lines 10-17, Col. 7 lines 11-12 – “a reference direction is determined in each cross sectional slice, and the object data set is created by concatenating the cross sectional slices, each cross sectional slice orientated so that the reference directions in the cross sectional slices are aligned…The result of any of these cross sectional sampling methods is a series of such slices, each including a cross section of the tortuous structure which is essentially perpendicular to the slice throughout the length along which it intersects that slice. 
As long as each cross sectional slice remains centered on the tortuous structure as it is calculated in the object data set, the included section of tortuous structure remains in the center of each cross sectional slice…The method can be easily extended to 3 dimensions”; Note: the slicing algorithm results in slices that are perpendicular to the tortuous structure, which is the anatomically relevant path). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Verdonck to have and execute a slicing algorithm for the benefit of better 3D viewing of complex anatomical structures: “the clear visualization of complex and tortuous structures within a three dimensional object data set is a difficult problem within imaging…if the structure were a branching artery, reference directions might be chosen to produce orientations in the final images in which the branching structure of the artery was rendered particularly clearly” (Verdonck: Col. 1 lines 11-13, Col. 5 lines 63-67). In other words, aligning the cross-sectional slices of the anatomical structure in a specific way, using the slicing algorithm, produces a clearer image of the structure. It also would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Verdonck to determine an anatomically relevant path via the 3D object for the benefit of aligning the cross-sectional slices for optimal viewing. The path contributes to how the slices are aligned. Regarding claim 12, Weese in view of Verdonck teaches the system according to claim 11. 
Weese does not teach wherein the slicing algorithm comprises the steps of: selecting a first position on the anatomically relevant path and extract first 3D coordinates of the first position; selecting a second position on the anatomically relevant path at a distance from the first position and extract second 3D coordinates of the second position; calculating a vector from the first position to the second position; computing a slice passing through the first position at an angle perpendicular to the vector. However, Verdonck teaches selecting a first position on the anatomically relevant path and extract first 3D coordinates of the first position (Col. 7 lines 10-14 – “The Figure shows the method as applied to 2 dimensions. The method can be easily extended to 3 dimensions. A 2 dimensional curve 701 is shown with 2 perpendicular cross sectional slices 702 and 703 crossing the curve at points 704 and 705”; Note: the curve is equivalent to the anatomically relevant path, and the point 704 is equivalent to the first position and first coordinate. The coordinate is 3D because the method is in 3D. See screenshot of Fig. 7 below); selecting a second position on the anatomically relevant path at a distance from the first position and extract second 3D coordinates of the second position (Col. 7 lines 10-14 – “The Figure shows the method as applied to 2 dimensions. The method can be easily extended to 3 dimensions. A 2 dimensional curve 701 is shown with 2 perpendicular cross sectional slices 702 and 703 crossing the curve at points 704 and 705”; Note: the curve is equivalent to the anatomically relevant path, and the point 705 is equivalent to the second position and second coordinate. The coordinate is 3D because the method is in 3D. See screenshot of Fig. 7 below); calculating a vector from the first position to the second position (Col. 7 lines 10-20 – “The Figure shows the method as applied to 2 dimensions. The method can be easily extended to 3 dimensions. 
A 2 dimensional curve 701 is shown with 2 perpendicular cross sectional slices 702 and 703 crossing the curve at points 704 and 705. A tangent vector can be defined at both these points, vector 706 being the tangent at point 704 on slice 702 and vector 707 being the equivalent at point 705 on slice 703. A representative reference direction 708, a vector, is defined within cross sectional slice 702, originating at point 704 and this is transferred to originate from point 705 in slice 703”; Note: vector 708 is calculated from point 704 to 705); computing a slice passing through the first position at an angle perpendicular to the vector (Col. 7 lines 10-23 – “The Figure shows the method as applied to 2 dimensions. The method can be easily extended to 3 dimensions. A 2 dimensional curve 701 is shown with 2 perpendicular cross sectional slices 702 and 703 crossing the curve at points 704 and 705. A tangent vector can be defined at both these points, vector 706 being the tangent at point 704 on slice 702 and vector 707 being the equivalent at point 705 on slice 703. A representative reference direction 708, a vector, is defined within cross sectional slice 702, originating at point 704 and this is transferred to originate from point 705 in slice 703. Because of the curvature of 701, the vector 708 no longer lies within the slice at 705, viz. 703. A new axis is therefore defined at point 705, being the line perpendicular to the plane containing 707 and point 704”; Note: the plane containing 707 and point 704 is equivalent to the computed slice. It is perpendicular to the vector; see screenshot of Fig. 7 below). 
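The claim 12 steps mapped above (pick a first and a second position on the path, take the vector between them, and compute a slice through the first position perpendicular to that vector) can be sketched in a few lines of NumPy. This is an illustrative reconstruction only, not code from the application or from Verdonck, and the function name `slice_through` is invented:

```python
import numpy as np

def slice_through(path, i, step=1):
    """Sketch of the claim 12 steps: extract the 3D coordinates of a first
    position on the anatomically relevant path and of a second position at a
    distance from it, calculate the vector between them, and return the plane
    through the first position whose normal is that vector (a perpendicular
    slice), as (normal, offset) satisfying normal . x = offset."""
    p1 = np.asarray(path[i], dtype=float)         # first position (3D coordinates)
    p2 = np.asarray(path[i + step], dtype=float)  # second position at a distance
    v = p2 - p1                                   # vector from first to second position
    n = v / np.linalg.norm(v)                     # slice normal = local path direction
    d = float(n @ p1)                             # plane passes through the first position
    return n, d

# A straight path along the z-axis should yield an axial slice (normal along z).
path = [(0.0, 0.0, 0.0), (0.0, 0.0, 2.0), (0.0, 0.0, 4.0)]
normal, offset = slice_through(path, 0)
```

Repeating this for successive positions along the path yields the stack of perpendicular cross-sectional slices that Verdonck concatenates into its object data set.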
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Verdonck to determine an anatomically relevant path by calculating a vector between a first and second position and computing a perpendicular slice because the vector along the path determines the direction the slice should face for alignment, and when this process repeats, it would result in Fig. 1 of Verdonck (shown below), which shows alignment for better viewing of the anatomical structure. [Screenshot of Fig. 7 (taken from Verdonck)] [Screenshot of Fig. 1 (taken from Verdonck)] Regarding claim 13, Weese in view of Verdonck teaches the system according to claim 11. Weese further teaches wherein the processor subsystem is configured: to select an algorithm from the repository for processing medical image data (Page 8 lines 17-25 and 29-31 – “The system may further comprise a second processor unit 15 configured to retrieve (15a) a set of selected image-processing algorithms that are associated with the found set of diagnoses and sample cases, and optionally located on a memory 16 or a selection unit 20. A selected image-processing algorithm may be an algorithm that may be customized from a common IP algorithm and may be adapted to the modality, to the organ and so on. 
By customized is here meant that the algorithm may be modified particularly for being used on a certain sample image, and the acquired image resulting in parameters that may be used for facilitating diagnosis, or for discrimination between different diagnoses…The selected image-processing algorithms associated with the set of diagnoses and sample cases, which are located in the medical encyclopedia, may be stored either in the medical encyclopedia or in a memory connected to the system”); to execute the algorithm to calculate clinically relevant information (Page 9 lines 6-9 and 19-21, Page 11 lines 1-2 – “the second processor 15 may further be configured to process (15b) the region of interest in the image using the retrieved selected image-processing algorithms to determine at least one diagnosis of the image…the selected image-processing algorithms is configured to calculate e.g. characteristics of lesion properties such as image appearance, or border characteristics in the image to facilitate subsequent diagnosis of the image…the graphical user interface may be visualized on a display for presenting processed information, such as determined diagnoses, etc. to a user”; Note: the selected algorithm is used on the image to output a diagnosis). Weese does not teach the “2D algorithm” and executing the “2D algorithm on a slice” from the limitations: “wherein the processor subsystem is configured: to select a 2D algorithm from the repository for processing medical image data; to execute the 2D algorithm on a slice to calculate clinically relevant information”. However, Verdonck teaches a 2D algorithm for processing medical image data (Col. 2 lines 33-38, Col. 
5 lines 63-67 – “according to the method of our invention which is characterized in that, a reference direction is determined in each cross sectional slice, and the object data set is created by concatenating the cross sectional slices, each cross sectional slice orientated so that the reference directions in the cross sectional slices are aligned…For example, if the structure were a branching artery, reference directions might be chosen to produce orientations in the final images in which the branching structure of the artery was rendered particularly clearly”; Note: this method is a 2D algorithm, as it operates on cross-sectional slices); and executing the 2D algorithm on a slice to calculate clinically relevant information (Col. 2 lines 33-38, Col. 5 lines 63-67 – “according to the method of our invention which is characterized in that, a reference direction is determined in each cross sectional slice, and the object data set is created by concatenating the cross sectional slices, each cross sectional slice orientated so that the reference directions in the cross sectional slices are aligned…For example, if the structure were a branching artery, reference directions might be chosen to produce orientations in the final images in which the branching structure of the artery was rendered particularly clearly”; Note: this method is a 2D algorithm, as it operates on cross-sectional slices. The algorithm helps with rendering anatomical structures, like arteries, which in turn helps with calculating clinical information about the anatomical structure). A person of ordinary skill in the art before the effective filing date of the claimed invention would have recognized that the 2D algorithm of Verdonck could have been substituted for the algorithm of Weese because both the algorithm and 2D algorithm serve the purpose of processing medical image data. Furthermore, a person of ordinary skill in the art would have been able to carry out the substitution. 
Finally, the substitution achieves the predictable result of providing medical information based on images. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the 2D algorithm of Verdonck for the algorithm of Weese according to known methods to yield the predictable result of providing medical information based on images. Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Weese in view of Verdonck and Gindele et al. (US 20090279756 A1), hereinafter Gindele. Regarding claim 14, Weese in view of Verdonck teaches the system according to claim 11. Weese does not teach wherein the anatomically relevant path via the 3D object is determined along an interface of the 3D object and a further anatomical structure. However, Gindele teaches wherein the anatomically relevant path via the 3D object is determined along an interface of the 3D object and a further anatomical structure (Paragraph 0041 – “The image-editing algorithm constructs a cutting plane in 3-dimensional space that includes the edit point 205 shown in FIG. 2. Line 206, shown in FIG. 3, is the projection of that 3-dimensional cutting plane. The cutting plane is used to divide the lesion region 204 (shown in FIG. 2) into two parts; potential lesion region 207 and potential vessel region 208. This division of the original lesion region 204 is performed in a 3-dimensional sense, i.e. the region is divided as a 3-dimensional object”; Note: the anatomically relevant path, shown by the line 206 in Fig. 3, is determined as a cutting plane, which is an interface of the 3D object of the lesion and the 3D object of the vessel; see screenshot of Fig. 3 below). [Screenshot of Fig. 3 (taken from Gindele)] It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Gindele to have the path be along an interface of the 3D object and another anatomical structure. In Verdonck, the path was intended for better viewing of the tortuous structure (like a blood vessel), but in cases like in Weese where there is a lesion (Page 10 lines 19-21), it would be beneficial to have the path be along the interface of the lesion and surrounding objects to have better viewing of the lesion for determining its properties and making a diagnosis. Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Weese in view of Shukla and Liang. Regarding claim 17, Weese in view of Shukla teaches the system according to claim 15. Weese does not teach wherein the processor subsystem is configured: to open an interactive display panel showing involvement of a tumor with the anatomical object and metrics regarding the involvement, or metrics regarding the involvement per slice; and to synchronize the interactive display panel, the 2D view and the 3D view. However, Liang teaches opening an interactive display panel showing involvement of a tumor with the anatomical object and metrics regarding the involvement, or metrics regarding the involvement per slice (Paragraph 0040-0041 – “An exemplary embodiment for enabling a user to define a safety margin will now be described, with reference to FIG. 2a, which shows a representation of a safety margin indicator 201 surrounding a lesion or tumor 204 of an anatomical structure 205 (e.g., a liver), depicted in 3D space (i.e., in region 102 of FIG. 1). 
A safety margin can be useful to define an area for resection, for example, that includes both the lesion or tumor 204 and an additional buffer area around it for additional precautionary reasons…the size and shape of the safety margin indicator 201 can be specified and adjusted by the user 821 by way of a user interface”; Note: there is an interactive display (user interface) for showing a tumor, surrounding anatomical objects, and a safety indicator, which is a type of metric regarding the involvement of the tumor and the other anatomical objects); and to synchronize the interactive display panel, the 2D view and the 3D view (Fig. 1, Paragraph 0040-0041 – “An exemplary embodiment for enabling a user to define a safety margin will now be described, with reference to FIG. 2a, which shows a representation of a safety margin indicator 201 surrounding a lesion or tumor 204 of an anatomical structure 205 (e.g., a liver), depicted in 3D space (i.e., in region 102 of FIG. 1)…the size and shape of the safety margin indicator 201 can be specified and adjusted by the user 821 by way of a user interface… The safety margin indicator 201 is displayable in both 3D and 2D in the regions 102 and 103, respectively, and the size, shape, and/or location of the indicator 201, where adjusted by the user (via, e.g., the user interface 804), are updated in the display region(s) substantially instantaneously by the system 800”; Note: edits by the user are updated in the display. This means that the interactive display panel, which is the user interface, is synchronized with the 2D view and 3D view in regions 102 and 103 respectively, in Fig. 1). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Liang to have an interactive display panel showing a tumor with an anatomical object and their metrics because “The foregoing capabilities for adjusting the safety margin indicator 201 can be beneficial when, for example, multiple lesions/tumors, requiring differently sized and/or shaped safety margin indicators, are to be re-sectioned in a single operative setting. Real time visual feedback provided by the system 800 is especially useful when the user interactively adjusts the size or the like of the safety margin indicator 201 as described above” (Liang: paragraph 0041). In other words, displaying a tumor in its own interactive panel provides different views and annotations of the tumor and may help the user in determining the diagnosis and treatability. It also would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Weese to incorporate the teachings of Liang to synchronize the interactive display and 2D and 3D views for the benefit of being able to use all of the views together, so that they can supplement each other. This would enhance the user experience and increase their understanding of the condition. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Kruger et al. (US 20230072095 A1) teaches a method of providing a clinical decision based on how clinical findings from a patient’s images compare to clinical guidelines. Evans et al. (US 20210398676 A1) teaches a method of using a machine learning model to determine a medical condition state based on patient image data. Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHELLE HAU MA whose telephone number is (571)272-2187. 
The examiner can normally be reached M-Th 7-5:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, King Poon can be reached at (571) 270-0728. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /MICHELLE HAU MA/ Examiner, Art Unit 2617 /KING Y POON/Supervisory Patent Examiner, Art Unit 2617

Prosecution Timeline

May 13, 2024
Application Filed
Jan 13, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602750
DIFFERENTIABLE EMULATION OF NON-DIFFERENTIABLE IMAGE PROCESSING FOR ADJUSTABLE AND EXPLAINABLE NON-DESTRUCTIVE IMAGE AND VIDEO EDITING
2y 5m to grant Granted Apr 14, 2026
Patent 12597208
BUILDING INFORMATION MODELING SYSTEMS AND METHODS
2y 5m to grant Granted Apr 07, 2026
Patent 12573217
SERVER, METHOD AND COMPUTER PROGRAM FOR GENERATING SPATIAL MODEL FROM PANORAMIC IMAGE
2y 5m to grant Granted Mar 10, 2026
Patent 12561851
HIGH-RESOLUTION IMAGE GENERATION USING DIFFUSION MODELS
2y 5m to grant Granted Feb 24, 2026
Patent 12536734
Dynamic Foveated Point Cloud Rendering System
2y 5m to grant Granted Jan 27, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
81%
Grant Probability
99%
With Interview (+36.4%)
2y 7m
Median Time to Grant
Low
PTA Risk
Based on 21 resolved cases by this examiner. Grant probability derived from career allow rate.
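The grant probability figure is easy to sanity-check; assuming, as stated, that the 81% is just the rounded ratio of granted to resolved cases in the examiner's career history (17 of 21), the arithmetic is:

```python
granted, resolved = 17, 21        # from the examiner's career history above
allow_rate = granted / resolved   # career allow rate
print(f"{allow_rate:.1%}")        # prints 81.0%
```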
