DETAILED ACTION
Priority
Acknowledgment is made of applicant’s claim for priority. The certified copy has been filed in parent Application No. JP 2022-077138, filed on May 5, 2022.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on April 27, 2023 is being considered by the examiner.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 2, 6, and 9-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.
Step 1
The claims recite subject matter within a statutory category as a process, machine, and/or article of manufacture. However, it will be shown in the following steps that claims 1, 2, 6, and 9-20 are nonetheless unpatentable under 35 U.S.C. 101.
Step 2A Prong One
Claim 18 states:
A medical support method comprising:
acquiring a real-time image obtained by imaging the inside of an organ with an endoscope,
executing an image recognition process on the real-time image so as to determine whether or not a puncture needle is depicted in the real-time image, thereby acquiring a determination result,
generating, on the basis of at least one of the determination result or a voice instruction received by a microphone, collection-related information which is information related to a situation in which tissues at each of a plurality of the first positions in the organ have been collected, and
display, on a screen of a display device, the collection-related information, superimposed as at least one of a symbol, character color or numeral, on an anatomical image corresponding to the first positions,
wherein the collection-related information includes collection number information which is information indicating the number of times the tissues have been collected at each of the first positions, and
wherein the collection number information includes collection identification information which is information for identifying each of a plurality of collections of the tissues that have been performed at the first position.
The broadest reasonable interpretation of these steps includes mental processes and/or organizing human activity because each bolded component can practically be performed in the human mind or with pen and paper. Other than reciting generic computer components like “microphone”, “display” or “processor”, nothing in the claims precludes the bold-font portions from practically being performed by a person. For example, but for the “microphone” language, “generate, on the basis of the determination result or a voice instruction received … collection-related information which is information related to a situation in which tissues at each of a plurality of the first positions in the organ have been collected,” in the context of this claim encompasses managing the personal behavior of a user distinguishing various samples while taking biopsies from a patient in the operating room. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the “Mental Processes” or “Organizing Human Activity” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.
executing an image recognition process on the real-time image so as to determine whether or not a puncture needle is depicted in the real-time image, thereby acquiring a determination result
generating, on the basis of at least one of the determination result … collection-related information which is information related to a situation in which tissues at each of a plurality of the first positions in the organ have been collected, and
as drafted, could describe a physician tracking the quality, quantity, and location of biopsies in a surgery by highlighting various information on images taken in a typical surgery. Therefore, under the broadest reasonable interpretation, these steps include multiple abstract ideas, which will be treated as a single abstract idea moving forward.
Independent claims 1 and 20 cover similar steps of acquiring a medical image, generating collection-related information, and outputting the collection-related information. These claims fall under the same category of an abstract idea and follow the same rationale as claim 18.
Dependent claims recite additional subject matter which further narrows or defines the abstract idea embodied in the claims (such as claim 16, reciting particular aspects of how “wherein the number of times the tissues are collected which is indicated by the collection-related information is displayed at the second position on a display device” may be performed in the mind but for recitation of generic computer components).
Dependent claims 6, 9-12 and 14-16 add additional elements to their parent claims; these elements will be further analyzed in the following steps for integration of the abstract idea into a practical application.
Step 2A Prong Two
This judicial exception of “Mental Processes” or “Organizing Human Activity” is not integrated into a practical application. Independent claim 1’s system recites additional elements such as a processor, microphone, endoscope, and display. The processor will be treated as a generic computer component. The remaining additional elements will be analyzed for conventionality in step 2B. In particular, these additional elements do not integrate the abstract idea into a practical application because the additional elements:
amount to mere instructions to apply an exception (such as the recitation of “a processor, wherein, the processor is configured to”, which amounts to invoking computers as a tool to perform the abstract idea; see applicant’s specification [0142], “An example of the processor is a CPU which is a general-purpose processor that executes software”; see MPEP 2106.05(f))
add insignificant extra-solution activity to the abstract idea (such as the recitation of “acquire a real-time image obtained by imaging the inside of an organ with an endoscope,” which amounts to mere data gathering, and the recitation of “a voice instruction received by a microphone” or “display, on a screen of a display device, the collection-related information, superimposed as at least one of a symbol, character color or numeral, on an anatomical image corresponding to the first positions wherein the collection-related information includes collection number information which is information indicating the number of times the tissues have been collected at each of the first positions, and wherein the collection number information includes collection identification information which is information for identifying each of a plurality of collections of the tissues that have been performed at the first position,” which amounts to insignificant extra-solution activity; see MPEP 2106.05(g))
Dependent claims recite additional subject matter which amounts to limitations consistent with the additional elements in the independent claims. For instance, dependent claims 9 and 14 add the additional elements of an ultrasound image and a receiving device to their parent claims. Additionally, claim 9, “wherein the real-time image is at least one of an optical image or an ultrasound image”, amounts to invoking computers as a tool to perform the abstract idea; claim 10, “wherein the collection-related information is generated in a case in which the collection of the tissues at the first positions is performed in the real-time image”, claim 11, “wherein the collection-related information is associated with the real-time image obtained for each of the plurality of first positions”, claim 12, “wherein the collection-related information is generated on the basis of an instruction received by a receiving device”, and claim 14, “wherein the instruction received by the receiving device is an instruction received in a case in which the collection of the tissues is detected on the basis of a medical image showing the collection of the tissues at the first positions”, add insignificant extra-solution activity to the abstract idea which amounts to mere data gathering; claim 15, “wherein the collection state of the tissues indicated by the collection-related information corresponding to the first position is displayed at each second position which is a display position corresponding to the first position in an organ image which is an image showing the organ”, and claim 16, “wherein the number of times the tissues are collected which is indicated by the collection-related information is displayed at the second position on a display device”, amount to necessary data outputting (see MPEP 2106.05(g)); and claim 6, “wherein the collection identification information is associated with a pathological diagnosis result obtained by the collection of the tissues for each of the plurality of collections of the tissues at the first positions”, amounts to insignificant extra-solution activity. Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation and do not impose a meaningful limit to integrate the abstract idea into a practical application.
The remaining dependent claims 2, 8, 13, 17, and 19 do not recite additional elements or activity but further narrow or define the abstract idea embodied in the claims and hence also do not integrate the aforementioned abstract idea into a practical application.
Step 2B
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements amount to no more than mere instructions to apply an exception and insignificant extra-solution activity added to the abstract idea. Additionally, the additional limitations amount to elements that have been recognized as well-understood, routine, and conventional activity in particular fields.
The independent claims recite an additional element of an endoscope. Fukami et al. (Pat. 3670721) demonstrates in Fig. 1 “a general side view of a conventional endoscope” and shows that endoscopes were conventional long before the priority date of the claimed invention. As such, this additional element, individually and in combination with the other additional elements, does not amount to significantly more.
As previously noted, the claim recites an additional element of a microphone. Sebire et al. (Pat. 6665289) demonstrates in paragraph (5), “A user interface may include a conventional earphone or speaker 17, a conventional microphone”, that microphones were conventional long before the priority date of the claimed invention. As such, this additional element, individually and in combination with the other additional elements, does not amount to significantly more.
As previously noted, the claim recites an additional element of a display device. Mault (US 20030226695) demonstrates in paragraph [0104], “the display can be used to provide visual representations of diet points consumed and expenditure points expended, and can be any conventional display”, that displays were conventional long before the priority date of the claimed invention. As such, this additional element, individually and in combination with the other additional elements, does not amount to significantly more.
To elaborate:
“acquire a real-time image obtained by imaging the inside of an organ with an endoscope” is equivalent to receiving or transmitting data over a network, Symantec, MPEP 2106.05(d)(II)(i);
“display, on a screen of a display device, the collection-related information, superimposed as at least one of a symbol, character color or numeral, on an anatomical image corresponding to the first positions wherein the collection-related information includes collection number information which is information indicating the number of times the tissues have been collected at each of the first positions, and wherein the collection number information includes collection identification information which is information for identifying each of a plurality of collections of the tissues that have been performed at the first position” is equivalent to receiving or transmitting data over a network, Symantec, MPEP 2106.05(d)(II)(i);
“a voice instruction received by a microphone” is equivalent to receiving or transmitting data over a network, Symantec, MPEP 2106.05(d)(II)(i).
Dependent claims recite additional subject matter which, as discussed above with respect to integration of the abstract idea into a practical application, amounts to invoking computers as a tool to perform the abstract idea, and which amounts to limitations consistent with the additional elements in the independent claims. These additional limitations amount to elements that have been recognized as well-understood, routine, and conventional activity in particular fields.
As previously noted, the claim recites an additional element of a receiving device. Sebire et al. (Pat. 6665289) demonstrates in paragraph (5), “A user interface may include a conventional earphone or speaker 17, a conventional microphone”, that a receiving device was conventional long before the priority date of the claimed invention. As such, this additional element, individually and in combination with the other additional elements, does not amount to significantly more.
As previously noted, the claim recites an additional element of an ultrasound image. Wen et al. (Pat. 6645144) demonstrates in paragraph (8), “For example, conventional ultrasound images of biological specimens primarily reveal density variations in the specimen”, that ultrasound images were conventional long before the priority date of the claimed invention. As such, this additional element, individually and in combination with the other additional elements, does not amount to significantly more.
To elaborate:
claim 6 “wherein the collection identification information is associated with a pathological diagnosis result obtained by the collection of the tissues for each of the plurality of collections of the tissues at the first positions” is equivalent to arranging a hierarchy of groups and sorting information, Versata Dev. Group, Inc. v. SAP Am., Inc., MPEP 2106.05(d)(II)(vi);
claim 10 “wherein the collection-related information is generated in a case in which the collection of the tissues at the first positions is performed in the real-time image” is equivalent to arranging a hierarchy of groups and sorting information, Versata Dev. Group, Inc. v. SAP Am., Inc., MPEP 2106.05(d)(II)(vi);
claim 11 “wherein the collection-related information is associated with the real-time image obtained for each of the plurality of first positions” is equivalent to arranging a hierarchy of groups and sorting information, Versata Dev. Group, Inc. v. SAP Am., Inc., MPEP 2106.05(d)(II)(vi);
claim 12 “wherein the collection-related information is generated on the basis of an instruction received by a receiving device” is equivalent to receiving or transmitting data over a network, Symantec, MPEP 2106.05(d)(II)(i);
claim 14 “wherein the instruction received by the receiving device is an instruction received in a case in which the collection of the tissues is detected on the basis of a medical image showing the collection of the tissues at the first positions” is equivalent to receiving or transmitting data over a network, Symantec, MPEP 2106.05(d)(II)(i);
claim 15 “wherein the collection state of the tissues indicated by the collection-related information corresponding to the first position is displayed at each second position which is a display position corresponding to the first position in an organ image which is an image showing the organ” is equivalent to receiving or transmitting data over a network, Symantec, MPEP 2106.05(d)(II)(i);
claim 16 “wherein the number of times the tissues are collected which is indicated by the collection-related information is displayed at the second position on a display device” is equivalent to receiving or transmitting data over a network, Symantec, MPEP 2106.05(d)(II)(i).
Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 2, 6, and 9-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wagner et al. (WO2022035709).
Regarding claim 1, Wagner teaches:
A medical support device comprising: a processor, ([0004] “processors of a computing system” where the computing system is a medical support device)
wherein the processor is configured to: acquire a real-time image obtained by imaging the inside of an organ with an endoscope, ([0057] “the endoscopic imaging system 509 includes one or more image capture devices (not shown) that record endoscopic image data that includes concurrent or real-time images (e.g., video, still images, etc.) of patient anatomy.” Where the endoscopic imaging system [comprising, imaging the inside of an organ with an endoscope] uses the processor to acquire real-time images of patient anatomy [comprising, of an organ]; see also [0061] “The display system 510 can display various images or representations of patient anatomy and/or of the medical instrument system 504 that are generated by the positional sensor system 508, by the endoscopic imaging system 509” and [Fig 11’s image of lungs] where the medical instrument system is displayed in the navigational image as it performs a procedure within a patient’s organ system)
execute an image recognition process on the real time image so as to determine whether or not a puncture needle is depicted in the real-time image, thereby acquiring a determination result ([0060-0061] “visualization system 515 of the control system 512 provides navigation and/or anatomy-interaction assistance to the operator 505 when controlling the medical instrument system 504 during an image-guided medical procedure…. The display system 510 can display various images or representations of patient anatomy and/or of the medical instrument system 504 that are generated by the positional sensor system 508, by the endoscopic imaging system 509, by the imaging system 518”; see also [0067] “Thus, the medical instrument 632 can include image capture probes, biopsy instruments or devices (e.g., biopsy needles)… having one or more image capture devices 647 positioned at a distal portion 637 of and/or at other locations along the medical instrument 632. In these embodiments, an image capture device 647 can capture one or more real navigational images” and [0053] “The positional data can also be used to monitor the progress of the biopsy procedure, track which lymph node sites have or have not been biopsied, or otherwise provide instructions and/or feedback to assist the operator in performing the procedure. 
For example, the operator can be alerted if a lymph node site was missed, if a lymph node site was biopsied out of sequence, if the biopsy device is no longer on the correct path, etc.” where the system uses a visualization system providing anatomy interaction assistance during an image guided medical procedure [comprising image recognition processes on the real time image] that provides feedback indicating that one or more lymph sites were biopsied [which comprises determining whether or not a puncture needle is depicted in the real time image] using biopsy instruments [comprising, a puncture needle]; see optionally [0057] “In some embodiments, the positional sensor system 508 includes … a shape sensor system for capturing positional sensor data (e.g., position, orientation, speed, velocity, pose, shape, etc.) of the medical instrument system 504.” And [0060] “The virtual visualization system 515 then registers the anatomic model to positional sensor data generated by the positional sensor system 508 and/or to endoscopic image data generated by the endoscopic imaging system 509 to (i) map the tracked position, orientation, pose, shape, and/or movement of the medical instrument system 504 within the anatomic region to a correct position within the anatomic mode” where using endoscopic image data to map the tracked position, shape and movement comprises determining whether a puncture needle is depicted in the real-time image)
generate, on the basis of at least one of the determination result or a voice instruction received by a microphone, collection-related information which is information related to a situation in which tissues at each of a plurality of the first positions in the organ have been collected, and ([0031] “The lymph node segmentation procedure can be performed in various ways, such as automatically (e.g., without requiring any operator input to identify the lymph nodes).”; [0039] “Any suitable number and combination of lymph node sites can be selected. In some embodiments, step 230 involves selecting at least one, two, three, four, five, or more different lymph node sites to be biopsied (e.g., at least one, two, three, four, five, or more different lymph node stations). Optionally, step 230 can involve selecting a certain number of lymph nodes per station to be biopsied (e.g., at least one, two, three, or more lymph nodes)”; see also [0041] “At step 240, a sequence for biopsying the selected lymph node sites is determined.
The sequence can indicate the order in which the selected lymph node sites should be biopsied during the procedure.” see also [0060] “The virtual visualization system 515 then registers the anatomic model to positional sensor data generated by the positional sensor system 508 and/or to endoscopic image data generated by the endoscopic imaging system 509 to (i) map the tracked position, orientation, pose, shape, and/or movement of the medical instrument system 504 within the anatomic region to a correct position within the anatomic mode” and [0068] “after all or a portion of the medical procedure at the target location is complete, the medical instrument 632 can be retracted back into the elongate device” where automatic selection and segmentation of lymph node sites [comprising, generating, on the basis of a determination result, collection related information] utilizes the registration of the position of the medical instrument to determine the completion of the medical procedure [comprising, information related to a situation in which tissues at each of a plurality of the first positions in the organ have been collected])
display, on a screen of a display device, the collection-related information, ([0046] “Subsequently, during the biopsy procedure, the 3D model, selected lymph node sites, and/or biopsy sequence can be displayed to the operator (e.g., via a graphical user interface)”) superimposed as at least one of a symbol, character color or numeral, on an anatomical image corresponding to the first positions. ([0051-0052] “At step 420, the system displays instructions for navigating the biopsy device to one or more lymph node sites. The lymph node sites can be selected during preoperative planning for the biopsy procedure, as previously described with respect to the method 200 of FIG. 2. During the biopsy procedure, the system can output a graphical representation of the lymph node sites via a suitable graphical user interface. For example, the lymph node sites can be rendered as opaque objects within the 3D model, while other anatomic structures of the model (e.g., airways, vessels) may be rendered as transparent or semi-transparent objects so that the lymph node sites remain visible. As another example, the lymph node sites can be displayed as points or images on a 2D map of the anatomic region. Alternatively or in combination, the system can output textual, audio, or other instructions that direct the operator to navigate the biopsy device to the selected lymph node sites. For instance, the system can instruct the operator to biopsy lymph nodes within certain lymph node stations, biopsy lymph nodes located within particular anatomic zones or regions of the anatomy, and so on. In some embodiments, step 420 also includes displaying instructions for biopsying the selected lymph node sites according to a specified sequence. The sequence can be determined during preoperative planning for the biopsy procedure, as previously described with respect to the method 200 of FIG. 2.
The sequence can be output to the operator in various ways, such as via the same graphical user interface used to display the lymph node sites. For example, the sequence can be graphically represented as a path that connects the selected lymph node sites in the desired order, as discussed above with respect to the method 200 of FIG. 2. The path can be overlaid onto the model and/or images of the actual patient anatomy to provide visual guidance as the operator navigate the biopsy device within the anatomic region. Alternatively or in combination, the system can output textual, audio, or other instructions that direct the operator to biopsy lymph node sites in a particular order and/or direct the operator to navigate the biopsy device along the path (e.g., in a particular direction and/or for a particular distance, with respect to particular anatomic landmarks, etc.).” and [0053] “At step 430, the method 400 displays positional data of the biopsy device relative to the selected lymph node sites... The positional data can also be used to monitor the progress of the biopsy procedure, track which lymph node sites have or have not been biopsied, or otherwise provide instructions and/or feedback to assist the operator in performing the procedure. For example, the operator can be alerted if a lymph node site was missed, if a lymph node site was biopsied out of sequence, if the biopsy device is no longer on the correct path, etc.” where overlaying the lymph node sites in an opaque manner on the image of the patient anatomy and adjusting the graphical user interface based on the sequence of lymph node sites comprises collection-related information superimposed on an anatomical image corresponding to the first position)
wherein the collection-related information includes collection number information which is information indicating the number of times the tissues have been collected at each of the first positions, and ([0051-0053] above; see also [0039] “Any suitable number and combination of lymph node sites can be selected. In some embodiments, step 230 involves selecting at least one, two, three, four, five, or more different lymph node sites to be biopsied (e.g., at least one, two, three, four, five, or more different lymph node stations). Optionally, step 230 can involve selecting a certain number of lymph nodes per station to be biopsied (e.g., at least one, two, three, or more lymph nodes)” where outputting a graphical representation of the lymph node sites via a suitable graphical user interface by images or textual information comprises collection number information)
wherein the collection number information includes collection identification information which is information for identifying each of a plurality of collections of the tissues that have been performed at the first position. ([0051-0053] above; see also [0041] “the biopsy sequence can include sampling N3 nodes before N2 nodes, and N2 nodes before N1 nodes. As another example, the biopsy sequence can include sampling peripheral lymph node sites before central lymph node sites.”, where outputting textual, audio, or other instructions that direct the operator to navigate the biopsy device to the selected lymph node sites [comprises collection identification information] can use positional data to monitor whether the lymph node sites have been biopsied [comprising identifying each of a plurality of collections of the tissues that have been performed at the first positions])
Regarding claim 2, Wagner teaches all of the limitations of claim 1. Wagner also teaches:
wherein the collection-related information includes collection completion information which is information indicating completion of the collection of the tissues at each of the first positions. ([0051-0052]; see also [0053] “The position of the biopsy device can be graphically represented as an object within the 3D anatomic model so the operator can visualize the location of the biopsy device relative to the lymph node sites and/or the planned path. The positional data can also be used to monitor the progress of the biopsy procedure, track which lymph node sites have or have not been biopsied, or otherwise provide instructions and/or feedback to assist the operator” where collection completion data is created using positional data to denote collection completion at each lymph node position on an anatomical model)
Regarding claim 6, Wagner teaches all of the limitations of claim 1. Wagner also teaches:
wherein the collection identification information is associated with a pathological diagnosis result obtained by the collection of the tissues for each of the plurality of collections of the tissues at the first positions. ([0054] “In embodiments where the operator has immediate access to the biopsy results … the system can omit lymph node sites that are downstream of a negative biopsy site.” Where the collection identification information includes a pathological diagnosis result for each tissue sample and/or position)
Regarding claim 9, Wagner teaches all of the limitations of claim 1. Wagner also teaches:
wherein the real-time image is at least one of an optical image or an ultrasound image. ([0027] “The image data can include, for example, computed tomography (CT) data… ultrasound data” where this image data is the precursor to lymph node segmentation)
Regarding claim 10, Wagner teaches all of the limitations of claim 1. Wagner also teaches:
wherein the collection-related information is generated ([0031] “For example, automatic lymph node segmentation can be performed using a machine learning algorithm, such as a deep learning algorithm … that has been trained to identify and segment individual lymph nodes from CT scans or other image data of the patient anatomy.” where the collection-related data bases the generation of the information at the first collection position on the imaging and positional information of the instrument) in a case in which the collection of the tissues at the first positions is performed in the real-time image. ([0057] “In these and other embodiments, the endoscopic imaging system 509 includes one or more image capture devices (not shown) that record endoscopic image data that includes concurrent or real-time images (e.g., video, still images, etc.) of patient anatomy.” where the endoscopic imaging system provides information for lymph node segmentation; see optionally [0084] “During the segmentation process, pixels or voxels generated from the image data 1080 may be partitioned into segments or elements or be tagged to indicate that they share certain characteristics or computed properties such as color, density, intensity, and texture. The segments or elements may then be converted to an anatomic model and/or to an image point cloud of the medical instrument system 504.” [Figure 9] “a real navigational image of real patient anatomy from a viewpoint of the portion of the medical instrument system” is a source of image data for automatic lymph node segmentation described above)
Regarding claim 11, Wagner teaches all of the limitations of claim 1. Wagner also teaches:
wherein the collection-related information is associated with the real-time image obtained for each of the plurality of first positions. ([0082] “each real navigational image captured by the endoscopic imaging system 509 can be associated with a time stamp and/or a position recorded in the medical instrument frame of reference” where the position is where a biopsy is taken)
Regarding claim 12, Wagner teaches all of the limitations of claim 1. Wagner also teaches:
wherein the collection-related information is generated on the basis of an instruction received by a receiving device. ([0032] “the operator can select one or more locations in the image data that include lymph nodes and/or correspond to lymph node stations, and the computing system can analyze the selected locations to identify and segment the lymph nodes at those locations. In some embodiments, the operator provides input indicating the selected locations (e.g., via a suitable graphical user interface), such as by clicking or otherwise marking a point corresponding to a lymph node, drawing a boundary around edges or surfaces of a lymph node, selecting areas of the image data including lymph nodes and/or lymph node stations, or any other suitable process. The system can then use the input from the operator as a starting point for automatically detecting one or more lymph nodes in the image data.” Where the graphical user interface [comprising, a receiving device] provides inputs [comprising, gathers instructions] to segment lymph nodes [comprising, to generate collection related information])
Regarding claim 13, Wagner teaches all of the limitations of claim 12. Wagner also teaches:
wherein the instruction received by the receiving device (see “graphical user interface” above) is an instruction (see “provides inputs” above) with respect to a result generated on the basis of a medical image showing the collection of the tissues at the first positions. ([0032] “the operator can select one or more locations in the image data that include lymph nodes and/or correspond to lymph node stations, and the computing system can analyze the selected locations to identify and segment the lymph nodes at those locations” where segmenting the lymph nodes occurs on the basis of the image data)
Regarding claim 14, Wagner teaches all of the limitations of claim 12. Wagner also teaches:
wherein the instruction received by the receiving device (see “graphical user interface” above) is an instruction received in a case in which the collection of the tissues is detected (see “identify and segment the lymph nodes in the image data” above) on the basis of a medical image showing the collection of the tissues at the first positions. ([0053] “The positional data can also be used to monitor the progress of the biopsy procedure, track which lymph node sites have or have not been biopsied, or otherwise provide instructions and/or feedback to assist the operator in performing the procedure.” where monitoring the progress of the biopsy procedure based on the positional data comprises detecting the collection of tissues on the basis of a medical image)
Regarding claim 15, Wagner teaches all of the limitations of claim 1. Wagner also teaches:
wherein the collection state of the tissues indicated by the collection-related information corresponding to the first position (see “positional data”, referenced above) is displayed at each second position which is a display position corresponding to the first position in an organ image which is an image showing the organ. ([Figure 11] shows multiple display windows depicting collection state information at multiple positions on an organ image; [0085] “the display system… can display various images or representations of patient anatomy and/or of the medical instrument system based, at least in part, on data captured and/or generated”)
Regarding claim 16, Wagner teaches all of the limitations of claim 15. Wagner also teaches:
wherein the number of times the tissues are collected which is indicated by the collection-related information is displayed at the second position on a display device. ([0046] “during the biopsy procedure, the 3D model, selected lymph node sites, and/or biopsy sequence can be displayed to the operator (e.g., via a graphical user interface)” where the biopsy sequence denotes the number of times the tissue is collected at each of the positions)
Regarding claim 17, Wagner teaches all of the limitations of claim 1. Wagner also teaches:
wherein an order in which the tissues are collected at the plurality of first positions is predetermined. ([0046] “The output of the method 200 (e.g., the 3D model, selected lymph node sites, biopsy sequence, and/or path) can be saved (e.g., as one or more digital files) as part of a plan for the biopsy procedure” where the sequence is predetermined)
Regarding claim 18, Wagner teaches:
A medical support method comprising: ([0049] “For example, all or a subset of the steps of the method can be implemented by a control system of a medical instrument system or device” where the method comprises using an endoscopic imaging system)
acquiring a real-time image obtained by imaging the inside of an organ with an endoscope, ([0057] “the endoscopic imaging system 509 includes one or more image capture devices (not shown) that record endoscopic image data that includes concurrent or real-time images (e.g., video, still images, etc.) of patient anatomy.” Where the endoscopic imaging system [comprising, imaging the inside of an organ with an endoscope] uses the processor to acquire real-time images of patient anatomy [comprising, of an organ]; see also [0061] “The display system 510 can display various images or representations of patient anatomy and/or of the medical instrument system 504 that are generated by the positional sensor system 508, by the endoscopic imaging system 509” and [Fig 11’s image of lungs] where the medical instrument system is displayed in the navigational image as it performs a procedure within a patient’s organ system)
executing an image recognition process on the real time image so as to determine whether or not a puncture needle is depicted in the real- time image, thereby acquiring a determination result ([0060-0061] “visualization system 515 of the control system 512 provides navigation and/or anatomy-interaction assistance to the operator 505 when controlling the medical instrument system 504 during an image-guided medical procedure…. The display system 510 can display various images or representations of patient anatomy and/or of the medical instrument system 504 that are generated by the positional sensor system 508, by the endoscopic imaging system 509, by the imaging system 518”; see also [0067] “Thus, the medical instrument 632 can include image capture probes, biopsy instruments or devices (e.g., biopsy needles)… having one or more image capture devices 647 positioned at a distal portion 637 of and/or at other locations along the medical instrument 632. In these embodiments, an image capture device 647 can capture one or more real navigational images” and [0053] “The positional data can also be used to monitor the progress of the biopsy procedure, track which lymph node sites have or have not been biopsied, or otherwise provide instructions and/or feedback to assist the operator in performing the procedure. 
For example, the operator can be alerted if a lymph node site was missed, if a lymph node site was biopsied out of sequence, if the biopsy device is no longer on the correct path, etc.” where the system uses a visualization system providing anatomy interaction assistance during an image guided medical procedure [comprising image recognition processes on the real time image] that provides feedback indicating that one or more lymph sites were biopsied [which comprises determining whether or not a puncture needle is depicted in the real time image] using biopsy instruments [comprising, a puncture needle]; see optionally [0057] “In some embodiments, the positional sensor system 508 includes … a shape sensor system for capturing positional sensor data (e.g., position, orientation, speed, velocity, pose, shape, etc.) of the medical instrument system 504.” And [0060] “The virtual visualization system 515 then registers the anatomic model to positional sensor data generated by the positional sensor system 508 and/or to endoscopic image data generated by the endoscopic imaging system 509 to (i) map the tracked position, orientation, pose, shape, and/or movement of the medical instrument system 504 within the anatomic region to a correct position within the anatomic mode” where using endoscopic image data to map the tracked position, shape and movement comprises determining whether a puncture needle is depicted in the real-time image)
generating, on the basis of at least one of the determination result or a voice instruction received by a microphone, collection-related information which is information related to a situation in which tissues at each of a plurality of the first positions in the organ have been collected, ([0031] “The lymph node segmentation procedure can be performed in various ways, such as automatically (e.g., without requiring any operator input to identify the lymph nodes).”; [0039] “Any suitable number and combination of lymph node sites can be selected. In some embodiments, step 230 involves selecting at least one, two, three, four, five, or more different lymph node sites to be biopsied (e.g., at least one, two, three, four, five, or more different lymph node stations). Optionally, step 230 can involve selecting a certain number of lymph nodes per station to be biopsied (e.g., at least one, two, three, or more lymph nodes)”; see also [0041] “At step 240, a sequence for biopsying the selected lymph node sites is determined.
The sequence can indicate the order in which the selected lymph node sites should be biopsied during the procedure.” see also [0060] “The virtual visualization system 515 then registers the anatomic model to positional sensor data generated by the positional sensor system 508 and/or to endoscopic image data generated by the endoscopic imaging system 509 to (i) map the tracked position, orientation, pose, shape, and/or movement of the medical instrument system 504 within the anatomic region to a correct position within the anatomic mode” and [0068] “after all or a portion of the medical procedure at the target location is complete, the medical instrument 632 can be retracted back into the elongate device” where automatic selection and segmentation of lymph node sites [comprising, generating, on the basis of a determination result, collection related information] utilizes the registration of the position of the medical instrument to determine the completion of the medical procedure [comprising, information related to a situation in which tissues at each of a plurality of the first positions in the organ have been collected])
display, on a screen of a display device, the collection-related information ([0046] “Subsequently, during the biopsy procedure, the 3D model, selected lymph node sites, and/or biopsy sequence can be displayed to the operator (e.g., via a graphical user interface)”) superimposed as at least one of a symbol, character color or numeral, on an anatomical image corresponding to the first positions. ([0051-0052] “At step 420, the system displays instructions for navigating the biopsy device to one or more lymph node sites. The lymph node sites can be selected during preoperative planning for the biopsy procedure, as previously described with respect to the method 200 of FIG. 2. During the biopsy procedure, the system can output a graphical representation of the lymph node sites via a suitable graphical user interface. For example, the lymph node sites can be rendered as opaque objects within the 3D model, while other anatomic structures of the model (e.g., airways, vessels) may be rendered as transparent or semi-transparent objects so that the lymph node sites remain visible. As another example, the lymph node sites can be displayed as points or images on a 2D map of the anatomic region. Alternatively or in combination, the system can output textual, audio, or other instructions that direct the operator to navigate the biopsy device to the selected lymph node sites. For instance, the system can instruct the operator to biopsy lymph nodes within certain lymph node stations, biopsy lymph nodes located within particular anatomic zones or regions of the anatomy, and so on. In some embodiments, step 420 also includes displaying instructions for biopsying the selected lymph node sites according to a specified sequence. The sequence can be determined during preoperative planning for the biopsy procedure, as previously described with respect to the method 200 of FIG. 2. 
The sequence can be output to the operator in various ways, such as via the same graphical user interface used to display the lymph node sites. For example, the sequence can be graphically represented as a path that connects the selected lymph node sites in the desired order, as discussed above with respect to the method 200 of FIG. 2. The path can be overlaid onto the model and/or images of the actual patient anatomy to provide visual guidance as the operator navigate the biopsy device within the anatomic region. Alternatively or in combination, the system can output textual, audio, or other instructions that direct the operator to biopsy lymph node sites in a particular order and/or direct the operator to navigate the biopsy device along the path (e.g., in a particular direction and/or for a particular distance, with respect to particular anatomic landmarks, etc.).” and [0053] “At step 430, the method 400 displays positional data of the biopsy device relative to the selected lymph node sites... The positional data can also be used to monitor the progress of the biopsy procedure, track which lymph node sites have or have not been biopsied, or otherwise provide instructions and/or feedback to assist the operator in performing the procedure. For example, the operator can be alerted if a lymph node site was missed, if a lymph node site was biopsied out of sequence, if the biopsy device is no longer on the correct path, etc.” where overlaying the lymph node sites in an opaque manner on the image of the patient anatomy, and adjusting the graphical user interface based on the sequence of lymph node sites, comprises collection-related information superimposed on an anatomical image corresponding to the first positions)
wherein the collection-related information includes collection number information which is information indicating the number of times the tissues have been collected at each of the first positions, and ([0051-0053] above; see also [0039] “Any suitable number and combination of lymph node sites can be selected. In some embodiments, step 230 involves selecting at least one, two, three, four, five, or more different lymph node sites to be biopsied (e.g., at least one, two, three, four, five, or more different lymph node stations). Optionally, step 230 can involve selecting a certain number of lymph nodes per station to be biopsied (e.g., at least one, two, three, or more lymph nodes);” where outputting a graphical representation of the lymph node sites, as images or textual information via a suitable graphical user interface, comprises collection number information)
wherein the collection number information includes collection identification information which is information for identifying each of a plurality of collections of the tissues that have been performed at the first positions. ([0051-0053] above; see also [0041] “the biopsy sequence can include sampling N3 nodes before N2 nodes, and N2 nodes before N1 nodes. As another example, the biopsy sequence can include sampling peripheral lymph node sites before central lymph node sites.”, where outputting textual, audio, or other instructions that direct the operator to navigate the biopsy device to the selected lymph node sites [comprises collection identification information] can use positional data to monitor whether the lymph node sites have been biopsied [comprising identifying each of a plurality of collections of the tissues that have been performed at the first positions])
Regarding claim 19, Wagner teaches all of the limitations of claim 18. Wagner also teaches:
further comprising: determining whether or not the tissue has been collected on the basis of the real-time image showing the collection of the tissues; ([0057] “In these and other embodiments, the endoscopic imaging system 509 includes one or more image capture devices (not shown) that record endoscopic image data that includes concurrent or real-time images (e.g., video, still images, etc.) of patient anatomy.” where the endoscopic imaging system provides information for lymph node segmentation; see also [0053] “The positional data can also be used to monitor the progress of the biopsy procedure, track which lymph node sites have or have not been biopsied, or otherwise provide instructions and/or feedback to assist the operator” where monitoring the progress of the biopsy procedure based on the positional data comprises detecting the collection of tissues on the basis of a medical image)
receiving a user's instruction with respect to a determination result; and ([0054] “The method 400 can further include receiving feedback from the operator during the biopsy procedure. … The plan for the biopsy procedure can be adjusted based on the operator feedback.” where the feedback is a user’s instruction based on the user’s determination)
outputting collection number information indicating the number of times the tissues have been collected at each of the first positions according to the instruction. ([0046] “during the biopsy procedure, the 3D model, selected lymph node sites, and/or biopsy sequence can be displayed to the operator (e.g., via a graphical user interface)” where the biopsy sequence denotes the number of times the tissue is collected at each of the positions while the operator inputs information during the procedure)
Regarding claim 20, Wagner teaches:
A non-transitory storage medium storing a program that causes a computer to execute a process comprising: ([0004] “processors of a computing system” where the memory of the computing system, storing the instructions for execution by the processors, comprises a non-transitory storage medium)
acquiring a real-time image obtained by imaging the inside of an organ with an endoscope, ([0057] “the endoscopic imaging system 509 includes one or more image capture devices (not shown) that record endoscopic image data that includes concurrent or real-time images (e.g., video, still images, etc.) of patient anatomy.” Where the endoscopic imaging system [comprising, imaging the inside of an organ with an endoscope] uses the processor to acquire real-time images of patient anatomy [comprising, of an organ]; see also [0061] “The display system 510 can display various images or representations of patient anatomy and/or of the medical instrument system 504 that are generated by the positional sensor system 508, by the endoscopic imaging system 509” and [Fig 11’s image of lungs] where the medical instrument system is displayed in the navigational image as it performs a procedure within a patient’s organ system)
executing an image recognition process on the real time image so as to determine whether or not a puncture needle is depicted in the real- time image, thereby acquiring a determination result ([0060-0061] “visualization system 515 of the control system 512 provides navigation and/or anatomy-interaction assistance to the operator 505 when controlling the medical instrument system 504 during an image-guided medical procedure…. The display system 510 can display various images or representations of patient anatomy and/or of the medical instrument system 504 that are generated by the positional sensor system 508, by the endoscopic imaging system 509, by the imaging system 518”; see also [0067] “Thus, the medical instrument 632 can include image capture probes, biopsy instruments or devices (e.g., biopsy needles)… having one or more image capture devices 647 positioned at a distal portion 637 of and/or at other locations along the medical instrument 632. In these embodiments, an image capture device 647 can capture one or more real navigational images” and [0053] “The positional data can also be used to monitor the progress of the biopsy procedure, track which lymph node sites have or have not been biopsied, or otherwise provide instructions and/or feedback to assist the operator in performing the procedure. 
For example, the operator can be alerted if a lymph node site was missed, if a lymph node site was biopsied out of sequence, if the biopsy device is no longer on the correct path, etc.” where the system uses a visualization system providing anatomy interaction assistance during an image guided medical procedure [comprising image recognition processes on the real time image] that provides feedback indicating that one or more lymph sites were biopsied [which comprises determining whether or not a puncture needle is depicted in the real time image] using biopsy instruments [comprising, a puncture needle]; see optionally [0057] “In some embodiments, the positional sensor system 508 includes … a shape sensor system for capturing positional sensor data (e.g., position, orientation, speed, velocity, pose, shape, etc.) of the medical instrument system 504.” And [0060] “The virtual visualization system 515 then registers the anatomic model to positional sensor data generated by the positional sensor system 508 and/or to endoscopic image data generated by the endoscopic imaging system 509 to (i) map the tracked position, orientation, pose, shape, and/or movement of the medical instrument system 504 within the anatomic region to a correct position within the anatomic mode” where using endoscopic image data to map the tracked position, shape and movement comprises determining whether a puncture needle is depicted in the real-time image)
generating, on the basis of at least one of the determination result or a voice instruction received by a microphone, collection-related information which is information related to a situation in which tissues at each of a plurality of the first positions in the organ have been collected, and ([0031] “The lymph node segmentation procedure can be performed in various ways, such as automatically (e.g., without requiring any operator input to identify the lymph nodes).”; [0039] “Any suitable number and combination of lymph node sites can be selected. In some embodiments, step 230 involves selecting at least one, two, three, four, five, or more different lymph node sites to be biopsied (e.g., at least one, two, three, four, five, or more different lymph node stations). Optionally, step 230 can involve selecting a certain number of lymph nodes per station to be biopsied (e.g., at least one, two, three, or more lymph nodes)”; see also [0041] “At step 240, a sequence for biopsying the selected lymph node sites is determined.
The sequence can indicate the order in which the selected lymph node sites should be biopsied during the procedure.” see also [0060] “The virtual visualization system 515 then registers the anatomic model to positional sensor data generated by the positional sensor system 508 and/or to endoscopic image data generated by the endoscopic imaging system 509 to (i) map the tracked position, orientation, pose, shape, and/or movement of the medical instrument system 504 within the anatomic region to a correct position within the anatomic mode” and [0068] “after all or a portion of the medical procedure at the target location is complete, the medical instrument 632 can be retracted back into the elongate device” where automatic selection and segmentation of lymph node sites [comprising, generating, on the basis of a determination result, collection related information] utilizes the registration of the position of the medical instrument to determine the completion of the medical procedure [comprising, information related to a situation in which tissues at each of a plurality of the first positions in the organ have been collected])
display, on a screen of a display device, the collection-related information, ([0046] “Subsequently, during the biopsy procedure, the 3D model, selected lymph node sites, and/or biopsy sequence can be displayed to the operator (e.g., via a graphical user interface)”) superimposed as at least one of a symbol, character color or numeral, on an anatomical image corresponding to the first positions. ([0051-0052] “At step 420, the system displays instructions for navigating the biopsy device to one or more lymph node sites. The lymph node sites can be selected during preoperative planning for the biopsy procedure, as previously described with respect to the method 200 of FIG. 2. During the biopsy procedure, the system can output a graphical representation of the lymph node sites via a suitable graphical user interface. For example, the lymph node sites can be rendered as opaque objects within the 3D model, while other anatomic structures of the model (e.g., airways, vessels) may be rendered as transparent or semi-transparent objects so that the lymph node sites remain visible. As another example, the lymph node sites can be displayed as points or images on a 2D map of the anatomic region. Alternatively or in combination, the system can output textual, audio, or other instructions that direct the operator to navigate the biopsy device to the selected lymph node sites. For instance, the system can instruct the operator to biopsy lymph nodes within certain lymph node stations, biopsy lymph nodes located within particular anatomic zones or regions of the anatomy, and so on. In some embodiments, step 420 also includes displaying instructions for biopsying the selected lymph node sites according to a specified sequence. The sequence can be determined during preoperative planning for the biopsy procedure, as previously described with respect to the method 200 of FIG. 2.
The sequence can be output to the operator in various ways, such as via the same graphical user interface used to display the lymph node sites. For example, the sequence can be graphically represented as a path that connects the selected lymph node sites in the desired order, as discussed above with respect to the method 200 of FIG. 2. The path can be overlaid onto the model and/or images of the actual patient anatomy to provide visual guidance as the operator navigate the biopsy device within the anatomic region. Alternatively or in combination, the system can output textual, audio, or other instructions that direct the operator to biopsy lymph node sites in a particular order and/or direct the operator to navigate the biopsy device along the path (e.g., in a particular direction and/or for a particular distance, with respect to particular anatomic landmarks, etc.).” and [0053] “At step 430, the method 400 displays positional data of the biopsy device relative to the selected lymph node sites... The positional data can also be used to monitor the progress of the biopsy procedure, track which lymph node sites have or have not been biopsied, or otherwise provide instructions and/or feedback to assist the operator in performing the procedure. For example, the operator can be alerted if a lymph node site was missed, if a lymph node site was biopsied out of sequence, if the biopsy device is no longer on the correct path, etc.” where overlaying the lymph node sites in an opaque manner on the image of the patient anatomy, and adjusting the graphical user interface based on the sequence of lymph node sites, comprises collection-related information superimposed on an anatomical image corresponding to the first positions)
wherein the collection-related information includes collection number information which is information indicating the number of times the tissues have been collected at each of the first positions, and ([0051-0053] above; see also [0039] “Any suitable number and combination of lymph node sites can be selected. In some embodiments, step 230 involves selecting at least one, two, three, four, five, or more different lymph node sites to be biopsied (e.g., at least one, two, three, four, five, or more different lymph node stations). Optionally, step 230 can involve selecting a certain number of lymph nodes per station to be biopsied (e.g., at least one, two, three, or more lymph nodes);” where outputting a graphical representation of the lymph node sites, as images or textual information via a suitable graphical user interface, comprises collection number information)
wherein the collection number information includes collection identification information which is information for identifying each of a plurality of collections of the tissues that have been performed at the first position. ([0051-0053] above; see also [0041] “the biopsy sequence can include sampling N3 nodes before N2 nodes, and N2 nodes before N1 nodes. As another example, the biopsy sequence can include sampling peripheral lymph node sites before central lymph node sites.”, where outputting textual, audio, or other instructions that direct the operator to navigate the biopsy device to the selected lymph node sites [comprises collection identification information] can use positional data to monitor whether the lymph node sites have been biopsied [comprising identifying each of a plurality of collections of the tissues that have been performed at the first positions])
Response to Arguments
Examiner thanks Applicant for the structured arguments and will address them in the order they were presented.
Regarding pages 2-3, Applicant’s arguments have been fully considered but are not persuasive.
Applicant traversed the 35 U.S.C. 101 rejection for claims 1, 2, 6, and 8-20, arguing that the ordered series of concrete technical steps does not describe the processing of information at an abstract level. The Examiner respectfully disagrees. The Applicant reiterates the claim language but does not clearly describe how the ordered steps amount to significantly more than the judicial exception. Therefore, the Examiner maintains that looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually.
Regarding pages 3-5, Applicant’s arguments have been fully considered but are not persuasive. Applicant argues that the BRI encompasses an unreasonably broad interpretation that strips away essential operations and output modalities. The Examiner respectfully disagrees. MPEP 2106.04(a)(2)(III)(A) states that a claimed invention is directed to a mental process if the identified claim elements contain limitations that can be practically performed in the human mind. Abstract ideas that have been held to be practically performable in the human mind include the collection and analysis of data and the collection and comparison of data. The Examiner submits that Applicant’s claims fall within the mental process grouping of abstract ideas and that Applicant has not identified how specific claim language cannot be practically performed in the human mind. For instance, the step of “executing an image recognition process on the real time image so as to determine whether or not a puncture needle is depicted in the real- time image, thereby acquiring a determination result” describes the same observation that a surgeon would perform mentally while tracking a medical instrument and noting the samples collected throughout an endoscopic procedure. Therefore, because the identified features of the claim can be practically performed in the human mind, the claims are directed to an abstract idea.
Regarding pages 5-6, Applicant’s arguments have been fully considered but are not persuasive. Applicant argues that claim 1 specifies a sequential causal relationship among the processing steps. The Examiner respectfully disagrees. The Applicant reiterates the claim language but does not clearly describe how the ordered steps amount to significantly more than the judicial exception. Therefore, the Examiner maintains that there is no indication that the combination of elements improves the functioning of a computer or improves any other technology, and that their collective functions merely provide conventional computer implementation.
Regarding pages 6-7, Applicant’s arguments have been fully considered but are not persuasive.
Applicant argues that the additional elements are meaningful and significant. Examiner respectfully disagrees. MPEP 2106.05(d)(I) indicates that, in determining whether the additional elements are well-understood, routine, conventional activities, the Examiner should consider whether the additional elements (1) provide an improvement to the technological environment to which the claim is confined, (2) are mere instructions to apply the judicial exception, or (3) represent insignificant extra-solution activity. The additional elements of the claims do not provide significantly more based on this inquiry.
Taking these in turn: whether the additional elements of the claim provide an improvement was analyzed and addressed in the Step 2A Prong Two analysis as insignificant extra-solution activity. The technological environment to which the claims are confined (a general-purpose computer performing generic computer functions [see Spec. Para. 0142]) is recited at a high level of generality and has been found by the courts to be insufficient to provide a practical application (see MPEP 2106.05(d)(II); Alice Corp.). The additional elements of an endoscope, microphone, and display that were found to represent extra-solution activity were analyzed and determined to represent well-understood, routine, conventional activities in the field. This determination was made both individually and as an ordered combination, but the additional elements do not provide significantly more to the abstract idea, and the claims are not subject matter eligible. Therefore, the Examiner maintains that there is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation.
Regarding pages 7-16, Applicant’s arguments have been fully considered but are not persuasive.
Applicant argues in a comparison table that Wagner does not disclose the elements of claim 1 because Wagner is instead directed to a disclosure different in concept and granularity. Examiner respectfully disagrees.
Applicant argues that Wagner paragraph [0060] does not disclose any image recognition that determines, using an image of an endoscope, whether a needle is depicted. Paragraph [0060] of Wagner explains that the system visualizes the endoscope image data to map the movement of a medical instrument for the user in a surgical procedure. This endoscopic information is used to determine whether a biopsy is completed or missed. Image capture on a real-time image is integral to this process and therefore encompasses an image recognition process. Paragraph [0053] also elaborates that Wagner’s positional data, which is integrated into the visualization system, includes the position and shape of medical instruments, such as biopsy needles.
Applicant also argues that Wagner does not disclose a configuration in which a voice instruction is received. Claim 1’s language “generate, on the basis of at least one of the determination result or a voice instruction received by a microphone, collection-related information” does not require the microphone, because the “at least one of” language permits the limitation to be satisfied by the determination result alone. The Examiner does not concede that Wagner is silent on this limitation. Rather, the Examiner concludes that this limitation does not further limit the claim set, and that Wagner generates the determination result based on collection-related information. Examiner encourages Applicant to remove the alternative “at least one of” language if Applicant intends the claim to require Wagner to teach receiving a voice instruction.
Applicant continues to state that Wagner discloses content that is different in nature from Applicant’s limitation and does not teach collection-related information superimposing information indicating the number of times the tissues have been collected. Examiner, in part, relies on paragraphs [0039-0041] and [0051-0053] to understand Wagner’s display of biopsy site information to the user. While Wagner’s depiction of opaque biopsy sites overlaid on the patient anatomy substantially indicates the selection of numbers and locations of the sites to be biopsied, Wagner also indicates that the system can output textual instructions that direct the operator to biopsy lymph node sites in a particular order. Additionally, this particular order and numbering of biopsy sites necessarily requires the display of information indicating the number of times the tissues have been collected. Furthermore, the biopsy sites depicted visually on a patient anatomy, associated with a numerically sequential order for biopsy, distinguish the number and location of each biopsy from one another.
Applicant argues that Wagner does not specify the causal chain of events of claim 1. While Examiner maintains that Wagner teaches the events of claim 1, Examiner also finds paragraph [0047] to show that a person having ordinary skill in the art would not be prohibited from reordering or omitting the steps of collecting lymph node biopsies. Therefore, Examiner maintains that the steps teach the causal chain of events.
Applicant argues that Wagner teaches material at a high level of generality. Examiner turns to MPEP 2111.01 to determine the Broadest Reasonable Interpretation, which states that “it is improper to import claim limitations from the specification.” While Examiner acknowledges that Wagner discloses details regarding navigational imaging within a patient, Wagner also details substantial graphical interface features that teach the claim limitations as written. Applicant’s arguments elaborate upon image recognition processing and attempt to distinguish the claimed collection process from the mapped art. Examiner maintains that Applicant’s claim language “execute an image recognition process on the real time image so as to determine whether or not a puncture needle is depicted in the real-time image” does not distinguish how any image recognition process depicts a needle in the image differently from Wagner, nor does it detail how a user may distinguish a puncture needle collecting a sample of tissue from an organ from the existing prior art.
Examiner finds the arguments unpersuasive. For the reasons described above and elaborated upon in the 102 rejection, Examiner maintains that Wagner teaches the claim language, as written.
Additional Considerations
The prior art made of record and not relied upon that is considered pertinent to applicant’s disclosure can be found on PTO-892 of the prior office action dated September 18th, 2025.
Sheldon et al. (US20220331047) discloses a method comprising receiving a collected set of spatial information for a distal portion of an instrument at a plurality of locations within a set of anatomic passageways and receiving a set of position information for a reference portion of the instrument when the distal portion of the instrument is at each of the plurality of locations. The method also comprises determining a reference set of spatial information for the distal portion of the instrument based on the collected set of spatial information and the set of position information for the reference portion of the instrument and registering the reference set of spatial information with a set of anatomical model information.
Holsing et al. (US20130231556) discloses systems, methods, and devices for forming a respiratory-gated point cloud of a patient’s respiratory system and for placing a localization element in an organ of a patient.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT ANTHONY SKROBARCZYK whose telephone number is (571)272-3301. The examiner can normally be reached Monday through Friday, 7:30 AM - 5:00 PM CST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kambiz Abdi can be reached at (571) 272-6702. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/R.A.S/Examiner, Art Unit 3685
/MARC Q JIMENEZ/Supervisory Patent Examiner, Art Unit 3681